Leave out zeros and ones
July 27, 2022

Essay: Do we want what the machines tell us we want?

In short: "Besieged by automated recommendations, we are left to guess exactly how they are influencing us, feeling in some moments misperceived or misled and in other moments clocked with eerie precision. At times, the computer seems more in control of our choices than we are."

When Ludwig van Beethoven died in 1827, he left behind only a handful of musical sketches for his 10th Symphony. In early 2019, Dr. Matthias Röder, director of the Karajan Institute, an organization in Salzburg, Austria, that promotes music technology, put together a team to complete Beethoven's 10th Symphony in celebration of the composer's 250th birthday, hoping an AI would be able to fill in the blanks Beethoven left. Ahmed Elgammal describes in The Conversation how he presided over the artificial intelligence side of the project, leading a group of scientists at the creative AI startup Playform AI that taught a machine both Beethoven's entire body of work and his creative process. And they did it: a full recording of Beethoven's 10th Symphony is set to be released on Oct. 9, 2021, the same day as the world premiere performance scheduled to take place in Bonn, Germany – the culmination of a two-year-plus effort.

ReWrapped is a new AI that analyzes your Spotify favorites to discover new artists influenced by your musical taste, Engadget reports. It works by linking directly to your Spotify account and identifying your most-played music via the Spotify Wrapped feature. Its audio-analysis engine then examines key elements of each track and compares them against a community of artists, making suggestions based on the similarities.
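The report doesn't say how ReWrapped actually computes similarity, but a common approach to this kind of "compare key elements of each track" step is to represent every track as a vector of audio features and rank candidates by cosine similarity to the listener's favorites. A minimal sketch, with entirely hypothetical track names and feature values:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical per-track audio features: (tempo scaled to 0-1, energy, danceability)
favorites = {
    "track_a": (0.62, 0.80, 0.75),
    "track_b": (0.58, 0.85, 0.70),
}

candidates = {
    "artist_x_song": (0.60, 0.82, 0.73),  # close to the listener's taste
    "artist_y_song": (0.30, 0.20, 0.15),  # far from it
}

def recommend(favorites, candidates, top_n=1):
    """Rank candidate tracks by average similarity to the user's favorites."""
    scored = []
    for name, feats in candidates.items():
        avg = sum(cosine_similarity(feats, f) for f in favorites.values()) / len(favorites)
        scored.append((avg, name))
    scored.sort(reverse=True)
    return [name for _, name in scored[:top_n]]

print(recommend(favorites, candidates))  # ['artist_x_song']
```

A production system would use many more features and a learned model, but the ranking-by-similarity idea is the same.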

Georgia Tech’s Center for Music Technology has built a robot that can compose, play, and sing completely original songs. Shimon, the marimba-playing robot, will release an album on Spotify next month featuring songs written (and sung) entirely by him, IEEE Spectrum reports. The plan was for the robot to go on tour as well, which it still might, if the audience, and not just the stage, is made up exclusively of robots. When Shimon started learning music, it was given a huge database of human-composed music – a dataset of 50,000 lyrics from jazz, prog rock, and hip-hop. But the key to Shimon’s composing ability is its semantic knowledge: the ability to make thematic connections between things, which is a step beyond simply throwing deep learning at a corpus.
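The "thematic connections" idea is commonly implemented with word embeddings: words that appear in similar contexts end up near each other in vector space, so a song about "rain" can pull in "storm" or "tears" rather than random vocabulary. A minimal sketch with toy, hand-made vectors (not Shimon's actual model, which would learn embeddings from its 50,000-lyric corpus):

```python
import math

# Toy word vectors, hand-made for illustration only.
EMBEDDINGS = {
    "rain":  (0.9, 0.1, 0.0),
    "storm": (0.8, 0.2, 0.1),
    "tears": (0.7, 0.1, 0.3),
    "party": (0.0, 0.9, 0.1),
    "dance": (0.1, 0.8, 0.2),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def related(word, top_n=2):
    """Return the top_n words most thematically similar to `word`."""
    target = EMBEDDINGS[word]
    others = [(cosine(target, v), w) for w, v in EMBEDDINGS.items() if w != word]
    others.sort(reverse=True)
    return [w for _, w in others[:top_n]]

print(related("rain"))  # thematically close words rank first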

Earlier this month, the ad agency space150 shared a song and music video called 'Jack Park Canny Dope Man', credited to an A.I. named "Travisbott" and based on Travis Scott's original music and lyrics. The song was lousy, but it raises a serious question: how do we deal with the sampling and reproduction of an existing artist's musical likeness when someone completely unrelated stands to profit from it? Holly Herndon, who collaborated with A.I. on her last album, has raised the issue: A.I. is getting better at sounding like human beings, so what will the humans controlling the A.I. do with that power? Herndon told The Fader that she saw something like Travisbott coming: "I think we're going to see a flood of automated compositions, people using neural nets to extract the logic from other people's work and a lot of appropriation. We're going to see big issues around attribution."

Let’s say streaming becomes the de facto method of music delivery and consumption for the DJ community – what might that mean? Attack Magazine asks hypothetically (although it might well end up that way). It means digital mixers and players will be able to collect and collate information from DJs about what they play, by whom, and for how long. Digital mixers will be able to harvest every piece of information from their front-panel controls, which opens the door to DJ ghosts. Does it signal the end of DJing? Probably not; it might simply mean that in the future everyone, even A.I.s, can be a DJ.

Most musicians’ jobs don’t seem to be going anywhere in the age of artificial intelligence; it is more likely we're nearing a period ripe for hybrid creativity, Consequence of Sound argues in an interesting article, naming numerous examples of musicians using AI as an intelligent instrument. AI tools exist right now, and enterprising musicians are already using them in creative ways. This will only increase as more tools become available and more musicians experiment with them, ushering in a new phase of creative expression: an ever-evolving AI tool set that enables musicians to more fully express their humanity.

Hundreds of employees were laid off last month by the world's biggest broadcast company, iHeartMedia. The American conglomerate's chief executive Bob Pittman said the "employee dislocation" was "the unfortunate price we pay to modernize the company". Laid-off employees blame the cuts on the company's top executives, with some critics saying executives used the systems as scapegoats, hoping to distract from old-school failures, portray themselves as futuristic, and avoid public outrage, according to the Washington Post. The company, which now uses software to schedule music, analyze research, and mix songs, has called AI the muscle it needs to fend off rivals, recapture listeners, and emerge from bankruptcy. iHeartMedia owns the online iHeartRadio service and more than 850 local stations across the United States.

Berliner Ollie Holmberg has created an AI that generates DJ and record names based on Berlin's Hard Wax record shop, Electronic Beats reports. His explanation of why he did it: “The end of the Anthropocene (near-term human extinction) our civilisation and the biosphere aren’t going to collapse evenly in space or at a single […]
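The article doesn't describe Holmberg's model, but name generators of this kind are often built as simple character-level Markov chains trained on a list of existing titles. A minimal sketch, with a small hypothetical training list standing in for the Hard Wax catalog:

```python
import random
from collections import defaultdict

def build_model(names, order=2):
    """Map each `order`-character context to the characters that follow it."""
    model = defaultdict(list)
    for name in names:
        padded = "^" * order + name.lower() + "$"  # ^ = start pad, $ = end marker
        for i in range(len(padded) - order):
            model[padded[i:i + order]].append(padded[i + order])
    return model

def generate(model, order=2, max_len=20):
    """Walk the chain from the start state until the end marker (or max_len)."""
    state = "^" * order
    out = []
    while len(out) < max_len:
        nxt = random.choice(model[state])
        if nxt == "$":
            break
        out.append(nxt)
        state = state[1:] + nxt
    return "".join(out)

# Hypothetical titles for illustration; not an actual Hard Wax listing.
TITLES = ["deep space", "acid trax", "basic channel", "phylyps trak"]
model = build_model(TITLES)
print(generate(model))  # e.g. a recombined, plausible-looking title
```

With order 2 the output recombines two-character fragments of the training titles, which is why such generators produce names that feel familiar but don't quite exist.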