We’ve heard it from our electronic music-hating friends many times before… ‘Ah, it’s all just soulless computer music anyway.’
If Google’s Project Magenta takes off, perhaps our future favourite beats will be exactly that?
Probably not. But it is a mad, slightly scary and really interesting project launched by Google’s Brain Team (who’ve developed applications, algorithms and software for things like Deep Dream, Translate and various image recognition programs).
Revealed late last month at Moogfest, a music/art/technology festival in North Carolina, then officially announced earlier this month, Project Magenta uses machine learning to analyse the data it’s fed and develop its own version of it through analysis and reproduction.
The end result is a “neural” computer network that aims to loosely mimic the functionality of the human brain. So, just as we’d read, revise or digest information in order to learn, the network is exposed to hundreds of examples of what it’s meant to produce, and is then able to predict or create its own next step in the sequence. In the case of Project Magenta, the hundreds of examples it’s fed are musical compositions… As it comes to understand the rules and structure of music, it will then come up with its own compositions.
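To give a rough flavour of that “learn from examples, then predict the next step” idea, here’s a deliberately tiny sketch. To be clear, this is not Magenta’s actual model (Magenta uses recurrent neural networks built on TensorFlow); the melodies, function names and note choices below are all made up for illustration. It just shows the same basic loop: count which note tends to follow which in some example tunes, then generate a continuation one note at a time.

```python
import random
from collections import defaultdict

# Made-up example "compositions" the model is exposed to.
training_melodies = [
    ["C", "E", "G", "E", "C"],
    ["C", "E", "G", "C", "E"],
    ["G", "E", "C", "E", "G"],
]

def learn_transitions(melodies):
    """Count which note tends to follow which across all the examples."""
    transitions = defaultdict(list)
    for melody in melodies:
        for current, following in zip(melody, melody[1:]):
            transitions[current].append(following)
    return transitions

def continue_melody(seed, transitions, length=8, rng=None):
    """Predict a continuation one note at a time -- the 'next step
    in the sequence' idea, in its crudest possible form."""
    rng = rng or random.Random(0)
    melody = list(seed)
    while len(melody) < length:
        candidates = transitions.get(melody[-1])
        if not candidates:
            break  # never seen anything follow this note
        melody.append(rng.choice(candidates))
    return melody

transitions = learn_transitions(training_melodies)
print(continue_melody(["C"], transitions))
```

A real neural network generalises far beyond these literal note-to-note counts, but the workflow is the same shape: expose the system to examples, then ask it what comes next.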
Brain Team have already revealed Magenta’s debut release. It was given four notes, and this is what it came back with. We’re unsure if Magenta added the breakbeats itself. Probably not.
Producers: don’t pack up your DAWs quite yet. Magenta is still very much in its infancy, and Google’s Brain Team reckon the AI is there to help musicians, rather than take their jobs. On the Project Magenta site, project leader Douglas Eck explains that it could be used to aid learners, or as a performance tool to play with or alongside.
“We don’t know what artists and musicians will do with these new tools, but we’re excited to find out,” states Douglas. “Daguerre and later Eastman didn’t imagine what Annie Leibovitz or Richard Avedon would accomplish in photography. Surely Rickenbacker and Gibson didn’t have Jimi Hendrix or St. Vincent in mind.”
Other potential benefits of Magenta include how it might interpret the work of famous composers (noticing deep patterns or signatures that musicologists have yet to discover) and its open-source community nature that encourages coders, musicians and developers to get involved in the project.
That said, you’re not likely to see a Magenta tune up on UKF any time soon. We firmly believe the soul in electronic music comes from the human controlling it, and that the most exciting sounds are often, if not always, created by accidents, errors or good old experimentation into the unknown (see this Bad Company interview for a great discussion on the perfection of imperfection). Still, this is an interesting development in the worlds of both music and AI.
And let’s be honest, there are a fair few DJs out there who would benefit from a robot replacement, right?
The robot takeover continues…