Google Brain Wants Creative AI to Help Humans Make a “New Kind of Art”

The search giant’s AI research division has developed a deep-learning tool to produce music and art with humans in the loop.

Machine-learning algorithms aren’t likely to put painters or singer-songwriters out of work anytime soon, to judge from their body of work to date. But Google Brain is building tools that pair artists with deep-learning systems to create novel artwork together, said Douglas Eck, senior staff scientist at the search giant’s artificial-intelligence research division, during the MIT Technology Review’s EmTech Digital conference on Tuesday.

He hopes the platform, called Magenta, will allow people to produce completely new kinds of music and art, in much the way that keyboards, drum machines, and cameras did. Eck said that Magenta could serve a role analogous to that of Les Paul, who helped develop the modern electric guitar. But he said the team wants to keep artists in the loop to push the boundaries of the new tool in interesting ways, like a Jimi Hendrix who flips it upside down, bends the strings, and distorts the sound.

“The fun is in finding new ways to break it and extend it,” he said.

Eck, previously an associate professor of computer science at the University of Montreal, said he was drawn to the project as a “failed musician” himself. As a guitarist and pianist who used to play “post-punk-folk” in coffee shops, he had fans numbering in the “dozens,” he said.

Google Brain is continually refining Magenta’s algorithms for generating songs and for transferring artistic styles to images. On stage, Eck played a computer-generated piano tune that got progressively more listenable as the tool was given more rules to follow, ultimately generating a phrase that might have the early makings of a jingle for a toothpaste ad.

A critical challenge now is developing better human interfaces for the technology. The researchers started with the equivalent of a command-line prompt, but Eck said he wants to get closer to the “naturalness” of a guitar stomp pedal. He hopes the project will attract talented musicians and coders to continue to enhance the tool and apply it in new ways.

“I don’t think that machines themselves just making art for art’s sake is as interesting as you might think,” he said. “The question to ask is, can machines help us make a new kind of art?”
