Behind Magenta, the tech that rocked I/O



    On the second day of I/O 2019, two bands took the stage—with a little help from machine learning. Both YACHT and The Flaming Lips worked with Google engineers who say that machine learning could change the way artists create music.

    “Any time there has been a new technological development, it has made its way into music and art,” says Adam Roberts, a software engineer on the Magenta team. “The history of the piano, essentially, went from acoustic to electric to the synthesizer, and now there are ways to play it directly from your computer. That just happens naturally. If it’s a new technology, people figure out how to use it in music.”

    Magenta, which started nearly three years ago, is an open-source research project powered by TensorFlow that explores the role of machine learning as a tool in the creative process. Machine learning is the practice of teaching computers to recognize patterns, with the goal of letting them learn from examples rather than from instructions constantly supplied by a programmer. With music, for example, you can feed in two different melodies, then use a machine learning model to combine them in a novel way.
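
    As a rough illustration of that idea, here is a minimal sketch in TypeScript using the open-source @magenta/music library (the article itself doesn't walk through any code). It loads a pre-trained MusicVAE melody model and asks it to interpolate between two hard-coded melodies, so the in-between results blend characteristics of both. The checkpoint URL and the toy melodies are assumptions made for the sake of the example.

    ```typescript
    // Sketch: blending two melodies with MusicVAE from @magenta/music.
    import * as mm from '@magenta/music';

    // Assumed: a hosted 2-bar melody checkpoint (any MusicVAE melody checkpoint would do).
    const CHECKPOINT =
      'https://storage.googleapis.com/magentadata/js/checkpoints/music_vae/mel_2bar_small';

    // Two simple quantized melodies (4 steps per quarter note, 2 bars = 32 steps).
    const melodyA: mm.INoteSequence = {
      quantizationInfo: {stepsPerQuarter: 4},
      totalQuantizedSteps: 32,
      notes: [
        {pitch: 60, quantizedStartStep: 0, quantizedEndStep: 8},
        {pitch: 64, quantizedStartStep: 8, quantizedEndStep: 16},
        {pitch: 67, quantizedStartStep: 16, quantizedEndStep: 32},
      ],
    };
    const melodyB: mm.INoteSequence = {
      quantizationInfo: {stepsPerQuarter: 4},
      totalQuantizedSteps: 32,
      notes: [
        {pitch: 72, quantizedStartStep: 0, quantizedEndStep: 16},
        {pitch: 69, quantizedStartStep: 16, quantizedEndStep: 24},
        {pitch: 65, quantizedStartStep: 24, quantizedEndStep: 32},
      ],
    };

    async function blendMelodies() {
      const vae = new mm.MusicVAE(CHECKPOINT);
      await vae.initialize();
      // Ask for 5 sequences morphing from melodyA to melodyB;
      // the middle ones are novel combinations of the two inputs.
      const blends = await vae.interpolate([melodyA, melodyB], 5);
      blends.forEach((seq, i) =>
        console.log(`blend ${i}: ${seq.notes?.length ?? 0} notes`));
      vae.dispose();
    }

    blendMelodies();
    ```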


    Jesse Engel, Claire Evans, Wayne Coyne and Adam Roberts speak at I/O.  

    But the Magenta team isn’t just teaching computers to make music—instead, they’re working hand-in-hand with musicians to help take their art in new directions. YACHT was one of Magenta’s earliest collaborators; the trio came to Google to learn how to use artificial intelligence and machine learning on their upcoming album.

    The band first took all 82 songs from their back catalog and isolated each part, from bass lines to vocal melodies to drum rhythms; they then broke those isolated parts up into four-bar loops. They fed those loops into the machine learning model, which generated new melodies based on their old work. They followed a similar process with lyrics, using their old songs plus other material they found inspiring. The final task was to pick the lyrics and melodies that made sense, and pair them together to make a song.
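
    To give a flavor of what that "loops in, new melodies out" step can look like in code, here is a hedged sketch, again with @magenta/music (this is not YACHT's actual pipeline, which the article doesn't publish). It encodes a stand-in four-bar loop into MusicVAE's latent space, nudges the latent vector with a little random noise, and decodes new melodies that still echo the original. The checkpoint URL, the toy loop and the noise level are all assumptions.

    ```typescript
    // Sketch: generating variations of a four-bar loop with MusicVAE
    // (encode -> perturb the latent vector -> decode).
    // Assumes the app shares the same TensorFlow.js instance that @magenta/music uses.
    import * as mm from '@magenta/music';
    import * as tf from '@tensorflow/tfjs';

    // Assumed: a hosted 4-bar melody checkpoint.
    const CHECKPOINT =
      'https://storage.googleapis.com/magentadata/js/checkpoints/music_vae/mel_4bar_small_q2';

    // Stand-in for one of the band's quantized four-bar loops
    // (4 steps per quarter note, 4/4 time, so 4 bars = 64 steps).
    const fourBarLoop: mm.INoteSequence = {
      quantizationInfo: {stepsPerQuarter: 4},
      totalQuantizedSteps: 64,
      notes: [
        {pitch: 62, quantizedStartStep: 0, quantizedEndStep: 8},
        {pitch: 65, quantizedStartStep: 8, quantizedEndStep: 16},
        {pitch: 69, quantizedStartStep: 16, quantizedEndStep: 32},
        {pitch: 67, quantizedStartStep: 32, quantizedEndStep: 48},
        {pitch: 62, quantizedStartStep: 48, quantizedEndStep: 64},
      ],
    };

    async function variationsOf(loop: mm.INoteSequence, count: number) {
      const vae = new mm.MusicVAE(CHECKPOINT);
      await vae.initialize();

      // Encode the loop into the model's latent space...
      const z = await vae.encode([loop]);
      // ...nudge the latent vector with noise, once per requested variation...
      const noisy = tf.tidy(() =>
        tf.randomNormal([count, z.shape[1]], 0, 0.5).add(z)) as tf.Tensor2D;
      // ...and decode back into melodies that still resemble the original loop.
      const variations = await vae.decode(noisy);

      z.dispose();
      noisy.dispose();
      vae.dispose();
      return variations;
    }

    variationsOf(fourBarLoop, 4).then((seqs) =>
      seqs.forEach((s, i) => console.log(`variation ${i}: ${s.notes?.length ?? 0} notes`)));
    ```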

    Music and Machine Learning session from Google I/O '19

    “They used these tools to push themselves out of their comfort zone,” says Jesse Engel, a research scientist on the Magenta team. “They imposed some rules on themselves that they had to use the outputs of the model to some extent, and it helped them make new types of music.”

    Claire Evans, the singer of YACHT, explained the process during a presentation at I/O. “Using machine learning to make a song with structure, with a beginning, middle and end, is a little bit still out of our reach,” she explained. “But that’s a good thing. The melody was the model’s job, but the arrangement and performance was entirely our job.”

    The Flaming Lips’ use of Magenta is much more recent; the band started working with the Magenta team to prepare for their performance at I/O. The Magenta team showed them all of its projects, and the band was drawn to one in particular: Piano Genie, dreamed up by Chris Donahue, a graduate student who spent a summer interning at Google. They decided to use Piano Genie as the basis for a new song to debut on the I/O stage.

    Google AI collaboration with The Flaming Lips bears fruit at I/O 2019

    Piano Genie distills the 88 keys of a piano down to eight buttons, which you can push to your heart’s content to make piano music. In what Jesse calls “an initial moment of inspiration,” someone put a piece of wire inside a piece of fruit, turning fruit into the buttons for Piano Genie. “Fruit can be used as a capacitive sensor, like the screen on your phone, so you can detect whether or not someone is touching the fruit,” Jesse explains. “They were playing these fruits just by touching these different fruits, and they got excited by how that changed the interaction.”
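
    For a sense of how small the core mapping is, here is a minimal sketch using the PianoGenie model from @magenta/music. The checkpoint URL is an assumption, and the capacitive-fruit wiring is out of scope here, so keyboard keys 1 through 8 stand in for the eight touch sensors; a touch event on a piece of fruit would trigger the same call.

    ```typescript
    // Sketch: eight buttons in, 88 piano keys out, via Piano Genie.
    import * as mm from '@magenta/music';

    // Assumed: a hosted Piano Genie checkpoint (see the Magenta.js docs for current URLs).
    const GENIE_CHECKPOINT =
      'https://storage.googleapis.com/magentadata/js/checkpoints/piano_genie/model/epiano/stp_iq_auto_contour_dt_166006';

    const genie = new mm.PianoGenie(GENIE_CHECKPOINT);

    async function main() {
      await genie.initialize();

      // Each button press asks the model which of the 88 piano keys to play,
      // conditioned on everything that has been played so far.
      document.addEventListener('keydown', (e) => {
        const button = parseInt(e.key, 10) - 1;  // keys "1".."8" -> buttons 0..7
        if (isNaN(button) || button < 0 || button > 7) return;
        const pianoKey = genie.next(button);     // index into the 88 keys (0..87)
        const midiPitch = pianoKey + 21;         // piano key 0 is MIDI note 21 (A0)
        console.log(`button ${button + 1} -> MIDI pitch ${midiPitch}`);
        // In the real Fruit Genie, this is where the note would be sent to a synth.
      });
    }

    main();
    ```

    Because the model keeps internal state, pressing the same button repeatedly can yield different notes that still follow a musical contour, which is what makes eight buttons feel like a full keyboard.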

    Wayne Coyne, the singer of The Flaming Lips, noted during an I/O panel that a quick turnaround time, plus close collaboration with Google, gave them the inspiration to think outside the box. “For me, the idea that we’re not playing it on a keyboard, we’re not playing it on a guitar, we’re playing it on fruit, takes it into this other realm,” he said.

    During their performance that night, Steven Drozd from The Flaming Lips, who usually plays a variety of instruments, played a “magical bowl of fruit” for the first time. He tapped each fruit in the bowl, which then played different musical tones, “singing” the fruit’s own name. With help from Magenta, the band broke into a brand-new song, “Strawberry Orange.”


    The Flaming Lips’ Steven Drozd plays a bowl of fruit.

    The Flaming Lips also got help from the audience: At one point, they tossed giant, blow-up “fruits” into the crowd, and each fruit was also set up as a sensor, so any audience member who got their hands on one played music, too. The end result was a cacophonous, joyous moment when a crowd truly contributed to the band’s sound.


    Audience members “play” an inflatable banana.

    You can learn more about the "Fruit Genie" and how to build your own at g.co/magenta/fruitgenie.

    Though the Magenta team collaborated on a much deeper level with YACHT, they also found the partnership with The Flaming Lips to be an exciting look toward the future. “The Flaming Lips is a proof of principle of how far we’ve come with the technologies,” Jesse says. “Through working with them we understood how to make our technologies more accessible to a broader base of musicians. We were able to show them all these things and they could just dive in and play with it.”



    https://www.blog.google/technology/ai/behind-magenta-tech-rocked-io/
