Shimon — a four-armed marimba-playing robot — has been around for years, but its developers at Georgia Tech have recently taken this futuristic musical machine to the next level. Using deep learning, the robot can now study large datasets from well-known musicians, and then produce and perform its own original compositions.
Shimon was originally developed by Gil Weinberg, director of Georgia Tech’s Center for Music Technology. Under its original programming, the robot was capable of improvising music as it played alongside human performers, using an “interestingness” algorithm to make sure it wasn’t just copying its bandmates. But now, thanks to the efforts of Ph.D. student Mason Bretan, Shimon has become an accomplished composer, capable of autonomously generating the melodic and harmonic structure of a song. And you know what? Shimon’s songs are actually quite good!
To make Shimon an autonomous music-making machine, the researchers turned to artificial intelligence. Using deep learning, the bot studied a database of nearly 5,000 pre-existing songs, including compositions by Beethoven, the Beatles, Miles Davis, and Lady Gaga. The robot was also given access to more than two million musical motifs, riffs, and licks. To kickstart a composition, Bretan would offer Shimon a starting "seed" of music consisting of the first four measures. From there on, it was all Shimon.
“Once Shimon learns the four measures we provide, it creates its own sequence of concepts and composes its own piece,” said Bretan in a press release. “Shimon’s compositions represent how music sounds and looks when a robot uses deep neural networks to learn everything it knows about music from millions of human-made segments.”
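The seed-then-generate loop Bretan describes can be illustrated with a toy stand-in. To keep the sketch self-contained, the deep neural network is replaced here by a simple bigram transition table — an assumption made purely for illustration, not Shimon's actual model — but the overall shape is the same: learn from a corpus, take a short seed, then extend it one note at a time, each new note conditioned on what came before.

```python
import random

def train_bigram(corpus):
    """Toy stand-in for training: count note-to-note transitions
    across a corpus of pitch sequences."""
    table = {}
    for seq in corpus:
        for a, b in zip(seq, seq[1:]):
            table.setdefault(a, []).append(b)
    return table

def continue_seed(table, seed, length, rng=None):
    """Autoregressively extend the seed: each new note is sampled
    conditioned on the previous one, mirroring the seed-then-generate
    loop described in the article."""
    rng = rng or random.Random(0)
    out = list(seed)
    for _ in range(length):
        choices = table.get(out[-1])
        if not choices:  # dead end: no transition ever observed
            break
        out.append(rng.choice(choices))
    return out

# Hypothetical miniature "corpus" of melodies (note names only).
corpus = [
    ["C", "E", "G", "E", "C"],
    ["C", "D", "E", "D", "C"],
    ["E", "G", "C", "G", "E"],
]
table = train_bigram(corpus)
melody = continue_seed(table, seed=["C", "E"], length=6)
```

A real system like Shimon's would swap the bigram table for a deep network trained on millions of segments and would operate over whole measures rather than single notes, but the control flow — seed in, continuation out — is the same.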
The researchers say it’s the first time a robot has used deep learning to create music. You can listen to Shimon’s first two compositions, which are roughly 30 seconds in length, right here:
Previously, Shimon was only able to play monophonically (only one note at a time), but now it can play harmonies and chords—and it’s beginning to compose more like a human. Instead of just focusing on the next note, Shimon is now taking a holistic view of composition, devising meaningful measures and higher-level musical semantics.
“This is a leap in Shimon’s musical quality because it’s using deep learning to create a more structured and coherent composition,” said Weinberg. “We want to explore whether robots could become musically creative and generate new music that we humans could find beautiful, inspiring and strange.”
Listening to the compositions, it’s clear that Shimon is a good student, drawing inspiration from the extensive database it’s been given. “[The pieces] sound like a fusion of jazz and classical,” said Bretan. “I definitely hear more classical, especially in the harmony. But then I hear chromatic moving steps in the first piece — that’s definitely something you hear in jazz.”
I gotta say, this is pretty neat. The bot is producing meaningful, original music largely without human intervention. It’s also a highly innovative way of creating new compositions (the guys from Kraftwerk would surely love this). But to state the obvious, this machine is still light-years away from producing music that feels genuinely human. Simply put, Shimon’s music lacks a bit of soul. [Georgia Tech]