CHAMPAIGN, Ill. — When jazz musicians improvise alongside each other, they interact in ways that are as much about feeling the music as about thinking. But suppose one of the musicians is a robot. Can it improvise a jazz solo in response to what its human partner is playing?
A University of Illinois researcher is designing a robot – actually a computer system – that will communicate with humans through jazz improvisation and provide insight into artificial intelligence and human-computer interaction.
The project – called MUSICA (MUSical Improvising Collaborative Agent) – is part of the Defense Advanced Research Projects Agency’s new Communicating with Computers program, which aims to “enable symmetric communication between people and computers,” where computers are collaborators just like people are. The jazz robot is an unconventional approach to the problem.
“It’s approaching this communication from a different perspective than linguistically, and maybe we can learn something new about computers and humans,” said Ben Grosser, a professor of new media in the School of Art and Design. Grosser is also affiliated with the National Center for Supercomputing Applications, where he is helping develop a new initiative in critical technology studies.
“We proposed that music improvisation is a novel space for investigating communication because it’s not just cognitive interaction,” Grosser said of the MUSICA project. “It’s also an embodied experience. You feel the music as much as hear it and think about it. You react.”
To build a software system that might be able to improvise music, Grosser and his team will create a knowledge database of canonical jazz solos. Their approach is to computationally analyze these solos through “image schemas,” the spatial concepts people rely on to make sense of the world. For example, a musician might play “inside,” meaning the notes fit within what the song’s chord changes suggest, or “outside,” meaning the notes push the boundaries of what the harmony implies.
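The article doesn’t spell out how that analysis works in code, but the “inside”/“outside” distinction can be illustrated with a minimal sketch that maps each chord to a set of implied scale tones and tests whether a note falls among them. The chord-to-scale table, the Cmaj7 example and all names below are hypothetical simplifications, not details from MUSICA.

```python
# Hypothetical sketch: label solo notes "inside" or "outside" a chord's
# implied scale. Pitch classes run 0-11 (C=0, C#=1, ..., B=11).

# Illustrative mapping from a chord symbol to the pitch classes of a
# scale it commonly implies (e.g., Cmaj7 -> the C major scale).
CHORD_SCALES = {
    "Cmaj7": {0, 2, 4, 5, 7, 9, 11},   # C D E F G A B
    "G7":    {7, 9, 11, 0, 2, 4, 5},   # G Mixolydian
}

def classify_note(midi_pitch: int, chord: str) -> str:
    """Return 'inside' if the pitch class fits the chord's scale."""
    scale = CHORD_SCALES[chord]
    return "inside" if midi_pitch % 12 in scale else "outside"

# Over Cmaj7, E (MIDI 64) sits inside; E-flat (MIDI 63) pushes outside.
print(classify_note(64, "Cmaj7"))  # inside
print(classify_note(63, "Cmaj7"))  # outside
```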
“It’s a way of understanding what jazz musicians do and analyzing what happens between them,” said Grosser, who has a long history of jazz performance, holds an MFA in new media and a master’s degree in music composition, and worked in the School of Music’s Computer Music Project during graduate school.
He’ll then add a performance system that will analyze what a human performer is playing in real time, including the beat, pitch, harmony and rhythm, and consult what it has learned about jazz solos to communicate musically in response. The system should “understand not just what should come next but when a response should happen in time,” according to the project proposal. Grosser hopes that within a year the system will be able to perform a sophisticated “call and answer” musical response.
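To give a flavor of how such a call-and-answer exchange might work, here is a minimal hypothetical sketch that extracts a crude pitch-contour feature from a human “call” and answers with a mirrored phrase drawn from a toy database of analyzed solos. The phrase representation, the contour feature and the database are all illustrative assumptions, not the project’s actual design.

```python
# Hypothetical sketch of a call-and-answer loop. A phrase is a list of
# (midi_pitch, onset_in_beats) pairs; SOLO_DB stands in for the knowledge
# database of analyzed solos. A real system would track beat, pitch,
# harmony and rhythm in real time; this toy version uses contour only.
from dataclasses import dataclass

@dataclass
class Phrase:
    notes: list   # (midi_pitch, onset_in_beats) pairs
    label: str    # illustrative analysis tag, e.g. "rising" / "falling"

SOLO_DB = [
    Phrase([(60, 0.0), (62, 0.5), (64, 1.0)], "rising"),
    Phrase([(67, 0.0), (65, 0.5), (64, 1.0)], "falling"),
]

def contour(notes):
    """Crude pitch-contour feature: rising, falling, or flat."""
    first, last = notes[0][0], notes[-1][0]
    return "rising" if last > first else "falling" if last < first else "flat"

def answer(call_notes):
    """Pick a stored phrase whose contour mirrors the human's 'call'."""
    mirror = {"rising": "falling", "falling": "rising", "flat": "flat"}
    wanted = mirror[contour(call_notes)]
    for phrase in SOLO_DB:
        if phrase.label == wanted:
            return phrase
    return SOLO_DB[0]  # fall back to any stored phrase

# A rising human phrase gets a falling machine reply; a real system would
# also time the reply to land on an appropriate beat.
human_call = [(60, 0.0), (64, 0.5), (67, 1.0)]
print(answer(human_call).notes)  # [(67, 0.0), (65, 0.5), (64, 1.0)]
```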
“The ultimate goal is that a human performer should perceive what it hears from the system as musical communication,” he said. “I don’t expect our jazz robot to be a Miles Davis. Maybe it can be a high schooler, if we can really nail it.”
The project builds on previous work by Grosser, including an interactive robotic painting machine. The machine creates oil paintings while considering what it “hears” as input into the painting process.
Grosser said he is interested not just in artificial intelligence, but also in the cultural, social and political effects of software, and in the question of whether a computer system can truly make art.
The jazz robot project “is valuable for evaluating computational systems, but also in how we think about what art is. I like the questions it poses,” he said. “We’re increasingly interacting with intelligent systems. How do the designs of these systems change our interactions with them and with other humans?”
Working with Grosser on the five-year, $2.4 million project are Kelland Thomas and Clay Morrison, professors in the University of Arizona’s School of Information; and Colin Dawson, a math professor at Oberlin College.