Keirzo is a vaguely humanoid construction with ear-like sound inputs, a speaker where a mouth might be, and a range of motor-driven robotic ‘arms’, some fitted with drumsticks, others with small round rubber mallets.
Keirzo's music – including rap lyrics – is produced entirely by the robot, says Macquarie University Research Fellow Dr Richard Savery, a developer of artificial intelligence and robotics in the Department of Media, Communications, Creative Arts, Language and Literature, and a professional saxophonist, clarinettist and flautist.
"It listens, it plays, and behaves just like a collaborating musician on stage, responding to the human members of the band," Savery says.
"It's all AI underneath and a whole bunch of different deep learning elements.
"And while I don’t have to pay Keirzo, robots cost a lot to make, maintain and train. Everything is an expensive process."
Learning from humans
Keirzo’s ability to learn from human musicians is key to Dr Savery’s project.
"Through inbuilt microphones, it listens to drummers, and then it tries to adjust the way it plays," Dr Savery says.
Now that Keirzo has absorbed recordings from a range of different drummers, it is practising and merging various techniques to develop its own playing style. It then practises what it has learned with human musicians, so it can build its knowledge further.
Dr Savery says that Keirzo’s ability to practise, learn from musicians and interact with others is what makes it a robot musician (rather than a musical robot).
The robot's rapping ability has also been learned from a database of human recordings, in this case from rappers who volunteered their work for Savery to use in training Keirzo.
Keirzo is continually in learning mode, says Dr Savery, who has worked on the project for over a year.
He says that the custom music datasets used to train his robot feed smaller deep learning models, which are quite different in both purpose and construction from the large language models that drive AI tools like ChatGPT.
For starters, lyrics use devices like figurative language and metaphor, often conveying an emotional experience rather than being strictly literal, and they also rely strongly on rhyme and rhythm.
Dr Savery says he’s also very conscious of the need to train Keirzo only on lyrics where creators have given permission.
Robots and all that jazz
Dr Savery says that his own curiosity about how music works led him to develop Keirzo, and the robot’s development has been strongly influenced by his background as a jazz musician.
"I was coding and playing music very young," says Dr Savery.
He found his musical home when he switched to the saxophone, learning to improvise and eventually studying jazz performance as a saxophonist at the Sydney Conservatorium of Music.
Dr Savery then completed a Master's degree at the University of California, followed by a PhD under Professor Gil Weinberg, who leads the Robotic Musicianship group at the Georgia Tech Center for Music Technology.
He says the process of training a robot to behave like a musician has given him more insight into how humans create music – and the next step could involve democratising the music-making process.
"I really like the way that AI can make it possible for everyone, no matter what their skill level, to make music, so that everyone in society can play music together."
He believes the technology could change the way we make and interact with music.
Take a breath
Dr Savery is currently exploring how humans interact with robots during live rehearsals – and has made some adjustments so that Keirzo appears to ‘breathe’ or take a breath while rapping.
The way the robot moves physically can impact how people interact with it, he says.
“I’ve noticed that if the robot is on stage and stops moving, people can think it's broken. So I added the breathing effect, just so people know it's not broken. Now that has become an interesting question – how does the illusion that the robot can breathe change the way people interact with it?”
Dr Savery says one of the questions he is most often asked is: why use a robot to make music?
“People worry that robots might take over or make musicians redundant, but I think it’s the reverse, more opportunities will open up,” he says.
"The history of music and creative arts shows that it has always been changing, and often change has been driven by technology," he says.
"Robots and AI are a new technology that are going to change how we make music and how we listen to music, and I think it’s a fascinating thing to study."
Dr Richard Savery is a Macquarie University Research Fellow (MQRF) in the Department of Media, Communications, Creative Arts, Language and Literature.