Robot Paints Its Feelings
A microphone captures sound, and a genetic algorithm transforms those sounds into code that drives the robot's paintbrush.
Artist and composer Ben Grosser, who is working on an MFA in New Media at the University of Illinois, has developed a robot that's able to hear the world around it and use those sounds to create a painting.
The Interactive Robotic Painting Machine has a microphone that captures surrounding sound. A genetic algorithm transforms those sounds into computer code that drives the robot's paintbrush in three dimensions, controlling how much paint to load onto the brush and how much pressure to apply to the canvas.
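As a rough illustration of how a genetic algorithm can map audio to brush control, here's a minimal sketch in Python. Grosser hasn't published his machine's actual encoding, so the gene layout (position, paint load, pressure), the audio features, and the fitness function below are all illustrative assumptions, not his implementation.

```python
import random

# Hypothetical gene layout: x, y, z brush position, paint load, pressure.
STROKE_GENES = 5

def fitness(stroke, audio_features):
    """Reward strokes whose parameters track the (normalized) audio features."""
    return -sum((g - f) ** 2 for g, f in zip(stroke, audio_features))

def evolve(audio_features, pop_size=30, generations=50, rng=None):
    """Evolve a population of brush strokes toward the captured audio."""
    rng = rng or random.Random(0)
    pop = [[rng.random() for _ in range(STROKE_GENES)] for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fittest half as parents.
        pop.sort(key=lambda s: fitness(s, audio_features), reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, STROKE_GENES)
            child = a[:cut] + b[cut:]            # one-point crossover
            i = rng.randrange(STROKE_GENES)      # point mutation, clamped to [0, 1]
            child[i] = min(1.0, max(0.0, child[i] + rng.uniform(-0.1, 0.1)))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda s: fitness(s, audio_features))

# Stand-in audio features (e.g. loudness, pitch), normalized to [0, 1].
features = [0.2, 0.8, 0.5, 0.3, 0.9]
best_stroke = evolve(features)
```

In the real machine, of course, the evolved parameters would be sent to motor controllers rather than simply returned, and the "influence without direct mapping" Grosser describes would come from a looser fitness function than this one.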
The sounds can come from people in the room or, when people aren't around, can come from the machine itself. In a related project called HeadSwap, the robotic painter collaborated with violinist Benjamin Sung, who played music composed by Zack Browning. At the same time that Sung was watching the machine paint and using what he saw to inform his music, the machine was listening to Sung play and using that to inform its art.
On his website, Grosser says, "It is important to understand that what the machine paints is not a direct mapping of what it hears. Instead, the system is making its own decisions about what it does while being influenced by others."
Visit Grosser's website to see the video of Sung playing with the robot and to see the kinds of paintings the robot produced. Pretty cool.