Michele Rucci, Boston University
Vision in this humanoid robot more closely resembles that of the human eye, which performs micro-movements thousands of times per second.
Robots are good at computational tasks like playing chess, figuring out bus routes or solving math problems. But ask them to walk, talk or recognize everyday objects, and things quickly fall apart, researchers say. An emerging field known as "neurobiological robotics" looks for distinctive human and animal abilities that can be modeled in software and replicated to make robots work better.
A group of the world's top researchers in this field are presenting several works-in-progress at the IEEE International Conference on Robotics and Automation this week in Hong Kong.
“We’re trying to make the robot brain more like the human brain,” said Jeff Krichmar, a professor of cognitive science at the University of California, Irvine. “The brain has incredible flexibility and adaptability. If you look at any artificial system, it’s far more brittle than biology.”
Krichmar is experimenting with building neurotic robots that exhibit signs of obsessive-compulsive disorder or a fear of open spaces, just as humans do. He’s doing this by making a robot act like a mouse in a cage.
“If you put a rodent in a room that is open and unfamiliar, it will hug the walls,” Krichmar said. “It will hide until it becomes comfortable, then it will move across the room. It will wait until it feels comfortable. We did that with a robot and made it so it was so anxious it would never cross the room.”
Krichmar’s team uses a rodent model and varying levels of dopamine and serotonin, two neurotransmitters associated with reward and well-being. The effects of the chemicals on the rodent are then replicated in the robot’s software, Krichmar explained.
“We’re mimicking the action of the chemicals with equations,” he said. “We are doing mathematical models of the brain or cognitive system, then putting that in software, and it becomes the controller for the robot.”
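The idea of turning neuromodulator dynamics into a robot controller can be sketched in a few lines. The toy model below is purely illustrative, not the CARL lab's actual equations: a serotonin-like signal (anxiety and withdrawal) and a dopamine-like signal (reward seeking) compete to decide whether the robot hugs the wall or crosses the open room, and gradual habituation shifts the balance, as in the rodent example.

```python
# Illustrative sketch only: a toy "neuromodulated" controller in the
# spirit of the rodent example. Variable names and the update rule are
# assumptions for illustration, not the CARL lab's actual model.

class NeuromodulatedController:
    """Serotonin-like signal drives withdrawal (wall-hugging);
    dopamine-like signal drives exploration (crossing the room)."""

    def __init__(self, serotonin: float = 0.9, dopamine: float = 0.1):
        self.serotonin = serotonin  # anxiety / withdrawal drive
        self.dopamine = dopamine    # reward-seeking / exploratory drive

    def familiarize(self, rate: float = 0.05) -> None:
        # Habituation: as the environment becomes familiar, the
        # anxiety signal decays and the exploratory drive grows.
        self.serotonin = max(0.0, self.serotonin - rate)
        self.dopamine = min(1.0, self.dopamine + rate)

    def choose_action(self) -> str:
        # The robot crosses the open space only once exploratory
        # drive outweighs the anxiety signal.
        return "cross_room" if self.dopamine > self.serotonin else "hug_wall"

robot = NeuromodulatedController()
print(robot.choose_action())   # anxious at first: "hug_wall"
for _ in range(20):
    robot.familiarize()        # the environment becomes familiar
print(robot.choose_action())   # now explores: "cross_room"
```

Tuning the decay rate, or preventing serotonin from ever falling below dopamine, reproduces the "so anxious it would never cross the room" behavior Krichmar describes.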
LeCarl is one of many new robots programmed to learn the way humans and animals do. It uses a smartphone to receive commands and sends back video and sensor readings. Cognitive Anteater Robotics Laboratory, UC Irvine
Why? Krichmar believes that making a robot exhibit fear or caution might help it make better decisions. A search-and-rescue drone, for example, might stay put during foul weather instead of taking a risk to complete its mission. At other times, it might be better to build a robot that doesn’t care what dangers it faces.
Krichmar has also developed “Carl’s Junior,” a sensitive robot that looks like a turtle with colored stripes across its shell. This therapeutic robot is being used at a nearby school to help children on the autism spectrum, who seem to respond well to an inanimate yet responsive object.
In contrast, Michele Rucci at Boston University is working on making robots see better. He has found that robots perceive poorly not because they lack good cameras or sensors, but because they can’t make the tiny movements of the eyes and head that people make thousands of times per second. So Rucci’s team figured out how to capture these movements and transfer them to a humanoid robot.
“As the robot is moving with subtle eye and head movements, its point of view is subtly changing,” Rucci said. “It gives a cue about the three-dimensional structure of the environment. We use this to determine how far the objects are from the robot.”
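The depth cue Rucci describes follows the same geometry as stereo disparity: a small, known head or eye translation shifts nearby objects across the image more than distant ones. A minimal sketch, assuming a pinhole camera model (the function name and numbers are illustrative, not the team's actual pipeline):

```python
def depth_from_parallax(focal_px: float, baseline_m: float, shift_px: float) -> float:
    """Estimate distance to a feature from motion parallax.

    focal_px   -- camera focal length in pixels (pinhole model)
    baseline_m -- how far the eye/camera translated sideways, in meters
    shift_px   -- how many pixels the feature moved in the image

    Nearer objects shift more, so depth = focal * baseline / shift.
    """
    if shift_px <= 0:
        raise ValueError("feature must shift for depth to be observable")
    return focal_px * baseline_m / shift_px

# A 5 mm sideways micro-movement that shifts a feature by 4 pixels
# (with an 800-pixel focal length) puts the object about 1 m away.
print(depth_from_parallax(800, 0.005, 4))  # → 1.0
```

The same equation shows why micro-movements matter: with a baseline of only a few millimeters, the pixel shift from distant objects is nearly zero, so the robot must detect very small image displacements to recover depth.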
Carl’s Junior is a sensitive robot that looks like a turtle. UC Irvine
Other teams are presenting their own takes on robot development, such as a University of Plymouth (UK) project that teaches a robot to learn like a baby, beefing up the "brains" behind robot cognition, decision-making and movement.
Where is this field headed? “You will see robots with these capabilities actually doing things in the home or for search and rescue,” Krichmar said. “The time is right, and it’s moving.”