Ultimately, iCub could teach us more about human cognition.

Most robots are programmed. But a new humanoid robot called the iCub will soon be learning very much like a child does.

Steve Levinson and his team at the University of Illinois recently received the iCub from the RobotCub project, a five-year-long cognition study funded by the European Commission that started in 2004. Levinson's team is the only one in North America to receive this robot.

Long term, their studies could yield a robot capable of making independent decisions about household chores or care-giving. But the more immediate goal is to learn about human learning.

The iCub has more advanced sensors and finer motor control than any other humanoid robot available for research, said Levinson.

"The iCub is by far the most elaborate humanoid out there," said Levinson.

He and his team have a long list of ambitious goals for the robot, but are currently working on getting the basics down.

"We will be working on fine motor control," said Levinson.

His team has taught the robot basic sensorimotor skills, such as reaching, grasping, listening and moving its head and eyes. Levinson said these skills must be perfected before the planned projects can begin.

He has four major projects that his Ph.D. students will focus on: teaching the robot to juggle, to walk, to talk and to form memories.

The purpose of all of these projects is to understand how the iCub learns about its external world, according to Levinson.


"This is a more or less generally understood procedure that has not been done with humanoid robotics," said Levinson.

He said his team will show the robot things they want it to learn (e.g., juggling), and the iCub will master them through experience. The team will demonstrate how to juggle, then give the robot positive or negative feedback depending on whether it appears to be learning correctly.
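The demonstrate-then-reward loop Levinson describes can be sketched in a few lines of code. The following is a minimal toy sketch, not the lab's actual software; the action names and the stand-in "teacher" are hypothetical, and the scoring rule is just the simplest possible reinforcement update.

```python
import random

class FeedbackLearner:
    """Toy learner: chooses among demonstrated actions and reinforces
    whichever ones receive positive feedback from a teacher."""

    def __init__(self, actions):
        # Start with no preference among the demonstrated actions.
        self.scores = {a: 0.0 for a in actions}

    def choose(self):
        # Prefer the highest-scoring action; break ties randomly.
        best = max(self.scores.values())
        return random.choice([a for a, s in self.scores.items() if s == best])

    def feedback(self, action, positive):
        # Positive feedback raises an action's score; negative lowers it.
        self.scores[action] += 1.0 if positive else -1.0

# Hypothetical juggling-practice loop with a stand-in teacher that
# only approves of the "toss_high" motion.
learner = FeedbackLearner(["toss_low", "toss_high", "drop"])
for _ in range(20):
    action = learner.choose()
    learner.feedback(action, positive=(action == "toss_high"))

print(learner.choose())  # after training, the approved action dominates
```

After a handful of rounds the disapproved actions fall below zero and the learner settles on the rewarded one, which is the basic shape of learning through experience rather than explicit programming.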


The so-called legged locomotion project will involve teaching the robot how to walk. Levinson and his team want to learn about the iCub's mental representation of its external world based on how it learns to move.

Currently, a harness holds the robot up, but the iCub behaves as if it were standing on its own. Levinson and his team first hope to get it to stand without the harness by teaching it to sense the difference between having the harness on and off.


The language imitation project -- which is unique to Levinson's lab -- will complement the walking project. Speech is choreographed much like locomotion, according to Levinson, because movement is considered humans' earliest form of communication.


The self-organizing cognitive maps project will involve the iCub learning a mental representation of the real world. An object will be held up close to its face, allowing it to store an image of that object in its database, similar to a memory. It will then be able to recognize that object in its environment.
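The store-and-recognize behavior described above resembles a nearest-neighbor memory: each shown object leaves a stored signature, and later views are matched against the closest one. This is a minimal illustrative sketch under that assumption, not the project's actual system; the object names and hand-written feature vectors are invented for the example.

```python
import math

class ObjectMemory:
    """Toy 'cognitive map': stores one feature vector per shown object
    and recognizes later views by nearest-neighbor lookup."""

    def __init__(self):
        self.memories = {}  # object name -> stored feature vector

    def show(self, name, features):
        # Analogous to holding an object up to the robot's cameras:
        # store a signature of what it looks like.
        self.memories[name] = features

    def recognize(self, features):
        # Return the stored object whose signature is closest to this view.
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(self.memories, key=lambda n: dist(self.memories[n], features))

memory = ObjectMemory()
memory.show("ball", [0.9, 0.1, 0.1])   # hypothetical color/shape features
memory.show("block", [0.1, 0.2, 0.9])

# A slightly different view of the ball still matches its stored memory.
print(memory.recognize([0.8, 0.2, 0.2]))  # prints "ball"
```

Real systems would extract features from camera images rather than hand-written vectors, but the lookup step -- match a new view against stored memories -- is the same idea.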

The iCub may be new to the United States, but researchers in Europe have studied the robot for years. David Vernon is a freelance research scientist who has worked with the iCub since 2004.

"It's special because of the fact that it's got such a rich set of sensory motors," said Vernon, who helped design how the robot was able to think and function.

The iCub has 53 motors, giving it almost the same range of movement and flexibility as a child. Seven of those motors are in the head, letting the robot turn to look around.

"It's the best binocular head I have ever come across," he said.

Vernon also said it is the only research robot available that brings so many systems together -- no other research robot has two hands, two arms, a torso and a head.

Levinson said the iCub is vastly more sophisticated than anything his team has built to date. The lab's three existing robots, known as Trilobots, have limited capabilities in movement and learning. The team plans to transfer basic mathematical systems from the older robots to the iCub.