Turns out there's actually a term for this squeamishness. It's called the "uncanny valley," and it's the reason people like myself feel jittery around realistic robots that just aren't realistic enough. Many moons ago, roboticists observed that as a robot becomes more humanlike, our affinity for it grows, until it gets close to human but not quite there. At that point, affinity plunges into unease: the uncanny valley.
However, to overcome this unease, robots must look less like expressionless corpses. Taking a stab at doing just that is Nicole Lazzeri at the University of Pisa in Italy. She and her colleagues have created a facial animation engine that attempts to give more realistic facial expressions to a humanoid robot dubbed FACE.
Lazzeri's team used 32 motors under FACE's polymer skin and around its skull and upper torso to more closely mimic real facial expressions. The team also drew on the 30-year-old Facial Action Coding System (FACS), which breaks facial expressions down into the individual muscle movements that produce them.
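To give a flavor of how a FACS-driven approach works, here's a minimal sketch in Python. This is not the Pisa team's actual code; the names `EMOTION_TO_AUS` and `aus_for` are hypothetical, and the emotion-to-action-unit sets are the commonly cited textbook combinations, which vary a bit between sources.

```python
# Illustrative sketch: FACS describes any facial expression as a combination
# of numbered "action units" (AUs), each tied to a specific muscle movement.
# A robot like FACE can translate an emotion into a set of AUs, and then map
# each AU onto one or more of its motors.

# Commonly cited textbook AU combinations for basic emotions (assumed here
# for illustration; exact sets differ between sources).
EMOTION_TO_AUS = {
    "happiness": [6, 12],        # cheek raiser + lip corner puller
    "sadness":   [1, 4, 15],     # inner brow raiser, brow lowerer, lip corner depressor
    "surprise":  [1, 2, 5, 26],  # brow raisers, upper lid raiser, jaw drop
    "anger":     [4, 5, 7, 23],  # brow lowerer, lid raiser/tightener, lip tightener
}

def aus_for(emotion: str) -> list[int]:
    """Return the FACS action units for an emotion, or [] if unknown."""
    return EMOTION_TO_AUS.get(emotion.lower(), [])
```

The appeal of this design is that the hard perceptual question ("what does anger look like?") is answered once, by the coding system, and the robot only has to solve the mechanical question of actuating each action unit.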
FACE displayed six emotions: anger, disgust, happiness, sadness, surprise and fear. The team tested the robot's accuracy by asking five autistic and 15 non-autistic children to identify FACE's emotions; a psychologist was also on hand to perform the same expressions for comparison. Both groups correctly identified anger, sadness and happiness but had trouble with disgust, fear and surprise.
BLOG: 'Popchilla' Robot Could Help Autistic Kids
While FACE might be a step up from the wax museum, I'll be honest: I'm still feeling a bit of that uncanny-valley unease. Check out the following video and let me know what you think.
via Gizmodo, New Scientist. Credit: YouTube screen grab