If you're feeling depressed, chances are it's written all over your face. So much so that even a computer-generated psychologist can tell.
Researchers at the University of Southern California (USC) are developing SimSensei, a Kinect-driven avatar system capable of tracking and analyzing telltale signs of psychological distress.
Created by Stefan Scherer and colleagues at USC's Institute for Creative Technologies, the avatar psychologist uses facial recognition technology and a depth-sensing camera to read a person's facial expressions, body movements, posture, linguistic patterns and vocal acoustics to screen for depression.
"Broad screening is done by using only a checklist of yes/no questions or point scales, but all the non-verbal behavior is not taken into account," Scherer told New Scientist. "This is where we would like to put our technology to work."
For their study, the researchers used Kinect to record interviews as volunteers responded to questions asked by "Ellie," the avatar psychologist. Depressed subjects smiled less, averted their gaze and fidgeted more than those who were not depressed — all cues that SimSensei's system can detect.