Building a robot bartender that mixes the perfect drink is a concrete project that can be boiled down to an exact science of recipes, ingredients and measurements. Building a robot that knows when a pub patron wants another drink — that's a little more abstract.
Nonetheless, researchers at Bielefeld University in Germany are working on a Joint Action in Multimodal Embodied Systems (James) robot that can interpret gestures and body language to determine who is ready for another pint.
Researchers placed cameras in German and English pubs to study how customers used physical cues and body postures to flag the bartender’s attention. Such behaviors included holding up money or wallets and waving, but most people just stood perpendicular to the bar and looked at the bartender.
Led by professor Jan de Ruiter of the Psycholinguistics Research Group, the research team is using that data and a variety of software packages to program James to recognize gestures and voices via Kinect sensor technology. So far, the researchers have established the one-armed, tablet-headed robot’s ability to determine a customer’s posture, movements and actions in real time.
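The pub observations suggest a simple decision rule: customers rarely wave, and mostly just face the bar and make eye contact with the bartender. A minimal sketch of that kind of rule-based intent check is below; the feature names, thresholds, and structure are illustrative assumptions, not the James project's actual software.

```python
from dataclasses import dataclass

@dataclass
class CustomerPose:
    """Hypothetical features a Kinect-style skeleton tracker might supply."""
    torso_angle_to_bar: float   # degrees; 90 = squarely facing the bar
    looking_at_bartender: bool  # e.g. from head-pose estimation
    waving: bool                # raised hand, money, or wallet gesture

def wants_drink(pose: CustomerPose) -> bool:
    """Guess whether a pose is a bid for service, per the observed cues."""
    facing_bar = abs(pose.torso_angle_to_bar - 90.0) <= 20.0
    # Waving is a strong but rare cue; the common signal was simply
    # standing perpendicular to the bar and looking at the bartender.
    return pose.waving or (facing_bar and pose.looking_at_bartender)

print(wants_drink(CustomerPose(88.0, True, False)))   # facing bar + eye contact
print(wants_drink(CustomerPose(30.0, False, True)))   # waving from the side
print(wants_drink(CustomerPose(30.0, False, False)))  # just chatting nearby
```

In practice the researchers' system would combine such posture cues with voice recognition and run continuously over live sensor data, but the core idea — mapping body orientation and gaze to service intent — is the same.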