Robots Might Gather Evidence Better Than Humans in Child Abuse Investigations
Human facial expressions and subtle body language can influence children's responses during investigative interviews — bias that might be avoided with the use of robots.
When she was abused as a child, Cindy Bethel felt like she didn't trust anyone enough to share her experience. Instead she confided in Barbie and her stuffed animals. Years later, she discovered that this is a common response in children who have been abused.
Bethel, now a forensic interviewer and an associate professor at Mississippi State University, wants to give children an alternative to talking with adult investigators. As director of the university's Social, Therapeutic, and Robotic Systems Lab, she's developing software for a robot that she hopes will be able to conduct interviews more effectively than humans.
"I was not sure if they would talk with the robot," she said, "but I felt it was worth investigating."
Roughly 700,000 children were abused in 2014, according to the US Centers for Disease Control and Prevention, and nearly 1,600 died from abuse or neglect.
During a typical investigation into child abuse, a human interviewer follows a strict protocol designed to reduce bias toward any party involved in the investigation. The interviewer often asks open-ended questions like, "And then what happened?"
But humans sometimes have a hard time controlling their facial expressions and body movements, which might introduce bias. Or, if a human interviewer seems like an authority figure, a child could feel too intimidated to share information.
The idea of replacing human interviewers with robots stemmed from Bethel's postdoctoral training at Yale University's Social Robotics Lab, where she studied the secret-keeping behaviors of preschool-aged children. Although the results of her research weren't statistically significant, she noticed that some of the children talked more with a robot than with a human.
At Mississippi State University, she and Ph.D. student Zachary Henkel developed forensic interview software for two types of robots, each just under two feet tall. SoftBank Robotics' Nao is one of the more reliable, commercially available bots for human-robot interaction research. Nao has a wide range of body movements, but lacks animated facial features and a discernible gender. RoboKind's R25 robot, while not as physically agile as Nao, has an expressive humanoid face.
The researchers' software enables the robots to follow basic forensic interview protocol with guidance from a human operator working remotely. Cameras and sensors embedded in the robots eliminate the need for traditional, and sometimes obtrusive, video equipment and microphones.
"Early results are showing that the children seem to be as comfortable, if not more so, with sharing information with a robot interviewer, which is very promising," Bethel said.
The team's research was recently presented at the Conference on Human-Robot Interaction in Vienna.
Challenges remain, though.
One obstacle is that children might view the robot as a toy and engage with it playfully, producing inaccurate accounts, Henkel said.
The next step for Bethel and Henkel is to study how an interviewer's gender, whether human or robot, influences the amount and quality of information a child provides. To that end, they've collaborated with RoboKind on creating a female version of the company's standard "Milo" robot. They also hope to better understand the ways robot interviewers might inadvertently mislead children.
Bethel is already convinced of the robots' ability, and she said that ultimately she'd like to see them used in real-world interviews. More frequent and accurate disclosures, she said, could protect children from harm, and even death.