AI Interviewers Entice Reluctant Soldiers to Report PTSD Symptoms
When soldiers are allowed to build rapport with artificially intelligent interviewers in otherwise anonymous interviews, they become much more likely to report signs of PTSD.
After almost 17 years of fighting in Afghanistan, the United States is in the midst of the longest war in its history — with no end in sight. The human costs, both for Afghans and Americans, have been tremendous. Even soldiers who return from the battlefield without physical injuries often suffer devastating bouts of post-traumatic stress disorder, or PTSD, which can lead to suicide.
Yet in many cases, especially milder ones, PTSD goes undiagnosed. One reason is that soldiers sometimes fear their careers could be jeopardized if a diagnosis is put on their record. When written tests are administered anonymously, soldiers report higher rates of PTSD symptoms than when their names are attached to the results. But those anonymous questionnaires have a shortcoming: They lack the personal connection that patients may feel with a human interviewer.
“When people feel rapport with a friend or doctor, they’re more willing to open up and say, ‘I’m going through these symptoms,’” Gale Lucas, a researcher and mental health specialist at the University of Southern California, told Seeker. “To most people, it might seem contradictory: You can pick anonymity or connection, but you can’t have both.”
Now, in the open-access journal Frontiers in Robotics and AI, Lucas and her colleagues describe a novel technique that uses artificially intelligent interviewers to close the gap between anonymity and personal rapport in identifying symptoms of PTSD.
An early study by Lucas’s team at the University of Southern California found that using artificially intelligent interviewers revealed vastly higher rates of PTSD among participants than either regular testing methods or anonymous surveys.
In a group of volunteers from the Colorado National Guard who took the US military’s standardized Post-Deployment Health Assessment, roughly one in four people reported symptoms of PTSD.
When the assessment was made anonymous, so that the results couldn’t be traced back to an individual or harm their career, one in three of the subjects showed symptoms of PTSD.
Then researchers deployed Lucas’s new method: letting the soldiers answer a series of questions about their experiences posed by a virtual interviewer before switching to questions meant to identify symptoms of PTSD.
That technique proved far more effective than the other two: Three out of four volunteers revealed symptoms of PTSD after building rapport with the virtual interviewer.
“Ultimately, the virtual-human systems could be used to provide feedback and tell them what their risk is,” Lucas said. “Anonymity is a necessary condition to allowing war-fighters to report their PTSD, but it turns out we can do even better. These people are faced with stigma around mental health, but this system can give them a safe space to talk and open up — a place where there’s no stigma. This technology is meant to get people to start seeing that they might have symptoms.”
While the virtual interviewer can help soldiers identify their own symptoms, Lucas said she personally believes that when it comes to treatment, real humans should still be the ones to conduct therapy sessions.
“My personal perspective is that we should let technology do what it’s good at, and let humans do what they’re good at,” Lucas said. “I believe human therapists are very good at their jobs, and I don’t intend to replace them any time soon. But screening, getting people to think that they might need therapy, is where computers can help.”