In recent days, dedicated British tabloid readers may have spotted some rather alarming headlines splashed across the digital newsstand.
The Daily Express: “This new mind-reading machine translates your thoughts as text!”
The Daily Mail: “New mind-reading machine can translate your thoughts and display them as text INSTANTLY.”
Don't worry. Headline writers for UK tabloids are better known for their enthusiasm than accuracy.
The device in question doesn't actually read thoughts. But it gets pretty close.
Researchers at the University of California, San Francisco have developed a neuroprosthesis that monitors brain activity and recognizes when a person hears a particular phrase or sentence. By analyzing the brain waves generated by speech sounds, the system identifies the words the person is hearing.
In other words, the machine doesn't transcribe spoken words; it transcribes the brain signals triggered when a person hears spoken words.
Technically speaking, the brain-computer interface is performing real-time speech decoding of brain waves, said David Moses, lead author of the new research published in the Journal of Neural Engineering. With further development, the technology could potentially help patients with severe speech impediments or locked-in syndrome.
“Each of the participants in this study listened to ten pre-recorded sentences multiple times,” Moses told Seeker. “Our software collected and processed each patient’s brain activity while the patient was listening to these sentences.”
RELATED: How the Brain Constructs Meaning From Changes in the Pitch of a Person’s Voice
Using information about the speech sounds, the software mapped the brain activity observed while the patient heard the sentences onto those sounds.
“We can then use this mapping to reliably predict which sentence the patient hears in real-time using only his or her brain activity,” Moses said.
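The prediction step Moses describes can be pictured as a simple template-matching classifier: learn an average neural response per sentence from repeated listens, then pick whichever template best matches a new recording. The sketch below is purely illustrative and uses synthetic data; the actual UCSF system decodes phonemes with far more sophisticated models.

```python
import numpy as np

def learn_templates(recordings):
    """recordings: dict mapping sentence -> list of (channels x time) trials.
    Returns one averaged 'template' response per sentence."""
    return {sent: np.mean(trials, axis=0) for sent, trials in recordings.items()}

def predict_sentence(templates, trial):
    """Return the sentence whose template correlates best with the new trial."""
    def score(template):
        return np.corrcoef(template.ravel(), trial.ravel())[0, 1]
    return max(templates, key=lambda s: score(templates[s]))

# Toy demo: fabricate a fixed "brain pattern" per sentence, plus trial noise.
rng = np.random.default_rng(0)
true_patterns = {f"sentence {i}": rng.normal(size=(8, 50)) for i in range(10)}
recordings = {
    sent: [pattern + 0.3 * rng.normal(size=pattern.shape) for _ in range(5)]
    for sent, pattern in true_patterns.items()
}
templates = learn_templates(recordings)

# A fresh noisy observation of "sentence 3" should be classified correctly.
new_trial = true_patterns["sentence 3"] + 0.3 * rng.normal(size=(8, 50))
print(predict_sentence(templates, new_trial))
```

The toy version works because each synthetic sentence has a distinctive fixed pattern; real neural data is far noisier and varies trial to trial, which is why the published system decodes at the phoneme level rather than matching whole sentences.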
Right now, the system only decodes words the subject is actively listening to. But down the line, Moses said, the system could potentially decode words that subjects generate themselves.
“The way we tested the system in this first study was to use brain waves from the auditory cortex and sophisticated algorithms to decode the specific speech sounds, or phonemes, that participants heard in real-time,” Moses told Seeker. “In this study, participants were only listening to speech, but using signals from another brain region — the motor cortex — it may be possible to do something similar with speech the participant produces.”
There’s one potential drawback for the squeamish. The test subjects were all undergoing epilepsy treatment and had electrodes implanted directly on the surface of their brains, which the researchers tapped into.
“This unique situation allows doctors to identify the areas of the brain involved in generating seizures and also gives us the unique opportunity to study brain function directly, rather than using sensors placed outside the head,” Moses said.
He added that the ultimate goal of the research is to develop an external neuroprosthetic device that could help people “think out loud.”
“The system and results we present in this paper really reflect an initial and important step toward a speech neuroprosthetic device,” he said. “Ultimately, we think that a successful speech neuroprosthetic should be capable of processing neural signals in real-time, [translating] either perceived or produced speech and brain activity.”
RELATED: Brain-Computer Interface Allows Users to Compose Music With Only Their Thoughts
If that description sounds pretty close to mind reading, the point is not lost on Moses or the research team. Do we really want to be moving toward technology where machines can read our thoughts?
“This is a really important question, and one that I want to clarify and ensure that it reflects the true state of the field,” Moses said. “Unfortunately, a lot of the press coverage this and other research has gotten seems to claim that we have built a ‘mind-reading machine.’”
In fact, the technology described in the new research paper only works to decode brain activity while a person is listening to a specific set of speech sounds, Moses said.
“Fundamentally, our system is designed to help individuals who are unable to communicate — for example, patients with locked-in syndrome,” he said. “We demonstrate that this concept is possible, but for now only in extremely limited settings. It's unclear if a system like the one we designed will ever be very useful for reading thoughts — if that is ever even possible.”
So long as science proceeds carefully, including consulting with bioethicists during all phases of research and development, Moses is confident such technology will ultimately be beneficial.
“I think it will be possible to develop systems that can truly help people who are unable to communicate while also safeguarding the privacy of the mind.”