Mind-Reading Computer Knows What You're About to Say
The research could one day give us the ability to control robots with a thought.
In what has to be the ultimate rock-paper-scissors game, a computer can decode which choice its opponent has made before a hand even moves.
Sophisticated mind reading seems more possible than ever.
The research was spearheaded by Toshimasa Yamazaki, a computer science and systems engineering professor in the Department of Bioscience and Bioinformatics at Kyushu Institute of Technology in Japan. He and his team studied a dozen men, women and children using an electroencephalogram or EEG to detect their brainwaves.
First he had them say "rock," "paper," and "scissors" in Japanese. Then he had them concentrate on each word without saying it, RocketNews24 reported. The results varied by person, but computer readings showed similar brainwave patterns for the spoken and unspoken tests, to the point where Yamazaki's team could frequently figure out which word the person was thinking.
Measuring brain signals and analyzing them this way is nothing new, but Yamazaki's work focuses on a region of the brain called Broca's area, which is involved in planning the movements of speech and in language comprehension. He also had the advantage of working in Japanese, a vowel-rich language; vowels have proven easier to read through EEG than consonants.
The team's algorithms could also regularly identify when test subjects were thinking the Japanese words for summer, fall, winter and spring. Their setup accurately identified single characters up to 88 percent of the time, the Daily Mail reported.
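The article doesn't describe the team's actual algorithms, but the general idea behind this kind of result can be illustrated with a toy sketch: record several brainwave readings per word, average them into a per-word template, then classify a new reading by which template it correlates with best. Everything below is synthetic and hypothetical, purely to show the template-matching idea, not the researchers' pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each word evokes a characteristic (latent) EEG
# feature vector; real readings are that pattern plus noise.
words = ["rock", "paper", "scissors"]
prototypes = {w: rng.normal(size=32) for w in words}

def simulate_reading(word, noise=0.5):
    """A noisy synthetic EEG feature vector for one silent repetition."""
    return prototypes[word] + rng.normal(scale=noise, size=32)

# Build per-word templates by averaging a few training repetitions.
templates = {w: np.mean([simulate_reading(w) for _ in range(20)], axis=0)
             for w in words}

def classify(reading):
    """Pick the word whose template correlates best with the reading."""
    return max(words, key=lambda w: np.corrcoef(reading, templates[w])[0, 1])

# Estimate accuracy on fresh synthetic test readings.
hits = sum(classify(simulate_reading(w)) == w
           for w in words for _ in range(50))
print(hits / 150)
```

On clean synthetic data like this, accuracy is high; real EEG is far noisier and varies by person, which is consistent with the "up to 88 percent" figure reported above rather than perfection.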
Although everything happens incredibly fast, the research shows that our brains are helping our bodies form the words before we actually speak them. Of course, anyone with foot-in-mouth syndrome understands that all too well.
Granted, it's only a few words for now, but Yamazaki and his colleagues think their technology could ultimately restore communication to people who have lost the ability to speak.
And he doesn't shy away from futuristic visions. "Sci-fi-like applications become possible, such as controlling robots with our minds," he told the newspaper Nishinippon Shimbun.
English-speaking neuroscientists can tell you how much tougher it is to achieve comparable results with all our consonants. That hasn't stopped teams stateside from working hard to get us there, though. Given the difference, maybe we should learn some basic Japanese now, just in case.
If we can think it, we can control it.