Mind Meld! Top Brain-Controlled Techs

Science expands what we can achieve with just our thoughts.

Neuroscientists are racing to perfect brain-computer interfaces, and as they do, the line between sci-fi and reality blurs. Iron Man exoskeletons, Matrix-style brain downloads and Vulcan mind-melds feel within reach.

"I think nothing is impossible up front," said Deniz Erdogmus, an associate professor of electrical and computer engineering at Northeastern University specializing in noninvasive brain-computer interfaces or BCI for short. He and his colleagues are developing a brain-controlled keyboard to help people with speech impairments communicate rapidly.

While we're not at Spock level yet, Erdogmus shares some of the most promising brain-controlled tech out there.

When the connection between brain and limb is severed, a robotic arm could bridge the gap. Last year the BrainGate research team, including Stanford's Krishna Shenoy and Brown University's John Donoghue, helped a paralyzed woman operate a sophisticated robotic arm to sip coffee with just her mind. And a different patient fed herself string cheese by thought alone using Hector, a robotic arm built by University of Pittsburgh neurobiology professor Andrew Schwartz and his team.

Meanwhile, the international Walk Again Project, led by Miguel Nicolelis, a medical doctor and co-director of the Duke University Center for Neuroengineering, has a more ambitious goal: a whole brain-controlled robotic exoskeleton.

But such intense control currently requires surgical interventions. "My guess is invasive technologies will be the choice in the future if you really want to control a robot hand like in Star Wars, for example, where they put this robot hand on Luke Skywalker," Erdogmus said.

The idea to construct Iron Man-like exoskeletons controlled by the brain has been around for a while, Erdogmus said. But in reality the driving forces are medical and include restored movement for paralysis patients.

Last year the noninvasive MindWalker exoskeleton debuted in Belgium and is currently in clinical trials. This exoskeleton is intended to work with a cap containing lightweight EEG biosensors, although New Scientist reported recently that the motors have interfered with the signal readings. The project team expects to begin commercializing the MindWalker in five years.

While cool, robotic appendages aren't number one on the wish list. "According to my clinical collaborator who deals with locked-in people on a daily basis, people want as much independence as possible," Erdogmus said. "The top desire is to communicate."

Several teams are advancing machines to decode brain signals and translate them into speech or text. When Boston University neuroscientist Frank Guenther and his team collaborated with Phil Kennedy of Neural Signals Inc., they made headlines in 2009 by helping a locked-in man use his mind to articulate vowels.

Last year Brian Pasley at the University of California, Berkeley, and his colleagues demonstrated an algorithm to interpret signals from patients undergoing surgery to treat epilepsy or a tumor. A second program then reconstructed what they were thinking. While it could only parse some words, it's a step closer to a thought translator.
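At its core, this kind of decoding is a regression problem: learn a mapping from recorded neural features to a sound's spectrogram, then apply that mapping to new recordings. The sketch below uses ordinary least squares on fabricated data purely to illustrate the shape of the approach; the electrode counts, feature choices and linear model are assumptions, not Pasley's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(42)

# Fabricated training data: 200 time points of neural features
# (say, high-gamma power on 16 electrodes) paired with an 8-band
# audio spectrogram recorded at the same moments.
n_times, n_electrodes, n_bands = 200, 16, 8
true_map = rng.standard_normal((n_electrodes, n_bands))
neural = rng.standard_normal((n_times, n_electrodes))
spectrogram = neural @ true_map + 0.01 * rng.standard_normal((n_times, n_bands))

# Fit a linear decoder by least squares: spectrogram ~= neural @ W.
W, *_ = np.linalg.lstsq(neural, spectrogram, rcond=None)

# Reconstruct the spectrogram for held-out neural activity.
new_neural = rng.standard_normal((5, n_electrodes))
reconstructed = new_neural @ W
expected = new_neural @ true_map
print(np.allclose(reconstructed, expected, atol=0.1))  # True
```

In a real system the reconstructed spectrogram would then be inverted back into audio, which is where the "hearing thoughts" effect comes from.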

Robot arms are growing more sophisticated all the time, and their degrees of freedom are approaching those of a human arm. Still, controlling a robot arm with your brain isn't the same as moving paralyzed limbs again.

Northwestern University neuroscience professor Lee Miller and his colleagues want to revive that control. Last year they debuted a brain-machine interface that bypasses the spinal cord and delivers messages straight from the brain to the muscles in a paralyzed hand. In a study with monkeys, the decoder allowed the animals to grasp and move a ball.

"It is one of the most natural and exciting approaches to restoring function in people with paralysis," Erdogmus said.

When you can't manage tasks by yourself, could a brain-controlled robot assist? Computer science and engineering professor Rajesh Rao and his colleagues at the University of Washington are developing interfaces for doing just that.

Rao's group tapped a humanoid PR2 robot called Hobbes made by robotics research lab Willow Garage. Then they created a system so that a human wearing an electrode cap could monitor and control Hobbes remotely, getting him to perform basic tasks like selecting an object from a table, picking it up, and moving it to a specific spot.

The robot succeeded even though the system relies on surface-level brain signals. "From my perspective, we can design applications that would suit people who don't want their skull drilled," Erdogmus said. "If the performance is acceptable, it's less risky."

University of Minnesota biomedical engineering professor Bin He and his students are also pushing the limits of noninvasive BCI. Recently they demonstrated a system for flying a small remote-controlled helicopter by picturing one fist or two fists. An electroencephalography, or EEG, cap reads and transmits the signals so the operator can steer.
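The fist-imagery trick rests on a well-documented EEG effect: imagining a hand movement suppresses mu-band (roughly 8-12 Hz) power over the motor cortex on the opposite side of the head. Here's a minimal sketch of that idea with invented channel names, thresholds and a synthetic signal; it illustrates the principle, not the Minnesota team's actual pipeline.

```python
import numpy as np

FS = 256  # sampling rate in Hz (assumed)

def band_power(signal, fs, lo=8.0, hi=12.0):
    """Average power in the mu band (8-12 Hz) via a simple FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def classify(c3, c4, threshold=0.5):
    """Map left/right mu-power asymmetry to a steering command.

    c3, c4: one-second EEG windows from electrodes over the left and
    right motor cortex. Imagined right-hand movement suppresses mu
    power at C3 (left hemisphere), and vice versa.
    """
    p3, p4 = band_power(c3, FS), band_power(c4, FS)
    asym = (p4 - p3) / (p4 + p3)  # > 0 means C3 suppressed
    if asym > threshold:
        return "right"
    if asym < -threshold:
        return "left"
    return "hover"

# Synthetic demo: strong 10 Hz rhythm on C4, suppressed rhythm on C3,
# as if the operator were imagining clenching the right fist.
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
c3 = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)
c4 = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)
print(classify(c3, c4))  # "right"
```

A real system would add spatial filtering, calibration per user, and smoothing over several windows before sending a command to the helicopter.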

"The invasive people say that to control robots with highly complicated dynamics, you need to go deep into the brain," Erdogmus said. "The noninvasive people say you can do similar things with EEG only." He pointed to research by Jonathan Wolpaw, chief of the Wadsworth Center's Laboratory of Neural Injury and Repair, showing that EEG can be used just as effectively as invasive techniques to control a cursor.

Wolpaw's advancements also inspired entrepreneur Steve Castellotti, known for creating a brain-controlled helicopter called Orbit that works with open-source software. It introduces BCI to a new generation.

Cars can already drive by themselves, so it's no surprise that scientists are working on ways to make them even smarter by hooking them to our brains. In 2011, German researchers at the Freie Universität Berlin demoed the BrainDriver, a car that can be steered using the driver's EEG, though the driver first had to go through mental training.

Erdogmus referenced work by Scott Makeig, research scientist and director of the Swartz Center for Computational Neuroscience at the University of California San Diego. His system could give drivers alerts if they appear to be falling asleep. Last year Makeig put new lightweight mini EEG scanners into a helmet to monitor airplane pilots' mental states.
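A drowsiness alert like the one Erdogmus describes can be framed as a passive BCI in a few lines: as people get sleepy, slow theta-band (4-8 Hz) EEG power rises relative to faster beta rhythms. The ratio logic below is a hedged illustration with invented thresholds and synthetic signals, not Makeig's actual algorithm.

```python
import numpy as np

FS = 128  # sampling rate in Hz (assumed)

def band_power(signal, fs, lo, hi):
    """Average spectral power of `signal` between lo and hi Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def drowsiness_index(window, fs=FS):
    """Theta/beta power ratio; a rising value suggests the driver
    is drifting toward sleep."""
    theta = band_power(window, fs, 4.0, 8.0)
    beta = band_power(window, fs, 13.0, 30.0)
    return theta / beta

def should_alert(window, threshold=2.0):
    # Threshold of 2.0 is illustrative only; a real system would
    # calibrate it per person.
    return drowsiness_index(window) > threshold

# Demo: an "alert" window dominated by 20 Hz beta activity versus
# a "drowsy" window dominated by 6 Hz theta activity.
t = np.arange(FS) / FS
rng = np.random.default_rng(1)
alert_eeg = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(FS)
drowsy_eeg = np.sin(2 * np.pi * 6 * t) + 0.1 * rng.standard_normal(FS)
print(should_alert(alert_eeg), should_alert(drowsy_eeg))  # False True
```

Because the system only monitors mental state rather than decoding commands, it needs no effort from the driver, which is exactly what makes these interfaces "passive."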

"People in the noninvasive area call these types of interfaces passive brain-computer interfaces," Erdogmus said. And they're becoming more accurate all the time.

When he's not working on a thought-to-speech translator, Boston University's Frank Guenther is part of a multi-university initiative called the Unlock Project. The idea: make an open-source brain-computer interface so that locked-in people can control everything in their homes. And because the Unlock Project works with noninvasive EEG, no surgery is required.

Erdogmus said he and his colleagues recently started partnering with the project. Already, they have open-source applications for tasks like entering text on a computer or changing the TV channel, he said.

Scientists call the technology a "brain-to-brain interface," but the effect would be similar to a real Vulcan mind-meld. Leading the charge is Duke University's Miguel Nicolelis.

This year he linked the brains of rats on different continents. The animals were trained to do simple tasks and then split into "decoder" and "encoder" groups; microelectrodes implanted in their brains were connected through the Internet. Although the results were met with some skepticism, Nicolelis reported that the decoder rats chose the same lever as the encoder rats 60% of the time.

Given the progress so far, particularly with DARPA-funded research, Erdogmus thinks mind-melds could become real one day. "I wouldn't give up," he said. "I'm sure in the future people will figure out how to do it."

If we can think it, we can control it.