By Corinna Lathan, Apr 25 2014
Every human-computer interface is really a brain-computer interface; it’s just a matter of degree. Our intentions may be sent from our brain to the computer through our fingers and a keyboard, through a camera that tracks eye movement, or from sensors that read signals from the surface of the scalp or from individual neurons. It’s a continuum.
However, when we talk about “brain-computer interfaces” (BCIs) today, we are talking about capturing signals directly from the brain and using them to control an electronic device. This can be done in a few ways, such as through electroencephalography (EEG) sensors that record electrical impulses from the brain, or functional near-infrared spectroscopy (fNIR), which uses light to monitor blood flow in the brain. Sensors can also be implanted, but these less invasive technologies work with sensors worn in a headset that holds them in contact with the scalp.
These technologies are not mind readers just yet, but they can be trained to recognize patterns in controlled scenarios. We are still far from a scenario in which I put an EEG cap on your head and, as you think “red car”, I can tell that is exactly what you are thinking. What we can do now, for example, is train the system to recognize a choice among four icons, or to tell that you are thinking “red car” rather than “playing tennis”.
Thus, a patient with locked-in syndrome might train a BCI to distinguish between two thoughts, such as “playing tennis” versus “walking down the street”, and these could become their “yes” and “no” signals. To an extent, it doesn’t matter what the two thoughts are. We can’t say which neuron in the brain fires when you think about playing tennis, but we can train a BCI to distinguish between that electrical pattern and another pattern.
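The training described above is, at its core, a pattern-classification problem: record a few calibration trials of each imagined activity, then assign new brain signals to whichever learned pattern they most resemble. Here is a minimal sketch of that idea using a nearest-centroid classifier on synthetic “band-power” features; the feature values, class names, and noise levels are all illustrative assumptions, not real EEG data.

```python
import random

random.seed(0)

def band_powers(pattern):
    """Simulate a 4-element EEG band-power feature vector for a mental task.
    Real features would come from a headset; these values are synthetic."""
    base = {"tennis": [2.0, 5.0, 1.0, 3.0],
            "walking": [4.0, 2.0, 3.0, 1.0]}[pattern]
    return [b + random.gauss(0, 0.5) for b in base]

def train(samples):
    """Average the calibration trials for each label to form class centroids."""
    centroids = {}
    for label, vecs in samples.items():
        n = len(vecs)
        centroids[label] = [sum(v[i] for v in vecs) / n
                            for i in range(len(vecs[0]))]
    return centroids

def classify(centroids, vec):
    """Assign the label whose centroid is nearest (squared Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], vec))

# Calibration: twenty trials of each imagined activity.
training = {label: [band_powers(label) for _ in range(20)]
            for label in ("tennis", "walking")}
model = train(training)

# Map the two distinguishable thoughts onto "yes" and "no".
answer = {"tennis": "yes", "walking": "no"}[classify(model, band_powers("tennis"))]
```

As the article notes, it doesn’t much matter which two thoughts are chosen, so long as their electrical patterns are reliably distinguishable; the classifier only needs the two centroids to be well separated.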
Just as everyone walks in much the same way, but with differences in gait, pace and so on, so we use the same types of brainwaves for the same kinds of mental activities, though there will still be differences between individuals.
As BCIs have advanced we have built up a “library” of signals, so that we can create devices that have the ability to track three, four, or five patterns. We already have robust technology tools that enable us to obtain clean signals from the brain. It used to be necessary to wear 128 or even 256 leads on your head to get any useful information. Now we can get meaningful data with 16, eight, or even four leads, depending on the task and the signal of interest. Now the magic will be in the software and what it can do with those signals.
In time we will be able to use these devices seamlessly for tasks such as controlling the cursor on a computer screen or interacting, hands-free, with mobile phones. One of the projects that we’re currently working on with the U.S. Navy is how to use both BCIs and physiological sensing to optimize individual and team training.
For example, to make training as effective as possible, you could have a BCI that monitors whether you are paying attention properly. If your attention wanders, the computer could alert you or ask you to explain the material that was just covered. It would be part of an intelligent tutor that paces the learning and content to match your focus and attention.
BCIs could also be used to monitor employees in high-stress environments, such as air traffic controllers, or to identify post-traumatic stress disorder in military personnel or concussion in players of contact sports.
Given the advances in BCIs, it seems crazy that every time you visit the doctor your blood pressure, temperature, height and weight are checked, but not your brain vital signs. You don’t need a sophisticated BCI to track brain health; even simple measures such as reaction-time tests can be a good indicator of your brain’s processing speed.
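A “brain vital sign” of this kind could be as simple as comparing today’s reaction times against a person’s own baseline. The sketch below shows one way that might look; the trial values and the 50 ms alert threshold are illustrative assumptions, not clinical standards.

```python
from statistics import mean, stdev

def reaction_summary(times_ms):
    """Summarize reaction-time trials: mean speed and trial-to-trial
    variability, both of which track cognitive processing speed."""
    return {"mean_ms": mean(times_ms), "sd_ms": stdev(times_ms)}

def flag_slowdown(baseline, current, threshold_ms=50):
    """Flag when the current mean reaction time is markedly slower than
    the person's own baseline (threshold is an illustrative choice)."""
    return current["mean_ms"] - baseline["mean_ms"] > threshold_ms

# Hypothetical trials, in milliseconds, from two clinic visits.
baseline = reaction_summary([310, 295, 325, 300, 315])
today = reaction_summary([380, 365, 400, 390, 370])

alert = flag_slowdown(baseline, today)  # True: 72 ms slower than baseline
```

Tracking against an individual baseline, rather than a population norm, is what makes such a check useful as a routine vital sign: a 72 ms slowdown means little in isolation, but a lot relative to your own history.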
Today’s EEGs are passive monitors, but it’s easy to imagine a future where energy is directed into the brain. Last year, scientists at MIT used light to activate cells in the brains of genetically modified mice, implanting a false memory. We’re a long way from being able to do that with human beings, but we could see an extension of EEG technology to determine when your brain is in a state where it is most receptive to learning.
One profound application for BCIs will be awareness of other people’s emotions and brain states. Scientists at Princeton University have looked at speaker-listener pairs with both EEG and brain imaging and have shown that when two people are communicating, speaking and understanding each other, their brains are literally on the same wavelength. Not only that, the listener’s brain wave patterns start to precede the speaker’s brain wave patterns. You start to actually anticipate the other person’s brain waves.
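One way to make “the listener anticipates the speaker” concrete is a lagged-correlation analysis: shift one signal against the other and find the offset at which they match best. The sketch below does this with synthetic sine-wave “brain signals” rather than real EEG traces; the signal shapes and the lag are assumptions constructed so the listener leads.

```python
import math

def correlation(a, b):
    """Pearson correlation of two equal-length signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def best_lag(speaker, listener, max_lag=5):
    """Find the shift (in samples) at which the listener's signal best
    matches the speaker's. A positive lag means the listener's activity
    runs ahead of the speaker's."""
    scores = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag > 0:
            a, b = speaker[lag:], listener[:-lag]   # listener leads
        elif lag < 0:
            a, b = speaker[:lag], listener[-lag:]   # listener trails
        else:
            a, b = speaker, listener
        scores[lag] = correlation(a, b)
    return max(scores, key=scores.get)

# Synthetic traces: the listener's signal is the speaker's shifted two
# samples earlier, i.e. the listener anticipates the speaker.
speaker = [math.sin(0.3 * t) for t in range(100)]
listener = [math.sin(0.3 * (t + 2)) for t in range(100)]

lag = best_lag(speaker, listener)  # 2: listener leads by two samples
```

A positive best-fit lag is exactly the signature the Princeton work describes: the listener’s activity is not merely echoing the speaker’s, it is running ahead of it.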
That type of data will have a profound impact on the way people interact. Imagine going into every meeting knowing exactly who’s paying attention to you, who’s on the same wavelength as you, literally. Imagine having that kind of information. It will change every single dynamic that you encounter.
To learn more about new technology trends sign up for the Global Information Technology Outlook module. Also discover other modules on various topics on Forum Academy, the online professional leadership development platform of the World Economic Forum.
Author: Corinna Lathan, Founder and Chief Executive Officer of AnthroTronix, an engineering research and development company; member of the World Economic Forum’s Global Agenda Council on Robotics & Smart Devices
Image: A woman poses with a brain-computer interface in Hanover, April 22, 2012. REUTERS/Morris Mac Matze