At the LeWeb’10 conference in Paris, Ariel Garten, CEO of Interaxon, gave a talk about the future possibilities of brain-computer interface technology. Parts of her speech can be read below, and the complete presentation is also available on video:
As the Wall Street Journal Blog reports, Ariel wore a single, relatively discreet sensor attached to her forehead to show whether her current mental state was relaxed or concentrating.
Clearly a confident public speaker, she spent most of her time in front of the 3,000 delegates in a pretty relaxed state. It was only when she had to speak French, or was concentrating on a delegate’s question, that her brainwaves shifted from a relaxed alpha state to a focused beta state.
Trained as a neuroscientist, Ms. Garten worked variously as a fashion designer and a therapist before launching her company four years ago. “We are completely self-funded,” she said. “We bootstrapped ourselves. The initial money came out of my own pocket but that was rapidly paid back.”
Her company, Interaxon, uses existing sensor hardware from the Californian company NeuroSky and the Australia-based Emotiv Systems. “We do research to understand what new signals we can pull out, what new things you can do with new interactions, what new areas open up. And then we produce the applications.”
These are the early days of this technology. “It is a technology that allows you to know more about yourself. It allows you to understand where your brain is at.”
In her Paris presentation she showed a real-life example: an iPad app linked to a headset, with a rotating object on its screen. The more you concentrated on the object, the faster it rotated.
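The core of a demo like this is a simple mapping from an attention reading to rotation speed. The sketch below illustrates one plausible way to do it; the 0–100 attention scale, the speed cap, and the function names are assumptions for illustration, not details of Interaxon’s actual app (a real version would read attention values from the headset’s SDK and render the object on screen).

```python
# Hypothetical sketch: drive an on-screen object's rotation from a
# 0-100 attention value (the scale and cap are assumed, not from
# Interaxon's app).
MAX_SPEED_DEG_PER_S = 180.0  # assumed top rotation speed

def rotation_speed(attention):
    """Map a 0-100 attention value to rotation speed in degrees/second."""
    attention = max(0.0, min(100.0, attention))
    return MAX_SPEED_DEG_PER_S * attention / 100.0

def step_angle(angle, attention, dt):
    """Advance the object's angle by one frame of dt seconds."""
    return (angle + rotation_speed(attention) * dt) % 360.0
```

At full concentration the object spins at the capped speed; at zero it stands still, which matches the behaviour described in the demo.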
Thought-controlled computing had been waiting for computers to be powerful enough to handle the real-time signal processing, said Ms. Garten.
“The ability to crunch brain waves on the fly in anything more than one channel requires a lot of processor power. So to be able to crunch brain waves and run an application in real time also takes a lot of processing power. One of the things that has helped is MRI technology, which allows us to pinpoint more areas and parts of the brain we can think about.”
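To make the “crunching brain waves on the fly” concrete: the relaxed-versus-focused distinction mentioned earlier corresponds to comparing power in the alpha (roughly 8–12 Hz) and beta (roughly 13–30 Hz) EEG bands. Here is a minimal single-channel sketch of that idea using an FFT periodogram; the sampling rate, band edges, and thresholding are textbook assumptions, not Interaxon’s actual pipeline.

```python
import numpy as np

FS = 256  # assumed sampling rate in Hz, typical for consumer EEG headsets

def band_power(signal, fs, low, high):
    """Total power in the [low, high] Hz band, via an FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].sum()

def relaxed_vs_focused(signal, fs=FS):
    """Label a window 'relaxed' if alpha power dominates beta power."""
    alpha = band_power(signal, fs, 8.0, 12.0)   # relaxed-state band
    beta = band_power(signal, fs, 13.0, 30.0)   # focused-state band
    return "relaxed" if alpha > beta else "focused"
```

Running this over sliding windows of one channel is cheap on modern hardware; doing it across many channels simultaneously, with artifact rejection and application logic on top, is where the processing demands Ms. Garten describes come in.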
“It is hard to predict where the technology is going because you have two different factors. One is where the technology is, and the second is the adoption rate.”
“So presuming adoption in the next few years means that in three to five years your average person is comfortable wearing a single-sensor system, then we can probably have some pretty interesting interactions on a ten-year time scale, where you are controlling a mouse, controlling the lighting in your home, and handling lots of other everyday interactions in a pretty seamless way.
“At that point I don’t think you are going to be thinking ‘blue’ and your screen will turn blue, that is something that is probably at least 20 years down the road, but we will probably be pretty comfortable using brain waves to control simple things.”