At the INK Conference in Lavasa, serial tech entrepreneur Nam Do demonstrated Emotiv EPOC, the next generation of the human-computer interface. It involves putting on a neuro headset which senses brainwaves and interprets them on the computer, on the basis of what you're thinking. Nam Do asked a volunteer from the audience to put on a headset. The system was first trained by the volunteer thinking of a movement, say, moving a box on screen closer to you. Then, post training, just thinking of the same movement would make the box perform that action.
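The train-then-recognize loop described above can be sketched in code. This is purely an illustration, not Emotiv's actual SDK or algorithm: it assumes the headset yields a feature vector per trial (here faked with synthetic 14-channel data, since the EPOC has 14 electrodes) and uses a simple nearest-centroid classifier to map a "thought" to a command.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training phase (hypothetical): the volunteer repeatedly imagines the
# action ("push") and rests ("neutral"); each trial yields one feature
# vector, standing in for per-channel EEG band-power features.
push_trials = rng.normal(loc=1.0, scale=0.3, size=(20, 14))
neutral_trials = rng.normal(loc=0.0, scale=0.3, size=(20, 14))

# The "trained" model is just the mean feature vector per mental command.
centroids = {
    "push": push_trials.mean(axis=0),
    "neutral": neutral_trials.mean(axis=0),
}

def classify(sample):
    """Return the label of the training centroid nearest to this sample."""
    return min(centroids, key=lambda label: np.linalg.norm(sample - centroids[label]))

# Live phase: a new thought-sample is matched against the trained commands.
live_sample = rng.normal(loc=1.0, scale=0.3, size=14)
if classify(live_sample) == "push":
    print("move box closer")  # the on-screen box responds to the thought
```

A production system would of course use far richer signal processing and per-user calibration, but the demo's essential shape, a short per-user training pass followed by live classification of intent, is what the sketch captures.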

Looks like the Kinect may become outdated soon. Of course, judging by the demo, the system will take some time to evolve before anyone can play a first-person shooter with it (where impulses must be interpreted immediately), but this is a start.

Nam Do said that the next generation of the human-computer interface will take into account not just conscious intent (what we're saying), but also non-conscious emotions. The computer will be able to understand how you respond to the material presented to you, and you'll be able to ask a machine to perform a task.

Source link: