
Two PhD students from the University of Florida, Marvin Andujar and Chris Crawford, have built a mind-machine system that controls a drone through a wearable electroencephalographic (EEG) brain-computer interface (BCI). The drone operates based on the user's cognitive commands: when the user thinks "forward," the drone moves forward in the direction it is facing. While the drone flies, the user can view the flight in first-person view (FPV) via a front-facing camera.

The project originated in 2012, when Andujar and Crawford both began their PhDs. They joined forces because Crawford has a background in Human-Robot Interaction and Andujar's research expertise is in Brain-Computer Interfaces. The idea came to them when they saw a commercial drone in a store at a mall, brainstormed, and said: let's make controlling a drone with our brains a reality. Their Parrot AR.Drone 2.0 is paired with an Emotiv EPOC neuroheadset, and it takes about 3 minutes of training to learn each maneuver.
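The control loop described above amounts to translating a classified mental command from the headset into a motion command for the drone. The sketch below illustrates that mapping step in Python; the command names, velocity values, and function are hypothetical and do not represent the actual University of Florida code or the Emotiv/Parrot APIs.

```python
# Hypothetical sketch of the command-mapping step: a classifier label
# produced from the EEG headset (e.g. "forward") is translated into a
# velocity command for the drone. All names and values are illustrative.

# Map each trained mental command to (pitch, roll, yaw, vertical) velocities
# in the drone's body frame, each normalized to the range [-1.0, 1.0].
COMMAND_TO_VELOCITY = {
    "forward":  (0.5, 0.0, 0.0, 0.0),   # pitch forward -> fly ahead
    "backward": (-0.5, 0.0, 0.0, 0.0),  # pitch back -> fly in reverse
    "up":       (0.0, 0.0, 0.0, 0.5),   # climb
    "down":     (0.0, 0.0, 0.0, -0.5),  # descend
    "neutral":  (0.0, 0.0, 0.0, 0.0),   # no active thought -> hover
}

def velocity_for(command: str) -> tuple:
    """Return the velocity tuple for a classified mental command,
    hovering in place if the command is unrecognized."""
    return COMMAND_TO_VELOCITY.get(command, COMMAND_TO_VELOCITY["neutral"])
```

In a real system this lookup would run continuously, with the classifier re-evaluating the EEG signal several times per second and the drone falling back to hover whenever no trained command is detected.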

“This project serves as the beginning of brain-machine control as a human-centric application”, says Marvin Andujar, who is also a member of our Neurogadget editorial team.

The idea is to use BCIs with drones and other machines, such as humanoid robots, as a "third arm" for humans, giving people efficient control of their machines while they perform real-world tasks. This could be helpful when both hands are full and you need to carry something else or open a door; in such cases, a third arm would be useful.

The next step for this research is to incorporate a two-way system that monitors the user's levels of engagement, cognitive workload, and emotional state while the drone is under cognitive control. The researchers believe that to make the technology usable and provide a positive user experience, they need to understand how users feel.

By Marvin Andujar

Video credits: The Division of Multimedia Properties at University of Florida