Inspiration

We were inspired by current research in EEG and brain-controlled devices, and wanted to see how far we could push EEG as an input method.

What it does

Our solution is a quick and intuitive way to control drones, eliminating bulky controllers. You can fly a drone entirely with facial gestures and movements, and it takes very little training to operate. The drone constantly records location, temperature, and humidity, and identifies objects and people in its video stream. We see this being useful in rescue missions and in helping people easily explore new environments - without requiring an experienced drone pilot, and without the danger of physically being in the environment.

How we built it

We integrated the Muse headband with the Tello drone, drawing on EEG research. We apply signal-processing techniques to the incoming EEG waveforms to associate distinct waveform patterns with specific facial gestures. From each detected gesture we extract a movement "intention" and move the drone in that direction. We focused on making the controls intuitive and immersive; we wanted the drone to feel like an extension of the human user.
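
As a rough illustration of that pipeline, here is a minimal sketch assuming the Muse streams EEG over LSL (e.g. via muselsl/pylsl) and the Tello is driven with djitellopy. The channel picks, thresholds, and the gesture-to-direction mapping are illustrative placeholders, not our exact implementation.

```python
# Minimal sketch: read EEG windows from the Muse over LSL, classify a
# facial gesture from simple peak-to-peak amplitude, and send a matching
# movement command to the Tello. Thresholds and the gesture mapping are
# hypothetical.
import numpy as np
from pylsl import StreamInlet, resolve_byprop
from djitellopy import Tello

WINDOW = 256         # ~1 s of samples at the Muse's 256 Hz rate
BLINK_THRESH = 150   # peak-to-peak uV on frontal channels (hypothetical)
CLENCH_THRESH = 200  # peak-to-peak uV on temporal channels (hypothetical)

def classify(window):
    """Map a window of EEG samples (channels x samples) to an intention."""
    frontal = window[[1, 2], :]   # AF7, AF8 on the Muse
    temporal = window[[0, 3], :]  # TP9, TP10 on the Muse
    if np.ptp(temporal, axis=1).max() > CLENCH_THRESH:
        return "forward"          # jaw clench -> move forward
    if np.ptp(frontal, axis=1).max() > BLINK_THRESH:
        return "up"               # hard blink -> climb
    return None

def main():
    streams = resolve_byprop("type", "EEG", timeout=10)
    if not streams:
        raise RuntimeError("No EEG stream found; is the Muse streaming over LSL?")
    inlet = StreamInlet(streams[0])

    drone = Tello()
    drone.connect()
    drone.takeoff()

    buffer = []
    try:
        while True:
            sample, _ = inlet.pull_sample()
            buffer.append(sample[:4])        # keep the 4 EEG channels
            if len(buffer) < WINDOW:
                continue
            window = np.array(buffer).T      # channels x samples
            buffer = []
            intention = classify(window)
            if intention == "forward":
                drone.move_forward(30)       # distances in cm
            elif intention == "up":
                drone.move_up(30)
    finally:
        drone.land()

if __name__ == "__main__":
    main()
```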

Challenges we ran into

We were unable to get the video feed working at the same time as the drone controller and the signal-processing unit. We attempted a concurrent approach, running three processes in parallel, but couldn't get it working properly. Only two processes would run simultaneously :(
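
For context, the rough shape of what we attempted looks like the sketch below: three processes (signal processing, drone control, video), wired together with a queue via Python's multiprocessing. The worker bodies are placeholders; this shows the architecture rather than our exact code.

```python
# Sketch of the three-process layout we attempted. Each worker body is a
# placeholder; in the real system the video process is the one that
# refused to coexist with the other two.
import multiprocessing as mp
import time

def signal_processing(intent_queue):
    """Read EEG windows, classify gestures, and push movement intentions."""
    while True:
        intent_queue.put("forward")   # placeholder for real classification
        time.sleep(1.0)

def drone_control(intent_queue):
    """Pop intentions off the queue and translate them into drone commands."""
    while True:
        intention = intent_queue.get()
        print(f"would send drone command: {intention}")

def video_feed():
    """Pull frames from the drone's video stream and run detection on them."""
    while True:
        time.sleep(1.0)               # placeholder: decode the Tello stream here

if __name__ == "__main__":
    intents = mp.Queue()
    workers = [
        mp.Process(target=signal_processing, args=(intents,)),
        mp.Process(target=drone_control, args=(intents,)),
        mp.Process(target=video_feed),
    ]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```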

Accomplishments that we're proud of

We were able to extract movement intention with relatively high fidelity!

What we learned

We experienced a rigorous design and prototyping process in 24 hours, and learned plenty about how we work as a team!

What's next for EagleEye

We hope to integrate machine learning algorithms to tailor the signal processing to variations in EEG waveform behaviour.
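
One possible shape for that step, assuming band-power features and an off-the-shelf classifier (scikit-learn's LDA here, purely as a placeholder), is a short per-user calibration that replaces hand-tuned thresholds:

```python
# Sketch of per-user calibration: learn a mapping from short EEG windows
# to gesture labels so thresholds no longer need hand-tuning for each
# person's waveforms. Feature choice (band power via Welch's method) and
# the classifier are assumptions, not a finished design.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 256  # Muse EEG sampling rate, Hz
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(window):
    """window: channels x samples -> flat vector of per-channel band powers."""
    freqs, psd = welch(window, fs=FS, nperseg=min(window.shape[1], FS))
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))
    return np.concatenate(feats)

def train_user_model(windows, labels):
    """Fit a classifier on labelled calibration windows from one user."""
    X = np.array([band_power_features(w) for w in windows])
    clf = LinearDiscriminantAnalysis()
    clf.fit(X, labels)
    return clf

# During flight, the trained model would replace the fixed thresholds:
#   intention = clf.predict([band_power_features(window)])[0]
```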

Built With

muse, tello