These days you can control almost anything with an API from your phone, watch, Alexa, or Siri. But that means buying more and more gadgets and investing in some company's IoT ecosystem.
There will come a day when everything is controlled by a BCI (brain-computer interface). I mean, Elon Musk just started a BCI company called Neuralink, so come on...
How I built it
This summer I made a Raspberry Pi-based robot called Ro, with various sensors and a web-based dashboard to control it remotely.
I just purchased an Emotiv Insight EEG headset, which should arrive before the hackathon.
I'm going to try to control the robot with thoughts alone, and call the new creation NeuRo.
Challenges I ran into
The main challenge will be whether I can accurately map thoughts to "mental commands" using the Emotiv SDK.
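If the headset cooperates, the plan is roughly: read mental-command events from the Emotiv SDK (each event being a command label plus a confidence/"power" value) and translate them into drive commands for the robot, dropping low-confidence detections. A minimal sketch of that mapping layer, where the specific labels and threshold are my assumptions rather than Emotiv's actual output:

```python
# Sketch: translate mental-command events into Ro drive commands.
# The event shape (label + power in 0..1) mirrors what EEG SDKs typically
# report; the labels and threshold below are assumptions for illustration.

COMMAND_MAP = {
    "push": "forward",
    "pull": "backward",
    "left": "turn_left",
    "right": "turn_right",
}

POWER_THRESHOLD = 0.5  # ignore weak/noisy detections

def to_drive_command(label, power):
    """Return a robot drive command, or None if the event should be ignored."""
    if power < POWER_THRESHOLD:
        return None
    return COMMAND_MAP.get(label)  # unknown labels -> None
```

Filtering on the power value matters because consumer EEG headsets are noisy; a threshold keeps the robot from twitching on every stray detection.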
What's next for NeuRo: Mind-controlled Robot
Once the headset arrives I'll test its accuracy, then set up the project's plumbing: making sure it can talk to the Pi over Bluetooth, and so on. If mind control doesn't work reliably, I'll pivot and demo whatever I can get working with the Emotiv.
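On the Pi side, Ro already has motor controls, so NeuRo mostly needs a small dispatcher that receives command strings (over Bluetooth or the existing web channel) and calls the matching motor routine. A hedged sketch, with stub functions standing in for Ro's real drive code:

```python
# Sketch: Pi-side dispatcher from incoming command strings to motor routines.
# The motor functions are stubs; on the real robot they would wrap the
# GPIO/motor-driver calls (e.g. via gpiozero's Robot class).

def forward():
    return "driving forward"

def backward():
    return "driving backward"

def stop():
    return "stopped"

DISPATCH = {"forward": forward, "backward": backward, "stop": stop}

def handle(command):
    """Run the handler for a command, defaulting to stop() on anything unknown."""
    return DISPATCH.get(command, stop)()
```

Defaulting unknown input to `stop()` is a deliberate safety choice: if the headset-to-Pi link sends garbage, the robot halts rather than carrying on with its last command.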