The NEURALEX story
Inspiration
Have you ever wanted to turn off your lights with just your thoughts? So did we! Our team wanted to explore how Brain-Computer Interface (BCI) technology could be applied to solve real-world problems.
We identified a problem: individuals with physical or vocal impairments struggle to interact with IoT devices. To address this, we developed a prototype that records brain waves in real time and predicts the user's intended outcome.
What it does
NEURALEX uses a BCI headset with 4 electrodes placed across your scalp to measure and record brain waves in real time. Our software then processes the recorded data and uses our machine learning model to predict your intended action. Here’s a breakdown of how NEURALEX works:
- Users record their brainwaves using a BCI headset.
- NEURALEX processes that data and allows users to visualise their brainwaves.
- NEURALEX’s machine learning model uses the recorded data to make a prediction.
- Predictions can then be mapped to specific outcomes. In our prototype, we used the action of concentrating to turn on an LED light.
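The steps above can be sketched in miniature. This is a hypothetical illustration, not the actual NEURALEX code: the names (`classify_window`, `CONCENTRATION_THRESHOLD`) and the simple amplitude-based feature are our assumptions standing in for the real model.

```python
# Hypothetical sketch of the NEURALEX pipeline: window the raw EEG samples,
# extract a simple feature per channel, and map the prediction to an action.
from statistics import mean

CONCENTRATION_THRESHOLD = 50.0  # illustrative threshold, not a real calibration


def extract_features(window):
    """Mean absolute amplitude per channel for one window of samples.

    `window` is a list of samples; each sample is a list of 4 channel
    values (one per Muse 2 electrode).
    """
    n_channels = len(window[0])
    return [mean(abs(sample[ch]) for sample in window) for ch in range(n_channels)]


def classify_window(window):
    """Return 'concentrate' or 'rest' for one window of EEG samples."""
    features = extract_features(window)
    return "concentrate" if mean(features) > CONCENTRATION_THRESHOLD else "rest"


# Example: a low-amplitude window vs. a high-amplitude one.
quiet = [[1.0, -2.0, 1.5, -1.0] for _ in range(256)]
active = [[80.0, -75.0, 90.0, -60.0] for _ in range(256)]
print(classify_window(quiet))   # rest
print(classify_window(active))  # concentrate
```

In the real system the feature extraction and model live in the backend notebooks; the point here is only the shape of the loop: window, featurize, predict, act.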
How we built it
Hardware:
- A Muse 2 headset was used to measure and record brainwaves and stream them to our backend.
- An Arduino Uno was used to prototype an IoT device: a simple circuit connecting an LED light to our solution. When the Arduino receives a signal that the user is concentrating, the LED lights up; otherwise, it turns off.
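On the backend side, the signal to the Arduino can be as simple as one byte written over serial. The sketch below is an assumption about how that handoff might look (the port name, baud rate, and command bytes are illustrative; the real wiring and Arduino sketch may differ):

```python
# Hypothetical sketch of the LED control path: encode the model's
# prediction as a one-byte command and write it to the Arduino over serial.


def led_command(prediction):
    """Map a prediction label to the command byte the Arduino would expect."""
    return b"1" if prediction == "concentrate" else b"0"


def send_prediction(prediction, port="/dev/ttyACM0", baud=9600):
    """Write the command byte to the Arduino (requires pyserial)."""
    import serial  # third-party dependency; only needed for the actual write

    with serial.Serial(port, baud, timeout=1) as conn:
        conn.write(led_command(prediction))


print(led_command("concentrate"))  # b'1'
print(led_command("rest"))         # b'0'
```

The Arduino side then only has to read one byte per loop iteration and drive the LED pin accordingly.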
Backend: The backend consists of three Python notebooks used to collect data (data-collection.ipynb), process data (data-processor.ipynb), and train our machine learning model to predict the user’s intention (data-modeling.ipynb). A helper.py file contains class definitions that streamline and modularize the data collection, processing, and modeling steps. The trained model is then used in the demo to make predictions on live data. Note that we ran into issues with the library used to stream data from local applications, so the uploaded version of the demo has data recording disabled. In the product demo video, however, our local demo did use the user’s brain waves to light up the LED.
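To give a flavour of the modular helpers described above, here is a small sketch of the kind of class helper.py might contain. The class name, parameters, and windowing logic are our assumptions for illustration; the actual helper.py may be organised differently.

```python
# Hypothetical helper in the spirit of helper.py: one reusable class that
# slices a raw recording into overlapping fixed-size windows, so the
# collection, processing, and modeling notebooks share one implementation.


class DataProcessor:
    def __init__(self, window_size=256, step=128):
        self.window_size = window_size
        self.step = step  # stride between consecutive windows (overlap = window_size - step)

    def windows(self, samples):
        """Yield overlapping windows from a list of multi-channel samples."""
        for start in range(0, len(samples) - self.window_size + 1, self.step):
            yield samples[start:start + self.window_size]


# Usage: an 8-sample, 4-channel recording split into windows of 4 with stride 2.
processor = DataProcessor(window_size=4, step=2)
recording = [[i] * 4 for i in range(8)]
print(len(list(processor.windows(recording))))  # 3
```

Keeping the windowing in one class is what lets the three notebooks stay thin: each one imports the helper instead of re-implementing the preprocessing.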
Frontend: For the front end, we used Streamlit to quickly build a hostable data application. Data is accessed through the same GitHub repository used to host the application. Plots, text, buttons, and all other user interface elements are provided by the Streamlit library. Note that the version hosted on Streamlit removes the recording functionality due to an issue with the associated library.
Challenges we ran into
Showcasing our solution: We found it difficult to make a hardware-based solution interactive for others, given its physical requirements.
Hosting: We found it difficult to host the web app on Streamlit, as it was our first time doing so.
Accomplishments that we're proud of
Model accuracy: We achieved over 80% accuracy with our machine learning model. We are proud that our solution can distinguish between concentrating and doing nothing.
Successful prototype: We built a fully functioning hardware prototype. It was really satisfying to turn on the LED light just by concentrating on it.
What we learned
Writing modular code: We took care to write modular code that can be extended more easily in the future. This was new to us, and it was great to learn how to do it.
Insights into designing BCI experiments: While testing the model with live predictions, we were unsure whether the model was inaccurate or our brain activity was inconsistent. It turned out the model kept predicting concentration because the tester was focusing too hard on doing nothing. In the future, we should define distinct actions more clearly.
Hosting on Streamlit: This was our first time using the library to host an application. It produced some hair-tearing errors, which we had to work around with an alternative solution.
What's next for NEURALEX
Customer interviews and live user testing to find out how we can better tailor this solution to empower users.
Future extensions:
Monitoring and early detection of neurological disorders: with a BCI headset recording brainwaves, users could monitor and track their brain activity over time, and doctors could identify and manage neurological disorders like dementia earlier.
Continuous authentication: each individual has unique brainwaves. With enough data, it may be possible to create a more secure form of biometric authentication, one where users must wear a headset to access sensitive information.