Inspiration

I was inspired by how people with limited mobility struggle to interact with devices. Watching my grandmother face health challenges and seeing friends with accessibility needs pushed me to explore brain and heart signals as a way to control technology without touch. The goal was simple: make everyday interaction easier, more dignified, and stress‑free.

What it does

NeuroSynapseOS takes brainwave (EEG) and heart signal (ECG) data, fuses them together, and turns them into commands that control Android devices without touch. It can also detect stress levels from the signals, making everyday interaction easier and safer for people with accessibility needs. Right now it runs on simulated data, but the concept is built for real‑world assistive tech.
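To make the idea concrete, here is a minimal sketch of how fused EEG/ECG features could map to no‑touch commands. The feature names, thresholds, and command strings are illustrative assumptions, not the actual NeuroSynapseOS implementation:

```python
# Hypothetical sketch: map one EEG feature and one ECG feature to a
# device command. All thresholds and command names are illustrative.

def fuse_to_command(eeg_alpha_power: float, heart_rate_bpm: float) -> str:
    """Fuse a simple EEG/ECG feature pair into a no-touch command.

    eeg_alpha_power: relative alpha-band power (0..1), higher when relaxed
    heart_rate_bpm:  mean heart rate from the ECG stream
    """
    if heart_rate_bpm > 110:      # elevated heart rate -> treat as stress, pause input
        return "PAUSE_INPUT"
    if eeg_alpha_power > 0.6:     # relaxed, focused state -> confirm the action
        return "SELECT"
    if eeg_alpha_power < 0.3:     # low alpha -> move focus to the next item
        return "SCROLL_NEXT"
    return "IDLE"

print(fuse_to_command(0.7, 72))   # SELECT
print(fuse_to_command(0.2, 72))   # SCROLL_NEXT
print(fuse_to_command(0.5, 120))  # PAUSE_INPUT
```

In a real system the command strings would be forwarded to the Android side, for example through an accessibility service that performs the corresponding gesture.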

How we built it

I started by setting up a simulated environment using open EEG and ECG datasets. These signals were fused in Python notebooks to mimic combined brain and heart activity. I connected the output to Android controls through lightweight APIs, showing how a phone could be operated without touch. Stress detection was added by analyzing ECG patterns. The whole system was built modularly, so future hardware integration can plug in easily. GitHub was used for version control, and the demo runs on simulated data to keep it accessible and judge‑friendly.
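The stress-detection step described above can be sketched with a standard heart-rate-variability measure. This example computes RMSSD from simulated RR intervals and flags low variability as stress; the 30 ms threshold and the sample intervals are illustrative assumptions, not values from the project:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR intervals,
    a common time-domain heart-rate-variability (HRV) measure."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def is_stressed(rr_intervals_ms, threshold_ms=30.0):
    # Low beat-to-beat variability is a widely used stress indicator.
    # The 30 ms cutoff here is a hypothetical placeholder.
    return rmssd(rr_intervals_ms) < threshold_ms

calm = [820, 860, 790, 845, 805, 870]      # varied intervals -> high HRV
tense = [612, 610, 615, 611, 613, 612]     # near-constant intervals -> low HRV

print(is_stressed(calm))   # False
print(is_stressed(tense))  # True
```

A real pipeline would first detect R-peaks in the raw ECG to obtain the RR intervals; this sketch starts from those intervals directly to keep the idea clear.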

Challenges we ran into

I faced countless challenges while building this project. I was working on an i3 Windows laptop with limited power, using only a phone hotspot at home without broadband. The system kept hanging and overheating, and I failed nearly 30 times before getting it right. Uploading files, running notebooks, and keeping everything stable was a constant struggle. But I never gave up: every failure taught me something new, and step by step I pushed through to make the demo work.

Accomplishments that we're proud of

Despite limited hardware, no broadband, and nearly 30 failed attempts, I managed to build a working demo that fuses EEG and ECG signals for no‑touch Android control and stress detection. I’m proud that the system runs on an i3 laptop with just a phone hotspot, proving that innovation doesn’t need expensive resources. Most importantly, I showed how simulated neurodata can be turned into real accessibility solutions, and that persistence can turn repeated failures into a working project.

What we learned

I learned that persistence matters more than perfect resources. Working alone on an i3 Windows laptop with only a phone hotspot, I failed nearly 30 times, but each failure taught me something new. I discovered how simulated EEG and ECG data can still demonstrate real accessibility impact, and how modular design makes future hardware integration possible. Most importantly, I learned that even with limited tools, determination can turn repeated struggles into a working project.

What's next for NeuroSynapseOS

My next step is to move from simulated data to real EEG and ECG hardware integration, so the system can be tested in real‑world scenarios. I plan to refine the Android control layer to make it smoother and more responsive. Stress detection will be improved with advanced signal analysis, and I want to connect with accessibility communities to validate the impact. Long term, my vision is to grow NeuroSynapseOS into a full assistive platform that helps people with limited mobility interact with technology with dignity and independence.

Built With
