About the project

Inspiration

Accurately measuring a child’s attention level in clinical or educational settings is challenging. Traditional approaches often rely on behavioral observation or questionnaires, which can be subjective and influenced by mood, environment, or anxiety. We wanted to create a tool that helps clinicians objectively assess focus through neurofeedback, but in a way that feels natural and engaging for children.

With NeuroDash, we reimagined focus measurement as play. Children “drive” a virtual car using their brain activity instead of a controller, transforming what could be a stressful test into an interactive, enjoyable experience. As focus increases, so does the car’s speed, allowing clinicians and caregivers to visualize attention in real time through gameplay. Instead of relying on a static beta-power threshold, NeuroDash now incorporates a trained EEGNet4Ch model to predict focus directly from raw EEG input, making the system more robust to artifacts such as blinks or shifts in attention.

What it does

NeuroDash is a neuroadaptive focus-training platform that merges EEG-based attention measurement with interactive gameplay:

  • The OpenBCI Ganglion headset captures EEG signals from the player.
  • Real-time signal processing applies either traditional beta/alpha computations or the EEGNet4Ch model to calculate a focus score.
  • This focus score directly controls the speed of a car in a Pygame environment — the more focused the user, the faster the car moves.
  • Clinicians can view live data trends, including focus intensity and session metrics, via a separate Streamlit dashboard.

Together, these components form a closed-loop system that encourages sustained attention and makes neurofeedback training engaging and measurable.
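The core of the loop is mapping a focus score to car speed. A minimal sketch of that mapping is below; the function names, the [0, 1] score range, and the speed bounds are illustrative assumptions, not NeuroDash's actual API:

```python
def focus_to_speed(focus, min_speed=2.0, max_speed=12.0):
    """Linearly map a focus score in [0, 1] to a car speed (units/frame).

    Hypothetical bounds: even an unfocused player keeps moving slowly.
    """
    focus = max(0.0, min(1.0, focus))  # clamp out-of-range scores
    return min_speed + focus * (max_speed - min_speed)

class FocusSmoother:
    """Exponential moving average so speed changes don't feel jittery."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # higher alpha = faster reaction, more jitter
        self.value = None

    def update(self, focus):
        if self.value is None:
            self.value = focus  # seed with the first reading
        else:
            self.value = self.alpha * focus + (1 - self.alpha) * self.value
        return self.value
```

Smoothing the raw score before mapping it keeps the game responsive without punishing momentary dips in attention.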

How we built it

We built a two-part system integrating hardware, software, and visualization tools:

  • Hardware: OpenBCI Ganglion Board (4-channel EEG headset), Raspberry Pi
  • Backend: Python + OpenBCI SDK + NumPy/SciPy for signal filtering and focus computation (beta/alpha ratio)
  • Communication: Real-time UDP socket streaming between the EEG processor, game, and dashboard
  • Frontend (Patient): Pygame — a gamified race controlled by focus intensity
  • Frontend (Clinician): Streamlit — a live monitoring dashboard displaying EEG metrics and trends

This setup allows EEG data to be processed, visualized, and applied to gameplay mechanics simultaneously and with low latency.
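The UDP link between the EEG processor, game, and dashboard could look something like the sketch below, which streams each focus score as a small JSON datagram. The wire format and port choice here are illustrative assumptions, not the project's exact protocol:

```python
import json
import socket

def send_focus(sock, addr, focus, timestamp):
    """Serialize one focus reading as JSON and send it as a UDP datagram."""
    payload = json.dumps({"focus": focus, "t": timestamp}).encode("utf-8")
    sock.sendto(payload, addr)

def recv_focus(sock):
    """Receive and decode one focus reading."""
    data, _src = sock.recvfrom(1024)
    msg = json.loads(data.decode("utf-8"))
    return msg["focus"], msg["t"]

if __name__ == "__main__":
    # Loopback demo: the processor (tx) streams to the game (rx).
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", 0))  # OS picks a free port
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_focus(tx, rx.getsockname(), 0.73, 12.5)
    print(recv_focus(rx))  # prints (0.73, 12.5)
    rx.close(); tx.close()
```

UDP suits this loop because a lost packet is harmless: the next focus reading arrives a fraction of a second later, and low latency matters more than delivery guarantees.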

Challenges we faced

Signal Processing & Calibration

  • Personalized Thresholds: Each individual's baseline brain activity is unique, requiring adaptive calibration per user.
  • Noise Reduction: Movement and muscle artifacts distort EEG signals during gameplay, necessitating real-time filtering.
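For the traditional pipeline, the beta/alpha focus score can be sketched roughly as below, assuming the Ganglion's 200 Hz sample rate and conventional band edges (alpha 8–12 Hz, beta 13–30 Hz); the exact parameters NeuroDash uses may differ:

```python
import numpy as np
from scipy.signal import welch

FS = 200  # Ganglion sample rate in Hz (assumption based on device default)

def band_power(freqs, psd, lo, hi):
    """Sum the PSD bins falling inside a frequency band."""
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].sum()

def focus_score(eeg_window):
    """Beta/alpha power ratio over one window of single-channel EEG."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=min(256, len(eeg_window)))
    alpha = band_power(freqs, psd, 8, 12)
    beta = band_power(freqs, psd, 13, 30)
    return beta / (alpha + 1e-12)  # guard against division by zero
```

A ratio above the user's calibrated baseline suggests relatively more beta activity, which the literature associates with active concentration; the personalized-threshold problem above is exactly about where to set that baseline per user.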

Infrastructure & Networking

  • Network Restrictions: School WiFi blocked SSH connections to the Raspberry Pi. Solution: Implemented Tailscale VPN to bridge networks — Pi on mobile hotspot and laptop on school WiFi.
  • LSL Stream Connectivity: LSL requires devices on the same physical network. Solution: Used UDP forwarding through VPN to enable real-time EEG data streaming between laptop and Pi.
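The UDP-forwarding workaround can be pictured as a tiny relay that rebroadcasts each datagram to a fixed destination (e.g. the Pi's Tailscale address). This is a hedged sketch of the pattern, not the project's actual script; the addresses are placeholders:

```python
import socket

def relay_once(listen_sock, out_sock, dst_addr, bufsize=4096):
    """Receive one datagram and forward it unchanged to dst_addr."""
    data, _src = listen_sock.recvfrom(bufsize)
    out_sock.sendto(data, dst_addr)
    return data

# In deployment this would run in a loop, forwarding the EEG stream
# across the VPN, e.g.:
#   while True:
#       relay_once(listen_sock, out_sock, ("100.x.y.z", 5005))
```

Because the relay never inspects the payload, it works for any stream the VPN can carry, sidestepping LSL's same-network discovery requirement.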

System Integration

  • Real-Time Synchronization: Optimized low-latency data transmission between the EEG device, processing server, and game loop.
  • Raspberry Pi Deployment: Resolved compatibility issues through virtual environment setup and custom launch scripts.

User Experience Design

  • Dual-Interface Architecture: Balancing engaging gameplay for children with data-rich monitoring for clinicians.

Accomplishments that we're proud of

  • Integrated live OpenBCI EEG data into an interactive Python game.
  • Implemented a real-time focus computation pipeline using both beta/alpha ratio and a machine learning model.
  • Built a clinician dashboard for live EEG visualization and tracking.
  • Created a measurable neurofeedback experience that combines neuroscience, gaming, and therapy.

What we learned

  • How EEG frequency bands correlate with attention and relaxation.
  • The importance of preprocessing (filtering, normalization, windowing) for reliable focus detection.
  • Real-time data streaming and feedback design for dual audiences.
  • Leveraging machine learning to personalize focus prediction.
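The windowing and normalization steps mentioned above can be sketched briefly: chop the continuous stream into overlapping windows and z-score each one against a per-user resting baseline. The window and step sizes below are illustrative, not the values NeuroDash ships with:

```python
import numpy as np

def sliding_windows(signal, win=400, step=100):
    """Yield overlapping windows, e.g. 2 s windows every 0.5 s at 200 Hz."""
    for start in range(0, len(signal) - win + 1, step):
        yield signal[start:start + win]

def zscore(window, baseline_mean, baseline_std):
    """Normalize a window against a per-user calibration baseline."""
    return (window - baseline_mean) / (baseline_std + 1e-12)
```

Overlapping windows keep feedback frequent without shortening the analysis window, and baseline normalization is one simple way to address the per-user calibration challenge noted earlier.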

What's next for NeuroDash

  • Personalized Calibration: Adaptive ML models to predict focus more accurately per user.
  • Clinical Trials: Collaborate with therapists to evaluate attention improvements over time.
  • Multi-Modal Feedback: Visual, auditory, and haptic feedback for deeper engagement.
  • Longitudinal Tracking: Session logging for progress analysis and treatment planning.
  • Scalability: Expand to tablet or VR formats for home or clinic use.
