Inspiration
Focus is becoming an endangered skill. With the world's knowledge and endless distractions in our pockets, maintaining deep focus feels nearly impossible. Over 6 million children in the U.S. have been diagnosed with ADHD, and millions of adults struggle with attention regulation every day. Frequently cited studies put the average human attention span at just 8 seconds, and modern environments keep making it worse.
Most productivity tools fail because they rely on timers, self-reporting, or checklists, all of which assume the user is already motivated and self-aware. We asked ourselves: “How do we truly know if we’re focused?”
That question inspired us to build a system that measures focus directly from the brain, not just from behavior.
What it does
Our project is a brainwave-powered productivity tool that uses EEG signals to measure attention in real time.
- Continuously monitors neural activity through a Muse EEG headband.
- Processes EEG signals using a custom machine learning pipeline.
- Classifies brain states as focused or distracted with high accuracy.
- Triggers alerts when sustained unfocused states are detected, filtering out transient noise rather than reacting to every blip.
- Provides real-time dashboards and historical analytics, including focus duration, distraction patterns, and cognitive performance trends.
How we built it
We started by connecting the EEG headband over Bluetooth and streaming raw brainwave data through the Lab Streaming Layer (LSL). Using muselsl, we captured data from all five channels (AF7, AF8, TP9, TP10, AUX) at 256 Hz.
From there, we built a custom signal processing pipeline. We applied bandpass and notch filters to clean the signals, removed artifacts like blinks and motion, and extracted meaningful features such as band power in Alpha, Beta, Theta, Delta, and Gamma ranges. These features were then organized into 6-second sliding windows to make them suitable for classification.
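A minimal sketch of this preprocessing stage, using SciPy and NumPy. The exact filter orders, cutoffs (1–45 Hz bandpass, 60 Hz notch), and 1-second window hop are illustrative assumptions, not the project's actual parameters:

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt, welch

FS = 256  # Muse sampling rate in Hz
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def clean(signal, fs=FS):
    """Bandpass 1-45 Hz, then notch out 60 Hz mains interference."""
    b, a = butter(4, [1, 45], btype="bandpass", fs=fs)
    x = filtfilt(b, a, signal)
    b, a = iirnotch(60.0, Q=30.0, fs=fs)
    return filtfilt(b, a, x)

def band_powers(window, fs=FS):
    """Mean PSD power in each canonical EEG band for one window."""
    freqs, psd = welch(window, fs=fs, nperseg=fs * 2)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

def sliding_windows(signal, fs=FS, win_s=6, step_s=1):
    """Yield 6-second windows; the 1-second hop is an assumption."""
    win, step = win_s * fs, step_s * fs
    for start in range(0, len(signal) - win + 1, step):
        yield signal[start:start + win]
```

Each window then collapses to one feature vector (five band powers per channel) that the classifier consumes.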
On the ML side, we trained a HemiAttentionLSTM, a bidirectional LSTM with a cross-hemispherical attention mechanism. The idea was to model how the left and right hemispheres interact during focus. Our network used 128 hidden units, 2 layers, and dropout for regularization. To handle class imbalance between focused and distracted states, we used focal loss and a WeightedRandomSampler. For comparison, we also implemented a Random Forest classifier as a simpler baseline.
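A hedged PyTorch sketch of what such an architecture could look like: one BiLSTM per hemisphere with cross-attention between them, plus the focal loss. Layer names, the attention head count, feature dimensions, and pooling choices here are our illustrative assumptions, not the team's exact implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HemiAttentionLSTM(nn.Module):
    """Sketch: separate BiLSTMs per hemisphere, cross-hemispheric attention."""
    def __init__(self, feats_per_hemi=5, hidden=128, layers=2, classes=2):
        super().__init__()
        kw = dict(hidden_size=hidden, num_layers=layers, batch_first=True,
                  bidirectional=True, dropout=0.3)
        self.left = nn.LSTM(input_size=feats_per_hemi, **kw)
        self.right = nn.LSTM(input_size=feats_per_hemi, **kw)
        # Cross-hemispheric attention: each side attends over the other.
        self.attn = nn.MultiheadAttention(embed_dim=2 * hidden, num_heads=4,
                                          batch_first=True)
        self.head = nn.Linear(4 * hidden, classes)

    def forward(self, left_x, right_x):
        l, _ = self.left(left_x)      # (batch, time, 2*hidden)
        r, _ = self.right(right_x)
        l2r, _ = self.attn(l, r, r)   # left queries attend over right
        r2l, _ = self.attn(r, l, l)   # right queries attend over left
        pooled = torch.cat([l2r.mean(dim=1), r2l.mean(dim=1)], dim=-1)
        return self.head(pooled)

def focal_loss(logits, targets, gamma=2.0):
    """Focal loss: down-weights easy examples to fight class imbalance."""
    ce = F.cross_entropy(logits, targets, reduction="none")
    pt = torch.exp(-ce)  # model's probability for the true class
    return ((1 - pt) ** gamma * ce).mean()
```

During training, this would be paired with a `WeightedRandomSampler` so minority-class (distracted) windows are drawn more often.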
The backend was built with FastAPI, running an asynchronous WebSocket server to stream live predictions. We built a Next.js dashboard to visualize focus in real time. The frontend connects to the backend over WebSockets and displays all insights and graphs.
Challenges we ran into
- Creating our own dataset
- Handling EEG signal noise from blinks and motion artifacts.
- Training on highly imbalanced data (longer focused states vs. shorter distractions).
Accomplishments that we're proud of
- Successfully integrated real-time EEG classification with live WebSocket streaming.
- Developed a novel hemisphere-aware attention model for focus detection.
What we learned
- How to preprocess and extract meaningful features from raw EEG signals.
- The importance of handling class imbalance in physiological datasets.
What's next for Untitled
- Expand beyond EEG to include multimodal sensing
- Improve the ML pipeline with larger datasets and transfer learning
Built With
- eeg
- fastapi
- lsl
- muselsl
- next.js
- pylsl
- pytorch
- scikit-learn
- uvicorn
- websockets
