Inspiration
Fatigue is one of the most debilitating symptoms of Multiple Sclerosis, yet it remains hard to quantify and manage. My research on neurogaming and EEG-based fatigue biomarkers inspired me to create a demo that makes brain states visible in real time. By turning raw EEG into 3D art, the project explores how neurofeedback can empower people to better understand and regulate their fatigue.
What it does
The system reads EEG signals from a DIY neuroscience kit and computes band power for:
- theta (4–7 Hz)
- alpha (8–12 Hz)
- SMR (12–15 Hz)
- beta (13–30 Hz)
From these bands it derives two indices:
- Focus/Relax ratio
\( \text{ratio} = \tfrac{\beta}{\alpha + \beta} \)
- Fatigue index
\( \text{fatigueI} = \tfrac{\theta}{\text{SMR}} \)
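The band powers and both indices can be sketched with SciPy's Welch estimator; the 250 Hz sample rate, window length, and variable names here are assumptions for illustration, not the project's exact parameters:

```python
# Sketch: band power and indices from one EEG window (sample rate is an assumption)
import numpy as np
from scipy.signal import welch

FS = 250  # assumed sample rate of the DIY kit, in Hz

def band_power(eeg, lo, hi, fs=FS):
    """Mean power spectral density in [lo, hi] Hz via Welch's method."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

eeg = np.random.randn(FS * 4)        # stand-in for a 4-second EEG window

theta = band_power(eeg, 4, 7)
alpha = band_power(eeg, 8, 12)
smr = band_power(eeg, 12, 15)
beta = band_power(eeg, 13, 30)

ratio = beta / (alpha + beta)        # Focus/Relax ratio, in (0, 1)
fatigueI = theta / smr               # Fatigue index
```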
These values are streamed via OSC into TouchDesigner, where a 3D noise surface shifts in amplitude, speed, and color to reflect the brain’s state:
- relax
- focus
- fatigue
- neutral
The result is an immersive, adaptive visualization of mental performance and fatigue.
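The mapping from the two indices to a display state can be sketched as a simple threshold cascade; the threshold values below are illustrative assumptions, not the project's tuned numbers:

```python
# Sketch: classify the visualization state from the two indices
# (focus_th, relax_th, fatigue_th are assumed values, not the project's)
def classify(ratio, fatigueI, focus_th=0.6, relax_th=0.4, fatigue_th=1.5):
    if fatigueI > fatigue_th:   # fatigue takes priority over focus/relax
        return "fatigue"
    if ratio > focus_th:
        return "focus"
    if ratio < relax_th:
        return "relax"
    return "neutral"
```

For example, `classify(0.7, 0.8)` falls in the focus band, while a high theta/SMR value overrides it and reports fatigue.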
How we built it
```python
# Example of real-time EEG streaming
from pythonosc.udp_client import SimpleUDPClient

osc = SimpleUDPClient("127.0.0.1", 7000)  # TouchDesigner's OSC In port
osc.send_message("/brain/ratio", ratio)
osc.send_message("/brain/fatigueI", fatigueI)
osc.send_message("/brain/state_str", state)
```
- Arduino + Python: captured EEG packets, computed band power with SciPy filters.
- OSC streaming: used python-osc to send real-time metrics to TouchDesigner.
- TouchDesigner: created a 3D noise-based visualization, where amplitude = focus ratio, animation speed = arousal, and color overlay = fatigue detection.
- Closed-loop logic: thresholds adapt live to keep the system responsive during short demo sessions.
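The adaptive-threshold idea can be sketched as an exponential moving average baseline; the smoothing factor and margin below are assumptions, not the values used in the demo:

```python
# Sketch: a threshold that adapts to a running baseline via an EMA
# (alpha and margin are assumed values, not the project's tuned numbers)
class AdaptiveThreshold:
    def __init__(self, alpha=0.05, margin=1.2):
        self.alpha = alpha      # EMA smoothing factor
        self.margin = margin    # trigger when value exceeds margin * baseline
        self.baseline = None

    def update(self, value):
        """Fold the new value into the baseline, then test it against it."""
        if self.baseline is None:
            self.baseline = value
        else:
            self.baseline = (1 - self.alpha) * self.baseline + self.alpha * value
        return value > self.margin * self.baseline
```

Because the baseline keeps tracking the session, the trigger stays responsive even when a participant's resting levels drift during a short demo.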
Challenges we ran into
- Setting up stable serial communication and avoiding Windows COM port crashes.
- Learning TouchDesigner’s SOP/MAT pipeline under hackathon time pressure.
- Balancing realism with simplicity: adding fatigue detection without overfitting noisy EEG data.
Accomplishments we’re proud of
- Built a working real-time brain-to-visualization pipeline in <48 hours.
- Extended the code to detect fatigue in addition to focus/relax.
- Delivered an intuitive 3D demo that makes invisible brain states tangible.
What we learned
- How to stream physiological data into a visual engine like TouchDesigner.
- The value of starting with minimal but meaningful features (alpha/beta ratio, theta/SMR).
- That closed-loop neurofeedback can be prototyped with DIY tools in a hackathon setting.
What’s next
- Refine fatigue detection with more robust biomarkers.
- Integrate VR or gaming environments for engaging training scenarios.
- Collaborate with clinicians to test the prototype in MS fatigue management.
