AXL: Neurokinetic Mobility Interface System
Bridging minds and machines
Inspiration
Our project was inspired by the neurotransmitter headband from Big Hero 6, the device that allows Hiro to control his microbots through thought alone. We were captivated by this concept and wanted to create a real-world version that could help people with mobility challenges control devices using only their thoughts.
The growing field of brain-computer interfaces (BCIs) is making tremendous strides in assistive technology, and we wanted to contribute to this important work by creating an accessible, modular system that researchers and developers could build upon.
What it does
AXL is a comprehensive Brain-Computer Interface (BCI) system designed for real-time EEG signal processing and device control. The system:
- Processes EEG signals in real-time with advanced filtering and artifact removal
- Classifies mental commands using machine learning (motor imagery detection)
- Translates classified signals into device control commands
- Provides visualization tools for EEG signals and classification results
- Includes a training module for collecting data and improving classifier accuracy
AXL can be used to control external devices through thought alone, providing a foundation for assistive technology applications.
How we built it
We built AXL as a modular Python-based system with several key components:
- Acquisition Module: Interfaces with OpenBCI hardware or simulated data sources
- Processing Pipeline:
  - Signal filtering (bandpass, notch filters)
  - Artifact removal (ICA-based)
  - Feature extraction (spectral power, wavelets)
- Classification Engine:
  - Multiple ML algorithms (LDA, SVM, Random Forest)
  - State management for robust command detection
- Control Interface:
  - Serial communication with external devices
  - Command translation system
- Visualization Tools:
  - Real-time signal display
  - Spectral analysis visualization
- Training System:
  - Data collection with prompts
  - Cross-validation for classifier evaluation
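The filtering, feature-extraction, and classification stages above can be sketched end to end with scipy and scikit-learn (both in our stack). The band choices, window length, and synthetic training data below are illustrative choices for the sketch, not AXL's actual parameters:

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt, welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250  # Hz; a typical OpenBCI sample rate (assumption)

def preprocess(eeg, fs=FS):
    """Bandpass 8-30 Hz (mu/beta range) and notch out 50 Hz mains noise."""
    b, a = butter(4, [8, 30], btype="bandpass", fs=fs)
    eeg = filtfilt(b, a, eeg, axis=-1)
    b, a = iirnotch(50, Q=30, fs=fs)
    return filtfilt(b, a, eeg, axis=-1)

def band_power(eeg, fs=FS, band=(8, 13)):
    """Mean spectral power per channel within the given frequency band."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[..., mask].mean(axis=-1)

def extract_features(window):
    """Concatenate mu-band and beta-band power across channels."""
    return np.concatenate([band_power(window, band=(8, 13)),
                           band_power(window, band=(13, 30))])

# Train LDA on labelled 1-second windows; random data stands in for
# real recordings here (shape: n_trials x n_channels x n_samples).
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((40, 8, FS))
y = rng.integers(0, 2, 40)  # 0 = rest, 1 = imagined movement
X = np.array([extract_features(preprocess(w)) for w in X_raw])
clf = LinearDiscriminantAnalysis().fit(X, y)
print(clf.predict(X[:5]))
```

In the real system the classified label then passes through the state manager and is translated into a serial command for the target device.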
Challenges we ran into
- Signal Quality: EEG signals are extremely noisy, requiring sophisticated filtering and artifact removal techniques.
- Real-time Processing: Balancing processing complexity with the need for real-time performance was challenging.
- Classification Accuracy: Motor imagery detection is difficult: different people imagine movements differently, so the classifier needs personalized training.
- Hardware Integration: Working with the OpenBCI hardware presented several technical challenges with data acquisition rates and connectivity.
- User Experience: Designing a training protocol that is intuitive enough not to frustrate users, yet still collects quality data, took several iterations.
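One way to frame the real-time trade-off above: classify short overlapping windows drawn from a ring buffer, so latency is bounded by the hop interval rather than the window length. A simplified single-channel sketch (the window and hop sizes are our illustrative choices, not AXL's tuned values):

```python
from collections import deque
import numpy as np

FS = 250           # sample rate in Hz (assumption)
WINDOW = FS        # 1 s analysis window
HOP = FS // 10     # run the classifier every 100 ms

buffer = deque(maxlen=WINDOW)  # ring buffer of the most recent samples

def on_samples(chunk, classify):
    """Feed a chunk from the acquisition thread; run `classify` whenever
    the window is full and a hop's worth of new samples has arrived."""
    results = []
    new = 0
    for sample in chunk:
        buffer.append(sample)
        new += 1
        if len(buffer) == WINDOW and new >= HOP:
            results.append(classify(np.array(buffer)))
            new = 0
    return results

# Toy stand-in classifier: threshold on mean amplitude.
out = on_samples(np.random.randn(FS * 2),
                 classify=lambda w: int(w.mean() > 0))
print(len(out))  # one decision per hop once the window fills
```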
Accomplishments that we're proud of
- Real-time Processing: Achieved <50ms latency from thought to command, critical for natural interaction.
- Classification Accuracy: Reached 85% accuracy in motor imagery classification after personalized training.
- Modular Architecture: Created a flexible system that can easily integrate with various data sources and output devices.
- Training Protocol: Developed an efficient training protocol that can build a working classifier in under 10 minutes.
- Simulated Testing Environment: Built a complete simulation mode for development without requiring EEG hardware.
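The simulation mode can be approximated by synthesizing an EEG-like signal: broadband noise plus a mu rhythm whose amplitude drops during imagined movement (event-related desynchronization). This is a hypothetical sketch, not AXL's actual simulator:

```python
import numpy as np

FS = 250  # sample rate in Hz (assumption)

def simulated_eeg(seconds, imagery=False, fs=FS, rng=None):
    """One channel of synthetic EEG: noise plus a 10 Hz mu rhythm
    that is suppressed when `imagery` is True (ERD)."""
    rng = rng or np.random.default_rng()
    t = np.arange(int(seconds * fs)) / fs
    noise = rng.standard_normal(t.size)
    mu_amp = 0.5 if imagery else 2.0  # illustrative amplitudes
    return noise + mu_amp * np.sin(2 * np.pi * 10 * t)

rest = simulated_eeg(2, imagery=False, rng=np.random.default_rng(1))
move = simulated_eeg(2, imagery=True, rng=np.random.default_rng(1))
# Mu power should drop in the imagery condition.
print(rest.var() > move.var())  # → True
```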
What we learned
- Neurophysiology: Deepened our understanding of how the brain produces detectable patterns during motor imagery.
- Signal Processing: Learned advanced techniques for real-time biosignal processing.
- User-Centered Design: Discovered the importance of designing systems that account for the cognitive load of BCI use.
- Machine Learning Pipeline: Improved our skills in building end-to-end ML pipelines for real-time classification.
- Cross-disciplinary Collaboration: Working at the intersection of neuroscience, computer science, and human-computer interaction taught us valuable lessons about cross-disciplinary development.
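One concrete lesson on the ML pipeline front: wrapping preprocessing and the classifier in a single scikit-learn Pipeline keeps training and real-time inference consistent and makes cross-validation a one-liner. A sketch with synthetic features standing in for real band-power data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 16))  # e.g. band-power features (illustrative)
y = rng.integers(0, 2, 60)         # rest vs imagined movement

# Scaling is fit inside each CV fold, so there is no train/test leakage.
pipe = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
scores = cross_val_score(pipe, X, y, cv=5)
print(scores.mean())
```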
What's next for AXL
- Expanded Device Support: Adding support for more consumer EEG devices beyond OpenBCI.
- Advanced Classification: Implementing deep learning methods for improved classification accuracy.
- Multi-command Support: Expanding beyond binary commands to enable more complex control schemes.
- Mobile Application: Developing a companion mobile app for easier system configuration and monitoring.
- Open Source Community: Building a community of developers to expand the capabilities and applications of AXL.
- Clinical Validation: Partnering with researchers to validate AXL in clinical settings for assistive technology.
Built With
- bluepy
- hdf5
- matplotlib
- mne-python
- numpy
- pandas
- pickle/joblib
- plotly
- pyopenbci
- pyserial
- pytest
- python-3.8+
- pywavelets
- scikit-learn
- scipy