Inspiration
Millions of people worldwide live with conditions like ALS, severe paralysis, and the after-effects of stroke, leaving them unable to speak or type. Traditional assistive communication devices can cost $7,000-$15,000, putting them out of reach for most of the people who need them.
We were inspired to create a solution that removes these barriers and makes communication accessible to everyone. The idea struck when we realized that almost everyone already has a tool powerful enough to detect eye movements: a standard webcam. If AI could translate blinks into speech, we could give people their voice back—instantly and for free.
What it does
EyeCode allows users to communicate using only their blinks. The system detects short and long blinks via a webcam, translates them into Morse code, converts the code into text, and then speaks it aloud using AI-powered text-to-speech.
With EyeCode, users can communicate independently, without expensive equipment, and with minimal setup. Whether for casual conversation or urgent communication, EyeCode provides a fast, reliable, and accessible solution for millions of people worldwide.
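The blink-to-text step described above boils down to standard Morse decoding. Here is a minimal sketch of that step; the table, separator conventions, and function name are illustrative, not EyeCode's actual code:

```python
# Minimal Morse-to-text decoder: "." = short blink, "-" = long blink.
# Assumed conventions: letters separated by spaces, words by "/".
MORSE_TO_CHAR = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode_morse(sequence: str) -> str:
    """Translate a blink-derived Morse sequence into text."""
    words = []
    for word in sequence.strip().split("/"):
        letters = [MORSE_TO_CHAR.get(code, "?") for code in word.split()]
        words.append("".join(letters))
    return " ".join(words)

print(decode_morse(".... .."))  # -> "HI"
```

Unrecognized codes fall back to "?" rather than raising, so a single misread blink does not discard the whole message.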
How we built it
To develop EyeCode, we combined computer vision, AI, and real-time web technologies:
- Computer Vision & ML: MediaPipe Face Mesh tracks the face at 30 FPS, calculating Eye Aspect Ratio (EAR) to detect blink durations. Short blinks are mapped to dots and long blinks to dashes in Morse code.
- Backend: Python 3.9+ with Flask handles video processing, Morse decoding, and TTS, using multiple threads to manage the video, web, and audio streams simultaneously.
- Voice: ElevenLabs TTS converts decoded text into natural speech in real time.
- Frontend: An HTML/CSS/JavaScript interface with MJPEG video streaming and real-time polling keeps the UI responsive.
- Customizability & Accessibility: 12+ tunable parameters allow users to adjust blink sensitivity, speech speed, and interface preferences.
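The core detection logic above combines the Eye Aspect Ratio with a frame-count timer per blink. A self-contained sketch of both pieces follows; the thresholds and frame counts are assumed examples of the kind of tunable parameters EyeCode exposes, and the landmark ordering follows the standard Soukupova & Cech EAR formulation rather than MediaPipe's raw indices:

```python
import math
from typing import Optional

def eye_aspect_ratio(pts) -> float:
    """EAR from six eye landmarks (p1..p6): horizontal corners p1/p4,
    vertical pairs p2/p6 and p3/p5. The ratio drops sharply when the
    eye closes, which is what makes blink detection possible."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = pts
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

# Illustrative values; real thresholds are user-tunable in EyeCode.
EAR_THRESHOLD = 0.21    # eye counts as closed below this EAR
DASH_MIN_FRAMES = 12    # ~0.4 s closed at 30 FPS -> dash
DOT_MIN_FRAMES = 2      # debounce: ignore one-frame dips

def classify_blink(closed_frames: int) -> Optional[str]:
    """Map a completed blink's closed-frame count to a Morse symbol."""
    if closed_frames >= DASH_MIN_FRAMES:
        return "-"
    if closed_frames >= DOT_MIN_FRAMES:
        return "."
    return None  # too brief; likely landmark jitter, not a blink
```

At 30 FPS the frame counter doubles as a clock, so no separate timing code is needed: the per-frame loop just increments the counter while EAR stays below the threshold and calls `classify_blink` when it rises again.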
Challenges we ran into
- Blink detection accuracy: Variability in lighting, webcam quality, and eye shape made detecting blinks reliably challenging. We solved this by tuning EAR thresholds and smoothing signals over multiple frames.
- Real-time processing: Streaming video while decoding Morse and generating speech required multi-threading and careful performance optimization.
- User-friendly interface: Ensuring EyeCode works both for technical and non-technical users required iterative UI testing and simplification of controls.
- Morse learning curve: Some users found Morse code difficult at first, so we added visual feedback and adjustable blink timing to help users adapt.
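The multi-frame smoothing mentioned in the first challenge can be as simple as a rolling average over the raw EAR signal before thresholding. This sketch (the window size and threshold are assumptions, not EyeCode's exact settings) shows how a single-frame dropout gets absorbed instead of registering as a blink:

```python
from collections import deque

class EARSmoother:
    """Rolling-average filter that suppresses single-frame EAR spikes
    caused by lighting flicker or landmark jitter."""
    def __init__(self, window: int = 3):
        self.buf = deque(maxlen=window)

    def update(self, ear: float) -> float:
        self.buf.append(ear)
        return sum(self.buf) / len(self.buf)

smoother = EARSmoother(window=3)
# A one-frame dropout to 0.05 stays above a 0.21 closed-eye threshold
# after smoothing, so it is not misread as a blink:
for raw in (0.30, 0.31, 0.05, 0.30):
    smoothed = smoother.update(raw)
```

The trade-off is a small added latency (about one window of frames) before a real blink is registered, which is why the window is worth keeping short and user-tunable.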
Accomplishments that we're proud of
- Successfully creating a fully functional, webcam-based AAC device that costs $0.
- Achieving 90%+ blink detection accuracy at 30 FPS for real-time communication.
- Seamlessly integrating AI-driven blink detection, Morse code translation, and natural-sounding speech.
- Making EyeCode open-source and customizable for accessibility and research purposes.
- Demonstrating the potential for low-cost, high-impact assistive technology that can empower millions globally.
What we learned
- Computer vision in real-world applications: Learned how to track subtle facial movements under varied conditions.
- Real-time systems design: Gained experience building a multi-threaded pipeline combining video, AI inference, and TTS.
- Accessibility-focused design: Learned the importance of user-centric interfaces, customization, and intuitive feedback for people with disabilities.
- Ethical considerations: Appreciated the responsibility of designing technology that directly impacts vulnerable populations.
What's next for EyeCode: Speak With Your Eyes!
- Improve blink detection and adaptive calibration for varied lighting and eye shapes.
- Integrate with AR/VR headsets, like Meta devices, for hands-free communication and immersive accessibility.
- Add customizable shortcuts for common phrases to speed up communication.
- Expand support for multiple languages and dialects for global accessibility.
- Explore AI-driven predictive text to anticipate user intent and reduce effort.