Inspiration
I was moved by the book Brain on Fire and the movie Awakenings, both of which depict the terror of locked-in states caused by brain inflammation. Patients in neurological crisis can lose their voice entirely, and I wanted to find a way to offer them a chance at vocal interaction again.
What it does
ClarityBoard is a neuro-assistive communication tool for patients who cannot speak or move their hands: it lets them communicate using only their eyes. When the user gazes at a large, colorful button for two seconds, the app speaks the corresponding request aloud using text-to-speech.
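In a browser, that last step can be as simple as the built-in SpeechSynthesis API; a minimal sketch (the phrase here is a placeholder, not the app's actual vocabulary):

```typescript
// Speak a selected phrase aloud with the browser's built-in SpeechSynthesis API.
const utterance = new SpeechSynthesisUtterance("I need water"); // placeholder phrase
utterance.rate = 0.9; // slightly slower delivery is easier to catch in a noisy room
window.speechSynthesis.speak(utterance);
```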
How we built it
The app is built with React and the WebGazer.js eye-tracking library. I built a four-corner calibration system to improve accuracy, and a dwell-time selection routine that infers intent by measuring how long the user's gaze stays within a particular quadrant of the screen.
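Here is a minimal sketch of that dwell loop, assuming WebGazer's documented setGazeListener callback; the quadrant map, phrases, and speak() helper are illustrative rather than the app's exact code:

```typescript
import webgazer from "webgazer";

const DWELL_MS = 2000; // two-second dwell threshold

// Illustrative phrase map: one spoken request per screen quadrant.
const phrases: Record<string, string> = {
  topLeft: "I need water",
  topRight: "I am in pain",
  bottomLeft: "Please call the nurse",
  bottomRight: "Yes",
};

let currentQuadrant: string | null = null;
let dwellStart = 0;

// Map a raw gaze coordinate to one of the four screen quadrants.
function quadrantOf(x: number, y: number): string {
  const left = x < window.innerWidth / 2;
  return y < window.innerHeight / 2
    ? (left ? "topLeft" : "topRight")
    : (left ? "bottomLeft" : "bottomRight");
}

function speak(text: string): void {
  window.speechSynthesis.speak(new SpeechSynthesisUtterance(text));
}

webgazer
  .setGazeListener((data: { x: number; y: number } | null, elapsedMs: number) => {
    if (!data) return; // WebGazer emits null when it can't find the eyes

    const quadrant = quadrantOf(data.x, data.y);
    if (quadrant !== currentQuadrant) {
      // Gaze moved to a different quadrant: restart the dwell timer.
      currentQuadrant = quadrant;
      dwellStart = elapsedMs;
    } else if (elapsedMs - dwellStart >= DWELL_MS) {
      // Sustained gaze: treat it as an intentional selection and speak it.
      speak(phrases[quadrant]);
      dwellStart = elapsedMs; // reset so a held stare repeats after another full dwell
    }
  })
  .begin();
```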
Challenges we ran into
My biggest challenge was nystagmus, the involuntary eye shaking that is common after brain trauma. It makes the raw eye-tracking data jittery, which in turn makes it hard to hold a gaze steady enough to select a button.
Accomplishments that we're proud of
I'm proud of taking on a difficult concept like ocular-based AAC (Augmentative and Alternative Communication) and making it work in a standard website.
What we learned
I learned about the Superior Colliculus and how the brain processes involuntary visual reflexes.
What's next for ClarityBoard
The next step is implementing a moving average filter to stabilize the gaze signal for patients with severe tremors.
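A simple version keeps a sliding window of the most recent gaze samples and averages them before the quadrant check runs; a sketch of the idea, with the window size as an assumption to tune against real tremor data:

```typescript
interface GazePoint {
  x: number;
  y: number;
}

const WINDOW_SIZE = 10; // assumed window length; to be tuned per patient
const samples: GazePoint[] = [];

// Average the last WINDOW_SIZE gaze samples to damp tremor-induced jitter.
function smooth(raw: GazePoint): GazePoint {
  samples.push(raw);
  if (samples.length > WINDOW_SIZE) samples.shift(); // drop the oldest sample

  const sum = samples.reduce(
    (acc, p) => ({ x: acc.x + p.x, y: acc.y + p.y }),
    { x: 0, y: 0 }
  );
  return { x: sum.x / samples.length, y: sum.y / samples.length };
}
```

Inside the gaze listener, the quadrant check would then run on `smooth({ x: data.x, y: data.y })` instead of the raw point. The tradeoff is latency: a larger window is more stable but lags the true gaze, so the window size trades stability against responsiveness.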
Built With
- copilot
- css
- cursor
- html
- javascript
- json
- typescript