Inspiration
Motor neurone diseases such as ALS can devastate people's lives and livelihoods in a shockingly short time.
As the neurodegenerative disease spreads through the brain and spinal cord, those diagnosed with it lose most of their voluntary muscle control.
Often, all that remains is the ability to move the eyes freely. Stephen Hawking was perhaps the best-known person to live with ALS.
Hence, we decided to help people with these conditions communicate in plain English by developing an app that tracks their eye movements, letting them navigate our simple four-panel UI and speak again.
What it does
The app tracks the direction in which the eye is looking and converts that input into an English sentence, which can in turn be converted to speech.
The eye-tracking tool we built using OpenCV recognizes six events:
- Top_left
- Top_right
- Bottom_left
- Bottom_right
- Blink
- Eye Closed
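Two of these events differ only in how long the eyes stay shut, so one way to tell a quick blink from a deliberate eye closure is by duration. A minimal sketch; the threshold and function name are assumptions for illustration, not our actual code:

```python
BLINK_MAX_SECONDS = 0.4  # assumed cut-off; a real threshold would be tuned per user

def classify_closure(closed_duration):
    """Label an eye-closure event by how long the eyes stayed shut (seconds)."""
    if closed_duration <= BLINK_MAX_SECONDS:
        return "Blink"      # quick, involuntary closure
    return "Eye Closed"     # deliberate, held closure
```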
The app screen, built using Android Studio and Firebase, is divided into four panels:
- Top_left
- Top_right
- Bottom_left
- Bottom_right
How we built it
Front-end
- We used Android Studio to build the app natively for Android.
- Through trial and error we arrived at a UI that is both simple and easy to understand.
- This intuitive layout is the core of our entire app design.
- We chose colour palettes that complement the background and give the app a fluid, frictionless look.
Back-end
- We decided to use Firebase for both authentication and storage of data.
- All the required words and sub-words are stored online in a Realtime Database, from which they can be fetched at any time for display.
- We used Firebase Authentication for email and password sign-in, thanks to its built-in support for features such as password resets.
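Firebase's Realtime Database also exposes node contents as JSON over a REST endpoint, so fetching the word list can be sketched with nothing but the standard library. The database URL and the `words` node name below are placeholder assumptions, not our actual schema:

```python
import json
from urllib.request import urlopen

# Placeholder project URL, not our real database.
DB_URL = "https://example-project.firebaseio.com"

def node_url(node="words"):
    """Build the REST URL for a database node (the Firebase REST API appends .json)."""
    return f"{DB_URL}/{node}.json"

def fetch_words(node="words"):
    """Fetch and decode the words stored under the given node."""
    with urlopen(node_url(node)) as resp:
        return json.loads(resp.read().decode("utf-8"))
```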
Eye-tracking
- We used OpenCV and dlib to do the eye-tracking.
- We divided the OpenCV frame into four quadrants; the quadrant containing the eye's gaze is written to Firebase as a position.
- The app then reads this position to select the relevant panel.
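The quadrant lookup above reduces to comparing the detected pupil position against the frame's midlines. A minimal sketch, assuming the pupil centre has already been extracted (the function and parameter names are ours, for illustration):

```python
def classify_quadrant(x, y, frame_w, frame_h):
    """Map a pupil centre (x, y) in a frame of size frame_w x frame_h
    to one of the four quadrant labels used by the app's panels."""
    vertical = "Top" if y < frame_h / 2 else "Bottom"
    horizontal = "left" if x < frame_w / 2 else "right"
    return f"{vertical}_{horizontal}"
```

The resulting label (e.g. `Top_left`) is what would be pushed to Firebase for the app to read.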
Challenges we ran into
- Getting OpenCV working; it was our first time using it.
- Connecting Firebase and building a completely new UI.
What's next for Parahelp: giving the power of speech to everyone
- Research the English language further to make better use of the four-panel layout.
- Use machine learning on user data to make accurate per-person recommendations for common sentences or phrases.
- Improve our use of OpenCV and eliminate the uploading and downloading of positions via Firebase by running the tracking natively on the phone.
