Inspiration

One of our teammates had a grandmother with Parkinson's; towards the end, she had difficulty moving or speaking at all. Her nurse said the main way they communicated was with simple blinks: once for yes, twice for no. "Eye Language" expands on this form of communication by letting patients perform simple tasks just by looking at different quadrants of a screen.

What it does

We created eye-tracking software that detects which direction you are looking. By glancing at any of the four corners of your vision, you can trigger actions such as calling for a nurse, opening the curtains, turning on the lights, or even expressing your love!

How we built it

The project runs two main programs. A Python script runs our eye-tracking software and plays sound effects, while an Arduino sketch processes the eye-tracking data and performs the actions: it drives two servos for an automatic curtain and switches a series of LEDs on and off.
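The Python side can be sketched roughly as below. This is a minimal illustration, assuming the eye tracker reports a normalized gaze point (x, y) in [0, 1] x [0, 1]; the quadrant names, single-byte command codes, and serial protocol here are our own placeholders, not the project's exact code.

```python
# Illustrative sketch: map a normalized gaze point to a screen quadrant
# and look up the one-byte command to send to the Arduino.
# The action mapping and byte codes are assumptions for this example.

ACTIONS = {
    "top_left": b"N",      # call for a nurse
    "top_right": b"C",     # open/close the curtains
    "bottom_left": b"L",   # toggle the lights
    "bottom_right": b"H",  # "I love you"
}

def quadrant(x: float, y: float) -> str:
    """Classify a normalized gaze point (x, y) into one of four quadrants."""
    horiz = "left" if x < 0.5 else "right"
    vert = "top" if y < 0.5 else "bottom"
    return f"{vert}_{horiz}"

def command_for(x: float, y: float) -> bytes:
    """Return the byte the Arduino sketch would act on for this gaze point."""
    return ACTIONS[quadrant(x, y)]

# In the real loop, this byte would be written to the Arduino over a
# serial port (e.g. with pyserial): port.write(command_for(x, y))
```

On the Arduino side, the sketch would read one byte at a time from serial and switch on it to move the curtain servos or toggle the LEDs.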

Challenges we ran into

The eye-tracking library we used was buggy and difficult to work with, so we had to run multiple rounds of testing and calibration to get consistent results. Because of how our framework is split, we also had to manage communication between the Python script and the Arduino with minimal delay.

Accomplishments that we're proud of

We were able to complete the hardware for four actions and have a project that works surprisingly well!
