Inspiration

Over spring break, a team member had to fly home from school after a family member suffered a stroke that left them paralyzed and unable to speak. With that personal connection to a real user, the inspiration for Butterfl-eye was born.

What it does

Using eye-tracking software, patients select preset messages (or type new ones with the on-screen keyboard) to communicate their needs to others. Patients who retain some motor ability can instead use the Arduino's touch sensor to interact with the program. The Arduino also produces lights and sound alongside the alerts to notify others.

This was built using

We used the WebGazer API for the eye tracking. Using JavaScript, HTML, Python, and jQuery, we were able to set up the user interface with a functional on-screen keyboard. An Arduino was used for the physical interaction.
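WebGazer reports gaze coordinates through a listener callback, which can then be mapped onto on-screen message buttons. A minimal sketch of that mapping step (the button layout and `hitTest` helper here are illustrative assumptions, not our actual code):

```javascript
// Illustrative only: WebGazer would supply (x, y) gaze coordinates via
//   webgazer.setGazeListener((data) => { if (data) hitTest(data.x, data.y, buttons); }).begin();

// Hypothetical message buttons, each with a screen rectangle and a message.
const buttons = [
  { x: 0,   y: 0, width: 200, height: 100, message: "I need water" },
  { x: 220, y: 0, width: 200, height: 100, message: "Call the nurse" },
];

// Return the message whose rectangle contains the gaze point, or null if none.
function hitTest(gx, gy, targets) {
  for (const b of targets) {
    if (gx >= b.x && gx < b.x + b.width && gy >= b.y && gy < b.y + b.height) {
      return b.message;
    }
  }
  return null;
}
```

In the real app the selected message would then be displayed or spoken; the same hit test works for keyboard keys.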

Challenges we ran into

Ideally, we wanted our web app to run on the domain we reserved. We ran into lots of challenges with Amazon Web Services and were ultimately not able to get our site live, though the domain is still registered to us. At the very last minute, we had to switch our entire hardware stack from Arduino to Intel Edison due to hardware issues. THEN we had to switch back to Arduino in the last 10 minutes. Whew.


Accomplishments that we're proud of

None of us had ever used any of these interfaces before. We created a functional app with little previous knowledge, so hey, that's something to be proud of.

What we learned

We learned that we are capable of anything given 18 hours and a whole lot of snacks, ha. One group member is a Physics major rather than CS and has little coding experience, and we were able to put her skills to work on our hardware. Yay!

What's next for Butterfl-eye

Ideally, Butterfl-eye would recognize a blink or an extended glance as a selection, as opposed to a click. Even better, using AI, the messages would become customizable (similar to iMessage predictive text), learning what users commonly say.
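The extended-glance idea amounts to dwell-time selection: fire a "click" once the gaze has rested on the same target long enough. A hypothetical sketch of that logic (the threshold and function names are our assumptions, since this feature isn't implemented yet):

```javascript
// Hypothetical dwell-time selector: calls onSelect(target) once the gaze
// has stayed on the same target for at least dwellMs milliseconds.
function makeDwellSelector(dwellMs, onSelect) {
  let currentTarget = null;
  let dwellStart = null;
  let fired = false;

  // Call on every gaze sample with the target under the gaze (or null)
  // and a timestamp in milliseconds.
  return function update(target, timestampMs) {
    if (target !== currentTarget) {
      // Gaze moved to a different target: restart the dwell timer.
      currentTarget = target;
      dwellStart = timestampMs;
      fired = false;
    } else if (target !== null && !fired && timestampMs - dwellStart >= dwellMs) {
      // Gaze has rested long enough: treat it as a selection, once per dwell.
      fired = true;
      onSelect(target);
    }
  };
}
```

Fed from WebGazer's gaze listener (after hit-testing the gaze point against the message buttons), this would replace the mouse click entirely.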
