Inspiration

When one of our team members was in a train terminal this summer, a man approached her and her family to ask for help. However, he was deaf, and this presented a challenge in communication. Thankfully, her sister was fluent in American Sign Language (ASL) and was able to help him.

While ASL is a language option at the high school all of our team members come from, it is not a widely available option for everyone. Because of this, there is a division between the deaf and hearing communities. If a widely available cell phone app were able to help, there could be a lasting positive impact. Nearly 28 million Americans are Deaf or Hard of Hearing, which is far more common than many people realize!

What it does

Unlike spoken languages, which rely on sound, ASL is entirely visual, which makes machine learning and computer vision a natural fit. By having EyeSpy recognize basic ASL hand gestures (the alphabet and numbers), Deaf and hearing people can communicate with fewer barriers.

The app is meant to be used by a hearing person who is not fluent in ASL: they can download it and then use it to interact with someone who is HoH/Deaf. Possible use cases include a hearing teacher interacting with a HoH/Deaf student, a hearing cashier with a HoH/Deaf customer, or even two complete strangers. Bridging the gap between the hearing and Deaf communities opens the door to awesome things.

How we built it

EyeSpy is an application written in Python and built on OpenCV. Working from the live camera feed, the program can recognize the 26 basic letters of the alphabet and show the corresponding letter on the screen.
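To give a sense of how the pieces fit together, here is a minimal sketch of an OpenCV capture loop that crops a hand region, classifies it, and overlays the result. The `predict_letter` helper, the region-of-interest coordinates, and the key bindings are illustrative assumptions, not the project's actual code; a real version would plug in the trained gesture classifier there.

```python
# Sketch of an EyeSpy-style capture loop (illustrative, not the original code).
import cv2

def predict_letter(roi) -> str:
    """Hypothetical stand-in for the gesture classifier; a real implementation
    would map the hand image in `roi` to one of the 26 ASL letters."""
    return "?"

def main():
    cap = cv2.VideoCapture(0)  # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Fixed region of interest where the signer holds their hand
        # (coordinates chosen for illustration only).
        x, y, w, h = 100, 100, 300, 300
        roi = frame[y:y + h, x:x + w]
        letter = predict_letter(roi)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, letter, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
        cv2.imshow("EyeSpy", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```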

Challenges

None of our group members had ever attended a hackathon before, nor had any of us used OpenCV. Additionally, three of our four team members are freshmen. These factors created some difficulties, from getting OpenCV to run on a member's computer in the first place to managing our time. However, we were able to finish the hack in the end.

Accomplishments

We're proud of being able to put together the basic features of the app in such a short time period. Since this was our first hackathon, none of us knew what to expect. However, we put in a good effort and hope to apply the lessons we've learned to our next event. One of our team members also put together a great website, which was an excellent learning experience.

What we learned

We learned lessons about effective time management, both what to do and what not to do. We also learned about using Python and OpenCV for gesture and object recognition, which is something members of our group may look into further in the future; a small example of the kind of approach we explored is sketched below.
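The snippet below shows one common way to start gesture recognition in OpenCV: isolating a hand with a skin-color threshold and picking out its largest contour. The HSV range is a rough assumption that would need tuning for camera and lighting, and this is a sketch of the general technique rather than the exact pipeline used in our hack.

```python
# Illustrative hand-isolation step via skin-color thresholding (assumed
# parameters; not necessarily the approach used in the final project).
import cv2
import numpy as np

def largest_hand_contour(frame):
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Rough skin-tone range in HSV (assumption; adjust for your setup).
    lower, upper = np.array([0, 30, 60]), np.array([20, 150, 255])
    mask = cv2.inRange(hsv, lower, upper)
    # Remove small specks of noise before looking for contours.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # The biggest skin-colored blob is assumed to be the hand.
    return max(contours, key=cv2.contourArea)
```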

What's next for EyeSpy

In the future, we would love to port EyeSpy to mobile devices. Having an on-the-go translator in the pocket of every person who isn't fluent in American Sign Language could benefit both the hearing and Deaf communities greatly. We would also like to bring in members of the Deaf community to consult for us and tailor the app in ways that benefit them most.

Built With

Python, OpenCV