Clear communication matters now more than ever. With over 72 million deaf people in the world (UN), it is vital that those who are deaf or hard of hearing can access the services they need safely and comfortably. From official communications to visiting small businesses, deaf customers need a quick and easy aid for the very likely case that service or retail workers cannot communicate with them in sign language.

What it does

Sign Vision is an Android app that improves communication and accessibility at local businesses and services.

Through Sign Vision, users can point their phone's camera at someone performing sign language and receive a text translation in real time. Conversely, users can speak to the phone and receive a sign language interpretation of their speech. Finally, typing text into Sign Vision also produces a sign language translation.

How we built it

For the frontend of the app, we designed the prototype in Figma and implemented the designs in Android Studio using Java and XML. For our backend we used Node.js and Google APIs. To recognize which sign was being performed on camera, we trained a machine learning model with TensorFlow.
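The recognition path can be sketched in two small steps: flattening a camera frame into the tensor the model expects, and mapping the model's per-letter scores back to text. A minimal Java sketch, assuming a hypothetical 28×28 grayscale input and a 26-way (A–Z) output; the real model's shapes may differ:

```java
public class SignPipeline {
    static final int WIDTH = 28, HEIGHT = 28; // assumed input shape, for illustration

    // pixels: ARGB ints, e.g. as delivered by Android's Bitmap.getPixels()
    public static float[] toInputTensor(int[] pixels) {
        float[] input = new float[WIDTH * HEIGHT];
        for (int i = 0; i < pixels.length; i++) {
            int p = pixels[i];
            int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
            // average the channels to grayscale, then scale to [0, 1]
            input[i] = ((r + g + b) / 3f) / 255f;
        }
        return input;
    }

    // scores: one value per letter A-Z, e.g. the model's softmax output
    public static char decode(float[] scores) {
        int best = 0;
        for (int i = 1; i < scores.length; i++) {
            if (scores[i] > scores[best]) best = i;
        }
        return (char) ('A' + best);
    }
}
```

In the app, the tensor produced by the first step is fed to the trained TensorFlow model, and the second step turns the model's output into the letter shown on screen.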

Challenges we ran into

  • Implementing machine learning with TensorFlow
  • Our team had no prior experience with machine learning; however, it was great fun to explore!
  • Adding American Sign Language recognition
  • Familiarizing ourselves with Android Studio

Accomplishments that we're proud of

  • Using Machine Learning despite no prior experience and long wait times (8 hours!)
  • Replicating the Figma design exactly in Android Studio
  • Figuring out how to debug in Android Studio with Logcat

What we learned

  • Our team developed a deep appreciation for how powerful machine learning can be. The sign recognition portion of the app would have taken thousands of lines of code if written by hand, yet with machine learning we could train a computer to recognize signs over a weekend.
  • How to build an Android app with XML layouts, as well as how to properly use the Android Studio debugger.
  • How to make GET and POST requests with Retrofit, an HTTP client library
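Retrofit declares endpoints as an annotated interface and handles the HTTP calls behind it. As a dependency-free sketch of the same GET/POST request shapes, using the JDK's built-in `HttpRequest` builder instead of Retrofit (the base URL and paths here are hypothetical, not the app's real backend):

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class RequestSketch {
    // Hypothetical endpoint for illustration only
    static final String BASE = "https://example.com/api/";

    // GET: fetch the sign image for a given letter
    public static HttpRequest getSign(String letter) {
        return HttpRequest.newBuilder()
                .uri(URI.create(BASE + "signs/" + letter))
                .GET()
                .build();
    }

    // POST: send transcribed speech for translation
    public static HttpRequest postSpeech(String json) {
        return HttpRequest.newBuilder()
                .uri(URI.create(BASE + "translate"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();
    }
}
```

With Retrofit, the same two endpoints would be methods on an interface annotated with `@GET` and `@POST`, and the library generates the implementation.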

What's next for Sign Vision

  • Creating more learning features that help users learn sign language
  • Expanding the vocabulary beyond individual letters and implementing gestures
  • Expanding to iOS devices and to web applications!
