Inspiration

We were inspired by the rise of advanced hand-motion sensing technology such as Google's Project Soli. We wanted to apply this technology to help people who rely on assistive technology and to improve the mobile experience for these users.

What it does

Handslation is an augmented reality tool that recognizes American Sign Language (ASL), specifically the ASL manual alphabet. It attempts to convert hand gestures into letters of the English alphabet, which can then be combined into words. The tool can be used as assistive technology for those who are hearing impaired.
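To give a concrete sense of the letters-to-words step, here is a minimal Python sketch of how per-frame letter predictions could be accumulated into words. It is an illustration under assumed thresholds, not the exact logic in our app: a letter is committed once it dominates a short voting window, and a frame with no hand detected is treated as a word boundary.

```python
from collections import deque


class LetterBuffer:
    """Accumulate per-frame letter predictions into words (illustrative sketch).

    A letter is committed once it wins a majority of the recent window;
    a frame with no hand detected (letter is None) ends the current word.
    Window size and vote threshold are assumptions, not tuned values.
    """

    def __init__(self, window=15, min_votes=10):
        self.window = deque(maxlen=window)
        self.min_votes = min_votes
        self.word = []

    def push(self, letter):
        """Feed one per-frame prediction; return a finished word or None."""
        if letter is None:                      # no hand in view -> word boundary
            finished = "".join(self.word)
            self.word.clear()
            self.window.clear()
            return finished or None
        self.window.append(letter)
        top = max(set(self.window), key=self.window.count)
        if self.window.count(top) >= self.min_votes and (not self.word or self.word[-1] != top):
            self.word.append(top)               # commit the letter once it is stable
        return None
```

In use, each camera frame's predicted letter (or None) would be pushed into the buffer, and any non-None return value is a completed word ready to display.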

How we built it

We built it with Android Studio, using Bazel in a Linux environment. Most of the dynamic hand tracking is done with OpenCV in Python, which is then loaded onto an Android app that runs on the phone's GPU and processing power.
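As a rough sketch of what the OpenCV side of such a pipeline can look like, the Python snippet below segments a hand by skin-color thresholding and draws a bounding box around the largest detected blob. This is an illustration under assumed HSV thresholds, not the actual tracking code in the app.

```python
import cv2
import numpy as np


def detect_hand(frame):
    """Rough hand segmentation via skin-color thresholding (illustrative only)."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Approximate skin-tone range in HSV; these bounds are untuned placeholders.
    lower = np.array([0, 30, 60], dtype=np.uint8)
    upper = np.array([20, 150, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)   # assume the largest blob is the hand
    return cv2.boundingRect(hand)


cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    box = detect_hand(frame)
    if box is not None:
        x, y, w, h = box
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("hand", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```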

Challenges we ran into

One major challenge was setting up an environment for model training in Google Cloud Shell, which took a significant amount of time to learn and implement.

Accomplishments that we're proud of

We are proud that we were able to get hand-movement detection running on an Android phone.

What we learned

We learned about mobile app development with Android Studio and how to use Google's AutoML Vision tools to set up a machine learning environment.
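For reference, here is a minimal sketch of calling a trained AutoML Vision classification model from Python with the google-cloud-automl client library; the project ID, model ID, and image path are placeholders, and the app does not necessarily query the model this way.

```python
from google.cloud import automl

# Placeholder identifiers; replace with real project/model values.
project_id = "your-project-id"
model_id = "your-model-id"
image_path = "sample_hand_frame.jpg"

prediction_client = automl.PredictionServiceClient()
model_full_id = f"projects/{project_id}/locations/us-central1/models/{model_id}"

with open(image_path, "rb") as f:
    content = f.read()

payload = automl.ExamplePayload(image=automl.Image(image_bytes=content))
response = prediction_client.predict(name=model_full_id, payload=payload)

# Each result is a predicted letter class with a confidence score.
for result in response.payload:
    print(result.display_name, result.classification.score)
```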

What's next for Handslation

We hope to continue working toward a full implementation of the project. We also hope to support sign languages other than ASL, such as Chinese Sign Language, and to further improve the tool's value as assistive technology for those who are hearing impaired.
