We wanted to provide an easy, interactive, and ultimately fun way to learn American Sign Language (ASL). We had the opportunity to work with Leap Motion hardware, which let us track intricate, real-time data about hand movements. Using this data, we believed we could decipher complex ASL gestures.

What it does

Using the Leap Motion's motion-tracking technology, the app prompts the user to replicate various ASL gestures. It then gives real-time feedback on how closely the user's gesture matched the reference hand motion, so users can immediately adjust their technique and ultimately perfect their ASL!
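One way the accuracy feedback could work is a simple distance comparison between the user's captured frames and a prerecorded reference gesture. The sketch below assumes each frame is an array of `[x, y, z]` joint positions; the function names and the `MAX_DIST` tolerance are our own illustrative choices, not part of any library.

```javascript
// Mean Euclidean distance between corresponding joint positions in two frames.
function frameDistance(a, b) {
  let total = 0;
  for (let i = 0; i < a.length; i++) {
    const dx = a[i][0] - b[i][0];
    const dy = a[i][1] - b[i][1];
    const dz = a[i][2] - b[i][2];
    total += Math.sqrt(dx * dx + dy * dy + dz * dz);
  }
  return total / a.length;
}

// Assumed tolerance (in mm) beyond which a gesture scores 0.
const MAX_DIST = 150;

// Map the average frame distance onto a 0-100 accuracy score.
function gestureAccuracy(userFrames, referenceFrames) {
  const n = Math.min(userFrames.length, referenceFrames.length);
  let sum = 0;
  for (let i = 0; i < n; i++) {
    sum += frameDistance(userFrames[i], referenceFrames[i]);
  }
  const avg = sum / n;
  return Math.max(0, Math.round(100 * (1 - avg / MAX_DIST)));
}
```

A perfect replication scores 100, and the score falls off linearly as the user's joints drift from the reference positions.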


How I built it

A web app built with JavaScript, HTML, and CSS. We trained our data using various machine learning repositories to ensure accurate recognition, along with other plugins that let us visualize the hand movements in real time.
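Before any recognition can happen, each tracked frame has to be reduced to a numeric feature vector. A minimal sketch, assuming leapjs-style frame objects where each hand exposes `palmPosition` and an array of `fingers` with `tipPosition` (all `[x, y, z]` triples); `frameFeatures` is our own name:

```javascript
// Turn a Leap-style frame into a flat numeric feature vector.
// Fingertip positions are taken relative to the palm so the features
// are invariant to where the hand sits above the sensor.
function frameFeatures(frame) {
  const features = [];
  for (const hand of frame.hands) {
    const [px, py, pz] = hand.palmPosition;
    for (const finger of hand.fingers) {
      const [tx, ty, tz] = finger.tipPosition;
      features.push(tx - px, ty - py, tz - pz);
    }
  }
  return features;
}
```

With a full five-finger hand this yields 15 numbers per hand per frame, which can be fed to a classifier or compared against reference gestures.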

Challenges I ran into

Training the data was difficult because gestures are complex forms of data, composed of many data points across the hand's joints and bones as well as the progression of hand "frames" over time. As a result, we had to collect a lot of data to build a thorough dataset that mapped these features to the correct ASL label (or phrase).
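Because a gesture spans many frames, one practical step is to resample each recording to a fixed temporal length before pairing it with its label. This is a sketch under our own assumptions (each recording is an array of per-frame feature vectors; `NUM_FRAMES`, `resampleFrames`, and `makeExample` are hypothetical names, not from any particular library):

```javascript
// Fixed number of frames fed to the classifier per gesture.
const NUM_FRAMES = 8;

// Pick NUM_FRAMES evenly spaced frames so fast and slow signers
// produce vectors of the same length.
function resampleFrames(frames) {
  const out = [];
  for (let i = 0; i < NUM_FRAMES; i++) {
    const idx = Math.round((i * (frames.length - 1)) / (NUM_FRAMES - 1));
    out.push(frames[idx]);
  }
  return out;
}

// One labelled training example: flattened frame features plus the ASL label.
function makeExample(frames, label) {
  return { features: resampleFrames(frames).flat(), label };
}
```

Each recording then contributes one fixed-length row to the dataset, regardless of how long the signer took to perform the gesture.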

Accomplishments that I'm proud of

The user interface. Training the data. Working on a project that could genuinely impact others!

What I learned

Hard work and dedication. Computer vision. Machine Learning.

What's next for Leap Motion ASL

More words? Game mode? Better training? More phrases? More complex combos of gestures?


Built With

JavaScript, HTML, CSS, Leap Motion