Inspiration

We were inspired by Leap Motion's technology. Seeing how much the Leap sensor could do by tracking hands, we thought we could create an AI to teach people sign language.

What it does

With the help of cutting-edge hand tracking and virtual reality technology, ASL VR provides a truly immersive learning experience fit for the 21st century. Instead of flipping open a dusty textbook or trying to read from your phone screen, enter our simulated classroom and receive instant feedback on your learning progress with Leap Motion hand tracking.

How we built it

Our Unity VR project uses Leap Motion tracking to plot points on your hands. We spent a lot of time nailing down the detection of each hand gesture and measuring how closely each signed letter matched our training data.
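The actual detection lives in our Unity/C# project, but a minimal sketch of the kind of matching it describes might look like the Python below. The feature layout, the per-letter training dictionary, and the distance threshold are all assumptions for illustration, not our real code:

```python
import numpy as np

def classify_letter(features, training, threshold=0.5):
    """Compare a live hand-pose feature vector against per-letter training samples.

    `training` is assumed to map each letter (e.g. "A") to a list of feature
    vectors recorded while signing it. The letter whose samples are closest
    on average wins, provided the distance stays under a tolerance; otherwise
    no letter is reported.
    """
    best_letter, best_score = None, float("inf")
    for letter, samples in training.items():
        # Mean Euclidean distance from the live pose to this letter's samples.
        score = np.mean([np.linalg.norm(features - s) for s in samples])
        if score < best_score:
            best_letter, best_score = letter, score
    # The threshold keeps ambiguous poses from being reported as a letter.
    return best_letter if best_score < threshold else None
```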

Challenges we ran into

HOURS were spent finding the right mixture of angles between points, distances between points, which points to focus on, and how to retrieve accurate relative data.
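To make that feature mix concrete, here is a rough sketch of how palm-relative positions, pairwise distances, and angles could be combined into one descriptor. The joint layout, the chosen point pairs, and the function names are hypothetical, not the Leap Motion API:

```python
import numpy as np

def hand_features(joints, palm_pos, palm_normal):
    """Build a pose descriptor from Leap-style joint positions.

    `joints` is assumed to be an (N, 3) array of fingertip/knuckle positions,
    and `palm_pos` / `palm_normal` describe the palm; all names are illustrative.
    """
    joints = np.asarray(joints, dtype=float)

    # Relative data: express every joint in a palm-centred frame so the
    # descriptor does not change when the whole hand moves around the room.
    rel = joints - np.asarray(palm_pos, dtype=float)

    # Distances between selected point pairs (e.g. thumb tip to index tip).
    pairs = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 4)]
    dists = [np.linalg.norm(rel[i] - rel[j]) for i, j in pairs]

    # Angles between each joint direction and the palm normal.
    def angle(u, v):
        cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)
        return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

    angles = [angle(rel[i], palm_normal) for i in range(len(rel))]

    return np.array(dists + angles)
```

Which pairs and angles to include, and how to weight them, is exactly the tuning that ate up those hours.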

Accomplishments that we're proud of

We persevered through the repetitive task of training and retraining each letter every time we updated the gesture detection, and found an accurate way to differentiate each sign.

What we learned

Sign language, of course, but also how to cross-reference data to calculate accuracy.

What's next for ASL VR

Expanded vocabulary, velocity detection, and a tutor system.
