There aren't many resources online to learn ASL (American Sign Language). We wanted to make a tool, inspired by Duolingo, to make ASL education more accessible to language learners. We also wanted to make it an interactive learning experience using a LeapMotion device, a relatively inexpensive device for 3D motion and gesture control.

What it does - The tool shows you new words and phrases, then invites you to try them out using a LeapMotion. It provides feedback on whether the correct gesture was detected, and the lesson continues by building phrases from words you've already learned. (Dev hack: use the arrow keys to skip to the next word)
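The lesson flow above can be sketched roughly as follows. This is a minimal illustration with hypothetical function names, not the app's actual code: a word is shown, the learner signs it, and the detected gesture is compared against the expected one.

```python
from typing import Optional

def check_gesture(expected: str, detected: Optional[str]) -> str:
    """Return feedback for a single signing attempt."""
    if detected is None:
        return "No gesture detected - try again"
    if detected == expected:
        return "Correct!"
    return f"Detected '{detected}', expected '{expected}' - try again"

def build_phrase(learned: list, new_word: str) -> list:
    """Phrases reuse words the learner has already seen."""
    return learned + [new_word]
```

For example, after learning "hello", the next exercise might combine it with a new word into the phrase `["hello", "thank you"]`.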

How we built it

A Flask backend with a Backbone.js frontend.
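As a rough sketch of this architecture (the route and lesson data here are illustrative, not the project's actual API), the Flask backend can serve lesson content as JSON for the Backbone frontend to render:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical lesson data; the real app would load lessons from its own store.
LESSONS = {1: {"words": ["hello", "thank you", "please"]}}

@app.route("/lesson/<int:lesson_id>")
def get_lesson(lesson_id):
    """Return one lesson's words as JSON, or a 404 if it doesn't exist."""
    lesson = LESSONS.get(lesson_id)
    if lesson is None:
        return jsonify(error="not found"), 404
    return jsonify(lesson)
```

The Backbone side would then fetch `/lesson/1` into a model and render the word list.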

Challenges we ran into

The LeapMotion has its limitations: its sensors can't always detect parts of the hand hidden from view (e.g. fingers moving while the palm faces away from the device), so it makes a best guess. The VR community is growing rapidly, and we're hoping to eventually support accessories with more advanced gesture recognition.

What's next for Arcsign

We'd like to crowdsource lesson creation by making it easier to train new gestures, using recognition strategies such as algebraic cross-correlation and neural networks. We weren't sure this was feasible with the LeapMotion alone, but it might be doable with more reliable input, such as a sensor-equipped glove.
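To illustrate the cross-correlation idea, a recorded gesture signal (say, one fingertip coordinate over time) could be compared to a trained template. This is a simplified sketch, assuming equal-length signals and a zero-lag comparison; a real system would also handle alignment and multiple channels:

```python
import math

def normalized_cross_correlation(a, b):
    """Zero-lag normalized cross-correlation of two equal-length signals.

    Returns a value in [-1, 1]; values near 1 mean the signal shapes match
    closely, regardless of amplitude.
    """
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def matches_template(signal, template, threshold=0.9):
    """Accept the gesture if its shape correlates strongly with the template."""
    return normalized_cross_correlation(signal, template) >= threshold
```

Because the score is normalized, a learner signing the same shape faster or with a larger hand still scores near 1, which is why this family of methods is attractive for crowdsourced templates.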
