Inspiration

We both have a background in ASL and wanted to leverage that knowledge to build something awesome!

What it does

Interprets ASL hand shapes and translates them into written English.

How we built it

Hand, finger, and joint orientations are read in real time using a Leap Motion controller. This data is then fed into a neural network that we trained to classify different ASL signs from the resulting input vectors.
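
Roughly, the pipeline looks like the sketch below. It is illustrative only: the hand.palm_normal / finger.direction attribute names follow the Leap Python SDK but may differ by version, and the scikit-learn classifier and random stand-in data substitute for our actual network and recorded frames.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

N_FEATURES = 21  # palm normal (3) + palm direction (3) + 5 fingers x 3

def hand_to_features(hand):
    """Flatten palm orientation and per-finger directions into one vector.

    Attribute names mirror the older Leap Python SDK and are illustrative.
    """
    feats = list(hand.palm_normal) + list(hand.direction)
    for finger in hand.fingers:
        feats.extend(finger.direction)
    return np.array(feats, dtype=np.float32)  # shape: (N_FEATURES,)

# Stand-in training data: in practice each row would come from
# hand_to_features() on a recorded frame, labeled with the sign being held.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, N_FEATURES))
y_train = rng.choice(list("ABC"), size=200)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(clf.predict(X_train[:1]))  # e.g. ['A']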

Challenges we ran into

Depending on the ASL sign, much of the hand and fingers can be hidden from the view of a single sensor. Multiple sensors placed at different angles around the hand would have given our neural network more input features and thus more accurate classifications (see the sketch below).
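
As an illustration, the fusion step for that multi-sensor setup could be as simple as concatenating each sensor's feature vector into one wider network input. This is a hypothetical sketch; we never built the multi-sensor rig.

```python
import numpy as np

def fuse_features(per_sensor_feats):
    """Concatenate one feature vector per sensor into a single wider input.

    An angle occluded for one sensor may be visible to another, so the
    combined vector carries more of the hand pose. (Hypothetical helper.)
    """
    return np.concatenate(per_sensor_feats)

# Two hypothetical sensors, each producing a 21-value vector as above:
front = np.zeros(21)
side = np.ones(21)
combined = fuse_features([front, side])
print(combined.shape)  # (42,) -> the network's input layer grows to match
```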

What we learned

Machine learning, how to use a Leap Motion, and more Git experience!
