Inspiration

Two of our team members have hard-of-hearing and deaf grandparents. Plenty of people are passionate about learning ASL, whether to bond with family and friends or simply out of curiosity about the language. The ASL Translator Hologram makes learning ASL fun and intuitive, making it easier for us to understand our family members.

What it does

The ASL Translator Hologram is broken into two parts: an image recognition model that identifies American Sign Language hand gestures, and a 2D hologram that displays the model's live translations. Paired together, the two components let ASL learners practice much faster without needing two speaking/hearing partners.

How we built it

The AI model was built on the foundation of a similar project from years ago. It is a convolutional neural network trained on thousands of photos of different ASL hand gestures. The hologram was built from an antique motor (what we had available) and a hand-soldered circuit board with 48 LEDs in total. The two sides communicate over Wi-Fi via an ESP8266.
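For readers curious what the recognition side looks like, here is a minimal sketch of the kind of small CNN classifier described above, written with Keras. The input size (64x64 grayscale) and layer widths are illustrative assumptions, not our exact architecture.

```python
# Hedged sketch of a small CNN for classifying ASL letter gestures.
# Input shape, layer sizes, and the 26-class output are assumptions
# for illustration, not the exact trained model.
from tensorflow import keras
from tensorflow.keras import layers

NUM_CLASSES = 26  # one class per ASL letter (assumption)

def build_model() -> keras.Model:
    return keras.Sequential([
        layers.Input(shape=(64, 64, 1)),          # 64x64 grayscale frame
        layers.Conv2D(32, 3, activation="relu"),  # learn local edge/shape features
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),  # per-letter probabilities
    ])

model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

In practice the network would be trained with `model.fit` on the labeled gesture photos, and the highest-probability class at inference time becomes the letter sent to the hologram.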

Challenges we ran into

The three biggest challenges were that the hologram technique we are using is a bit dangerous, demands very fast computation, and the model really needed to be trained on millions (not thousands) of photos. The motor must spin at least 3600 RPM with two blades; if the blades spanned 2 meters, the tips would be traveling faster than Mach 1. The second issue was that refreshing a display at 3600 Hz is hard on its own, and we also didn't have microcontrollers with enough pins for this project, so we had to multiplex the LEDs, which tightened the computation budget even further. We were not able to solve the last problem within this hackathon, but building a working ASL recognition model in three days was still an impressive feat for our group.
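The tip-speed claim above checks out with a little arithmetic, sketched here (Mach 1 is taken as roughly 343 m/s at sea level):

```python
import math

rpm = 3600
diameter_m = 2.0                 # hypothetical 2 m blade span
rev_per_s = rpm / 60             # 3600 RPM = 60 revolutions per second
tip_speed = rev_per_s * math.pi * diameter_m  # circumference traveled per second
mach_1 = 343.0                   # approximate speed of sound at sea level, m/s

print(f"tip speed ≈ {tip_speed:.0f} m/s")  # ≈ 377 m/s, supersonic
print(tip_speed > mach_1)                  # True
```

This is exactly why we kept the actual motor and blades much smaller than 2 meters.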

Accomplishments that we're proud of

We are very proud of how well the ASL hand gesture recognition worked. Additionally, keeping the motor small to prevent accidental injuries (and explosions) let us make some beautiful symmetric displays with the LEDs before we finished the code.

What we learned

The main lessons were about the model and the hologram. On the model side, we learned how much a large image dataset and ample processing power matter. Connecting the hologram to the model was a big hassle, so we decided to host a web server locally on the ESP8266 to allow seamless communication between the computer and the microcontroller. Another big limitation of the AI model we used was its reliance on edge detection, which can confuse the model when differentiating between hand gestures that look very similar. Lastly, how fast the motor needs to spin is a major limit on the hologram's ability to display good images.
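On the computer side, pushing each recognized letter to the ESP8266's local web server can be as simple as an HTTP GET. This is a hedged sketch only: the host address, endpoint path, and parameter name (`/display`, `char`) are hypothetical placeholders, not our exact firmware API.

```python
# Hedged sketch: desktop side sending one recognized ASL letter to the
# ESP8266's locally hosted web server. Host, path, and query parameter
# are illustrative assumptions.
from urllib.parse import urlencode
from urllib.request import urlopen

ESP8266_HOST = "http://192.168.4.1"  # typical ESP8266 soft-AP address (assumption)

def build_update_url(letter: str) -> str:
    """Encode one recognized letter as a query string for the hologram."""
    return f"{ESP8266_HOST}/display?{urlencode({'char': letter})}"

def send_letter(letter: str) -> None:
    # Fire-and-forget GET; the microcontroller parses the query string
    # and updates the LED frame buffer for the spinning display.
    with urlopen(build_update_url(letter), timeout=0.5) as resp:
        resp.read()
```

Hosting the server on the microcontroller means the computer only needs the ESP8266's IP address, with no custom serial protocol to maintain.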

What's next for ASL Translator Hologram

The next step is shrinking the motor (we could buy a motor from this century), using a much larger dataset based on hand tracking instead of edge detection (preferably one we make ourselves), using a much faster processor, and building a higher-resolution hologram, possibly with a transparent display (much safer). All of these could reduce the total price of the product while simultaneously increasing its effectiveness.

Tracks: Educational and Peratons
