🙌 Inspiration
Learning a new skill can often be, well... boring. How many times have you said to yourself, "I'd love to learn about so-and-so," only to find yourself playing on your phone while watching a tutorial on it? Humans learn best when learning is FUN! When you think about interactive and educational games, the tools that first come to mind are probably language learning platforms like Duolingo or Babbel. These tools offer a variety of fun games for learning multiple languages, but what about sign language? Sure, there are games and tools out there that teach American Sign Language (ASL), but they are typically very limited in user interaction and don't actually analyze your hands the way Duolingo analyzes your voice. That's what inspired Handle!
🕹️ What It Does
Handle puts a twist on the New York Times' hit puzzle game Wordle by using machine learning to detect your hand on webcam and recognize ASL alphabet signs as input! To play, simply place your right hand inside the green detection box displayed over the webcam feed and pose your desired letter. Upon recognition, the game displays the detected letter, and hitting SPACE sends it to the game board. Don't know how to sign a specific letter? Not a problem! Clicking any letter on the on-screen keyboard shows a demo image of the corresponding hand sign. From there you get 6 tries to find the secret 5-letter word :)
👷 How It Was Built
Handle is built in ReactJS and uses Google's MediaPipe Hands machine learning API. Using the API, we take in hand landmark coordinates and determine hand signs accordingly. Handle also draws the detected hand landmarks to a canvas that overlays the user's webcam display.
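As a rough sketch of that wiring (names like `setupHands` and the element arguments are illustrative, not Handle's actual code; it assumes the MediaPipe Hands and drawing-utils scripts are loaded via `<script>` tags, so `Hands`, `HAND_CONNECTIONS`, `drawConnectors`, and `drawLandmarks` are globals):

```javascript
// MediaPipe Hands returns landmarks normalized to [0, 1]; scale them to
// canvas pixel coordinates before drawing on the overlay.
function toCanvasPoints(landmarks, width, height) {
  return landmarks.map(({ x, y }) => ({ x: x * width, y: y * height }));
}

// Illustrative wiring only -- assumes the MediaPipe scripts are already loaded.
// In a React component, videoEl/canvasEl would typically come from refs.
function setupHands(videoEl, canvasEl) {
  const hands = new Hands({
    locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${file}`,
  });
  hands.setOptions({ maxNumHands: 1, minDetectionConfidence: 0.7 });

  hands.onResults((results) => {
    const ctx = canvasEl.getContext('2d');
    ctx.clearRect(0, 0, canvasEl.width, canvasEl.height);
    for (const landmarks of results.multiHandLandmarks ?? []) {
      // Draw the hand skeleton over the webcam feed.
      drawConnectors(ctx, landmarks, HAND_CONNECTIONS);
      drawLandmarks(ctx, landmarks);
    }
  });
  return hands;
}
```

Each video frame is then fed to `hands.send({ image: videoEl })`, and the overlay canvas is positioned on top of the video element with CSS.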
💀 Challenges Faced
The main challenge was integrating MediaPipe Hands into a React project. Most resources and similar projects we found used Python or Android, and the JavaScript API required adjustments to work with React. The next challenge was assigning hand signs based on landmark data, as certain signs have noticeable overlap and similarities, such as 'M' and 'N'.
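A minimal sketch of the kind of landmark geometry involved (the heuristics here are assumptions for illustration, not Handle's actual rules; indices follow MediaPipe's 21-landmark layout, with normalized coordinates where y grows downward):

```javascript
// MediaPipe landmark indices: 4 = thumb tip, 8/12/16/20 = index/middle/ring/
// pinky tips, 6/10/14/18 = the matching PIP joints, 9/13 = middle/ring bases.
const TIPS = [8, 12, 16, 20];
const PIPS = [6, 10, 14, 18];

// A finger reads as "folded" when its tip sits below its PIP joint
// (larger y in image coordinates).
function foldedFingerCount(landmarks) {
  return TIPS.filter((tip, i) => landmarks[tip].y > landmarks[PIPS[i]].y).length;
}

// Hypothetical discriminator for 'M' vs 'N' (both are fists with folded
// fingers): check how far the thumb tip reaches across the hand -- past the
// ring-finger base for 'M', only past the middle-finger base for 'N'.
// Assumes a right hand facing the camera, so "further across" means smaller x.
function guessMorN(landmarks) {
  const thumbTip = landmarks[4];
  if (thumbTip.x < landmarks[13].x) return 'M';
  if (thumbTip.x < landmarks[9].x) return 'N';
  return null;
}
```

Rules like these work for static poses but break down for signs that overlap heavily, which is part of why a trained model (see "What's Next") is the longer-term plan.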
🏆 Accomplishments That We're Proud Of
🥳WE DID IT!🥳 While this may not be our first time entering a hackathon or competition, it is the first time we succeeded in submitting a (relatively) finished product. We defined our standards and MVP from the start and met our goals and timeline! Doing so alongside conflicting school and internship responsibilities has been a huge personal achievement. It has also been a great learning experience in developing React and front-end coding skills, as well as in exploring and applying machine learning and APIs.
🎯 What's Next For HANDLE?
The current version of Handle is just the tip of the iceberg of what we envision. We hope to evolve Handle from a single Wordle-style game into an entire learning platform with a wide variety of sign language games and interactive lessons. The more immediate upgrades to the current version of Handle include the following:
- Apply motion detection for moving hand signs such as 'J' and 'Z'
- Integrate TensorFlow ML models for more accurate hand sign prediction and to eliminate the detection box
- Create a wider variety of recognizable hand signs for full words alongside letters
- Track user metrics for learning progress and goals!
Built With
- css
- html
- javascript
- mediapipe
- mediapipe-hands
- react