Inspiration

Duolingo is one of the most popular language-learning platforms, yet it lacks an option to learn sign language. This felt like a major gap—ASL (American Sign Language) is a crucial form of communication, and accessibility to learning it should be just as widespread as spoken languages. We saw an opportunity not only to build an AI-powered tool but also to promote inclusivity by making sign language education more interactive and engaging.

What it does

Signly enhances ASL learning by offering real-time feedback on hand signs, making practice sessions more interactive and effective. Here’s how it works:

  • Recognizes ASL letters (A-Y) instantly, helping users refine their signing accuracy. (J & Z are excluded due to their motion-based gestures.)
  • Captures hand gestures and extracts key hand landmarks using MediaPipe, ensuring precise recognition.
  • Provides instant feedback through a trained MLP Neural Network (MLPClassifier) that predicts the signed letter based on processed images.
  • Rewards users with XP points along a structured learning track, gamifying practice.
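The pipeline above hinges on turning MediaPipe's 21 hand landmarks into a feature vector the classifier can consume. Below is a minimal sketch of that preprocessing step; the function name and the wrist-relative normalization are illustrative assumptions, not necessarily Signly's exact scheme:

```python
import numpy as np

def landmarks_to_features(landmarks):
    """Turn 21 MediaPipe hand landmarks (x, y image coordinates) into a
    translation- and scale-invariant feature vector for the classifier.

    In MediaPipe's hand model, landmark index 0 is the wrist.
    """
    pts = np.asarray(landmarks, dtype=float).reshape(21, 2)
    pts = pts - pts[0]            # translate: the wrist becomes the origin
    scale = np.abs(pts).max()     # normalize for hand size / camera distance
    if scale > 0:
        pts = pts / scale
    return pts.flatten()          # 42-dim input vector for the MLP

# Example: a dummy set of landmarks lying on a diagonal line
features = landmarks_to_features([(i, i) for i in range(21)])
print(features.shape)  # (42,)
```

In the full pipeline, MediaPipe's hand-tracking solution would supply these landmarks from each captured frame before this step runs.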

How we built it

We started by capturing images of ASL hand gestures and extracting their landmarks with MediaPipe. From there, we trained an MLP neural network with Scikit-Learn, evaluating its accuracy and using Postman to test and refine the API. For the backend, we built a Flask API that handles incoming images, extracts landmarks, and returns predictions. On the frontend, we developed an iOS app with Flutter and Dart that captures hand images, sends them to the Flask backend, and displays the recognized letters in real time.

Challenges we ran into

One major challenge was differentiating between visually similar letters, such as O and C. We tackled this by expanding our dataset to include more diverse hand poses, improving the model’s ability to distinguish subtle differences. Another challenge was connecting the app to the backend, which led us to cloud-host our API for a more stable deployment.
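Cloud-hosting the Flask API gave the app a stable endpoint to call. A stripped-down sketch of such an endpoint follows; the `/predict` route name and JSON shape are our assumptions, and the real service decodes an image and runs the trained model rather than the placeholder below:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# The 24 static ASL letters: A-Y, skipping motion-based J (and Z).
LETTERS = list("ABCDEFGHIKLMNOPQRSTUVWXY")

@app.route("/predict", methods=["POST"])
def predict():
    # The real app posts an image and extracts landmarks server-side with
    # MediaPipe; this sketch accepts a pre-extracted 42-value vector.
    landmarks = request.get_json().get("landmarks", [])
    if len(landmarks) != 42:
        return jsonify(error="expected 42 landmark values"), 400
    # Placeholder: the trained MLPClassifier's predict() would run here.
    letter = LETTERS[int(sum(landmarks)) % len(LETTERS)]
    return jsonify(letter=letter)
```

The Flutter frontend POSTs each captured frame's data to an endpoint like this and renders the returned letter as live feedback.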

Accomplishments that we're proud of

  • Developed a live ASL recognition system that accurately identifies hand gestures using a robust pose dataset.
  • Deployed our computer vision model via Flask, making ASL learning more accessible and interactive.
  • Integrated our model with the app to provide live feedback and a rewards system.

What we learned

  • How to train machine learning models on highly varied image datasets.
  • How to debug Flask API and video-processing issues in a real-time application.
  • The importance of scope management when integrating ML models with frontend and backend development.

What's next for Signly

  • Expansion to Android support
  • Streaks, learning modules, and support for longer words/phrases
  • Multiplayer mode: challenges between friends with games to encourage collaboration in learning ASL

Built With

Dart, Flask, Flutter, MediaPipe, Python, Scikit-Learn
