🤟 HandsIn: Building ASL Understanding, Not Just Translation

Inspiration

In 2019, my mom was diagnosed with mild hearing loss. Within a year, her hearing had declined rapidly. Everyday tasks — grocery shopping, doctor visits, even dinners out — became two-person jobs, as she increasingly relied on someone else to interpret for her.

Rather than stay silent, she became an advocate. By sharing her story, she found community and purpose, connecting with others navigating hearing loss. Through her, I became more involved with the deaf and hard of hearing community — and I started listening more.

As I spoke with more people whose native language is American Sign Language (ASL), I noticed a troubling pattern: most technology and training tools focus on translating ASL into English, not on understanding or using ASL as a language in its own right.

This isn’t just an academic issue. It has real-life consequences:

  • 🏥 Emergency providers often can't communicate with deaf patients — even for basic needs.
  • 🧓 Senior care facilities isolate ASL-native residents by prioritizing spoken English.
  • 🎓 Classrooms sometimes marginalize deaf children by favoring English-based instruction, minimizing their native language and identity.

What It Does

HandsIn is an AI-powered ASL training platform that helps healthcare providers, educators, and first responders learn foundational ASL through interactive practice.

We built an in-browser machine learning model using TensorFlow.js that can classify ASL letters from hand gestures. By making the training flexible, responsive, and culturally mindful, HandsIn prepares users for real-world scenarios where communication equity is critical.
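Classifying letters from hand gestures starts with making the landmark input invariant to where the hand sits in the frame and how large it appears. As a minimal sketch (the function names and the {x, y} landmark shape here are illustrative assumptions, not HandsIn's actual code), a preprocessing step might look like:

```javascript
// Normalize MediaPipe-style hand landmarks so classification is invariant
// to hand position and size. Each landmark is an {x, y} pair in image
// coordinates; MediaPipe indexes the wrist as landmark 0.
function normalizeLandmarks(landmarks) {
  const wrist = landmarks[0];
  // Translate so the wrist becomes the origin.
  const centered = landmarks.map(p => ({ x: p.x - wrist.x, y: p.y - wrist.y }));
  // Scale by the largest distance from the wrist so hand size cancels out.
  const scale = Math.max(...centered.map(p => Math.hypot(p.x, p.y))) || 1;
  return centered.map(p => ({ x: p.x / scale, y: p.y / scale }));
}

// Flatten to the numeric vector a TensorFlow.js dense model would consume.
function toFeatureVector(landmarks) {
  return normalizeLandmarks(landmarks).flatMap(p => [p.x, p.y]);
}
```

Normalizing like this means the model sees the same feature vector whether the signer's hand is near or far from the camera, which matters for training on heterogeneous webcam data.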

How We Built It

After our first hackathon, where we created a beta version of HandsIn, we knew we wanted to go bigger and better this time. Here's what we did:

  • Rebuilt the entire platform using React.js, designing clean user interfaces for sign-up, login, and dashboards.
  • Developed a hand sign classification model trained on hand landmark data using TensorFlow.js, allowing real-time classification directly in the browser.
  • Designed a structured lesson flow, with phased lessons that teach, test, and reinforce signs interactively.
  • Built custom APIs to integrate the model with our React front end for seamless user interaction.
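On the front-end side, the last step of a pipeline like the one above is turning the model's raw output scores into a letter the UI can display. A minimal sketch of that decoding step (the label set and function names are illustrative assumptions, not HandsIn's actual API):

```javascript
// Illustrative label set; the real app's labels may differ.
const LABELS = ['A', 'B', 'C', 'D', 'E'];

// Convert raw scores (logits) into probabilities.
function softmax(scores) {
  const max = Math.max(...scores); // subtract the max for numerical stability
  const exps = scores.map(s => Math.exp(s - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / sum);
}

// Pick the most likely letter and report the model's confidence in it,
// so the UI can decide whether to accept or ignore a prediction.
function decodePrediction(scores) {
  const probs = softmax(scores);
  const best = probs.indexOf(Math.max(...probs));
  return { letter: LABELS[best], confidence: probs[best] };
}
```

Returning a confidence alongside the letter lets the lesson flow hold off on marking a sign correct until the model is reasonably sure.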

Accomplishments

  • Fully redesigned the platform from scratch, including a polished user dashboard and authentication system.
  • Implemented a real-time, browser-based ASL alphabet classifier using TensorFlow.js.
  • Created a structured learning flow to guide users through lessons and reinforce understanding.
  • Improved accessibility and design to better reflect the needs of the community we serve.

Challenges We Faced

  • Getting accurate hand sign classification from video data in real time was a major challenge. We spent significant time tweaking model parameters, improving dataset quality, and optimizing for browser performance.
  • Integrating MediaPipe for hand landmarks and making it communicate smoothly with TensorFlow.js required creative problem-solving.
  • Building secure and smooth user authentication from scratch using Firebase took multiple iterations to get right.
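One common way to tame noisy real-time classification is to smooth predictions over a sliding window, so a single misclassified frame doesn't flicker the displayed letter. This is a general technique sketched here under assumed names, not necessarily what HandsIn ships:

```javascript
// Majority-vote smoother over the last N per-frame predictions.
class PredictionSmoother {
  constructor(windowSize = 10) {
    this.windowSize = windowSize;
    this.recent = [];
  }

  // Record one frame's predicted letter and return the current majority.
  add(letter) {
    this.recent.push(letter);
    if (this.recent.length > this.windowSize) this.recent.shift();
    const counts = {};
    for (const l of this.recent) counts[l] = (counts[l] || 0) + 1;
    // Return the most frequent letter in the window.
    return Object.keys(counts).reduce((a, b) => (counts[a] >= counts[b] ? a : b));
  }
}
```

The window size trades responsiveness for stability: a larger window suppresses more flicker but makes the displayed letter lag behind fast signing.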

What’s Next

We’re just getting started. Here’s what’s on our roadmap:

  • Building an interactive avatar that can sign back to the user, reinforcing two-way communication.
  • Expanding our lesson modules to cover common phrases, expressions, and healthcare-specific vocabulary.
  • Partnering with hospitals, nursing schools, and emergency responder training programs to pilot HandsIn in real-world settings.

What We Learned

  • Empathy is the foundation of inclusive design. Building for communities you’re not a part of means listening, learning, and iterating constantly.
  • Real-time AI in the browser is powerful — and possible. TensorFlow.js enabled us to ship ML features without server-side inference.
  • Accessibility isn’t a feature — it’s a necessity. From UX to back-end decisions, we learned that truly inclusive tools must be deeply intentional.

HandsIn is more than a tool — it’s a step toward communication equity. We hope to help providers, responders, and educators not just translate ASL, but truly understand and value it.

Let’s build a world where everyone is part of the conversation.

Built With

Firebase · MediaPipe · React.js · TensorFlow.js
