Inspiration

Current sign language learning apps fall short: they teach signs, but they lack the personalized feedback loops that drive substantive improvement. HandInHand does both: it teaches sign language interactively, making the user an active part of the learning experience.

What it does

HandInHand is an interactive sign language learning app that teaches ASL through gamified lessons, using AI-powered motion analysis to provide real-time feedback on signing accuracy and technique.

How we built it

We built the app with React Native for cross-platform compatibility, integrated Google's Gemini API for motion analysis, used MediaPipe for hand tracking, and designed a progressive lesson system with video demonstrations and practice recordings.
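As a rough illustration of how hand-tracking output can feed the feedback loop, here is a minimal sketch (Python for brevity; the function names are hypothetical, and the landmark format is modeled on MediaPipe's 21-point hand landmarks, not our exact code):

```python
import math

def normalize_landmarks(landmarks):
    """Translate landmarks so the wrist (point 0) is the origin, then
    scale by the farthest point from the wrist, so poses are comparable
    across hand sizes and camera distances."""
    wx, wy = landmarks[0]
    centered = [(x - wx, y - wy) for x, y in landmarks]
    scale = max(math.hypot(x, y) for x, y in centered) or 1.0
    return [(x / scale, y / scale) for x, y in centered]

def pose_similarity(user, expert):
    """Mean per-landmark distance between two normalized poses,
    mapped to a 0-1 score (1.0 = identical)."""
    u, e = normalize_landmarks(user), normalize_landmarks(expert)
    dist = sum(math.hypot(ux - ex, uy - ey)
               for (ux, uy), (ex, ey) in zip(u, e)) / len(u)
    return max(0.0, 1.0 - dist)
```

Because normalization removes translation and scale, the same sign held closer to or farther from the camera scores identically.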

Challenges we ran into

We had to optimize API call timing for a smooth user experience, ensure accurate hand gesture recognition across different devices and lighting conditions, and design an intuitive UI that accommodates users with varying technical abilities.
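One common way to pace API calls from a live camera feed is a simple throttle that skips frames arriving faster than a minimum interval. This is an illustrative sketch, not our production code; the class name and interval are assumptions:

```python
import time

class Throttle:
    """Allow at most one call per `min_interval` seconds;
    calls arriving sooner are skipped."""

    def __init__(self, min_interval, clock=time.monotonic):
        self.min_interval = min_interval
        self.clock = clock
        self._last = None  # timestamp of the last allowed call

    def allow(self):
        now = self.clock()
        if self._last is None or now - self._last >= self.min_interval:
            self._last = now
            return True
        return False
```

Dropping in-between frames keeps the UI responsive: feedback stays near real time while the analysis backend sees a bounded request rate.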

Accomplishments that we're proud of

We successfully implemented real-time motion comparison between user recordings and expert ASL demonstrations, created an engaging gamified learning progression, and built a fully functional cross-platform app in a short timeframe.
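The motion comparison can be sketched as resampling both landmark sequences to a common length and averaging a per-frame similarity. This is a simplified stand-in (uniform resampling rather than any time-warping), with hypothetical names:

```python
def resample(seq, n):
    """Pick n evenly spaced frames from seq (nearest-frame sampling)."""
    step = (len(seq) - 1) / (n - 1) if n > 1 else 0
    return [seq[round(i * step)] for i in range(n)]

def motion_score(user_frames, expert_frames, frame_score, n=30):
    """Average a per-frame similarity over both sequences
    resampled to a common length."""
    u = resample(user_frames, min(n, len(user_frames)))
    e = resample(expert_frames, len(u))
    return sum(frame_score(a, b) for a, b in zip(u, e)) / len(u)
```

In practice the per-frame score would be a pose similarity over hand landmarks; uniform resampling tolerates users signing slightly faster or slower than the expert clip, though unlike dynamic time warping it does not handle uneven pacing within a sign.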

What we learned

We gained hands-on experience integrating AI into an educational application, learned about accessibility considerations in app design, and discovered the complexity of accurate gesture recognition and motion analysis.

What's next for HandInHand

Expanding the curriculum to include advanced ASL concepts, implementing social features for community learning, adding support for other sign languages, and partnering with deaf education organizations for content validation.

Built With

React Native, Google Gemini API, MediaPipe
