Inspiration

We noticed that people with hearing, speech, or cognitive challenges — and women in distress — often juggle multiple apps to communicate or get help. Sign-to-text apps exist, captioning apps exist, and safety apps exist, but nothing integrates all these features into a single, privacy-first platform. We wanted to create something inclusive, simple, and life-changing.

What it does

BridgeVoice is a mobile-first app that:

  • Translates sign ↔ speech ↔ visual cards in real time.
  • Offers autism- and elder-friendly modes with a calm UI, larger text, and structured visuals.
  • Includes a built-in women’s safety button for discreet emergency alerts with location sharing.
  • Works offline using on-device AI for privacy.

In short, it’s like Google Translate, but for accessibility and safety.
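The modalities above can all be routed through a single text representation. The sketch below shows that idea in minimal Python; every name (the converter functions, the routing tables) is an illustrative assumption, not the real app's API, and the converters are stubs standing in for the on-device models.

```python
# Hypothetical sketch: every input modality is first normalized to plain
# text, then rendered in the requested output modality.

# Stub converters standing in for the on-device models
# (sign recognition, speech-to-text, TTS, visual cards).
def sign_to_text(frames):      # would call a sign-recognition model
    return "hello"

def speech_to_text(audio):     # would call on-device speech recognition
    return "hello"

TO_TEXT = {"sign": sign_to_text, "speech": speech_to_text, "text": lambda t: t}

def render_speech(text):
    return f"<audio:{text}>"   # would call a TTS engine

def render_cards(text):
    return [f"[card:{w}]" for w in text.split()]  # one visual card per word

FROM_TEXT = {"speech": render_speech, "cards": render_cards, "text": lambda t: t}

def translate(payload, src: str, dst: str):
    """Route any supported input modality to any output modality via text."""
    text = TO_TEXT[src](payload)
    return FROM_TEXT[dst](text)

print(translate("hi", src="text", dst="cards"))  # → ['[card:hi]']
```

Routing everything through text keeps the number of converters linear in the number of modalities, instead of one converter per modality pair.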

How we built it

For Deckathon, we created a detailed pitch deck, app mockups, and an architecture plan for BridgeVoice. Our planned stack includes Flutter for the UI, TensorFlow Lite and MediaPipe for sign recognition, Google ML Kit for speech-to-text, and Firebase for user data. MVP development will follow this architecture.
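One concrete piece of the sign-recognition path can be sketched already: MediaPipe's hand model emits 21 landmarks per hand, and a common preprocessing step is to translate them relative to the wrist and rescale them so the downstream (TensorFlow Lite) classifier is invariant to hand position and size. The sketch below is a plain-Python illustration of that step under those assumptions; the function name is ours, not a library API.

```python
# Minimal preprocessing sketch for MediaPipe-style hand landmarks:
# 21 (x, y) points per hand, wrist first (MediaPipe's convention).

def normalize_landmarks(landmarks):
    """Normalize 21 (x, y) landmarks to be position- and size-invariant."""
    wx, wy = landmarks[0]
    # Translate so the wrist sits at the origin.
    rel = [(x - wx, y - wy) for x, y in landmarks]
    # Scale by the largest absolute coordinate so values land in [-1, 1].
    scale = max(max(abs(x), abs(y)) for x, y in rel) or 1.0
    return [(x / scale, y / scale) for x, y in rel]

# Flattened, the result is a fixed 42-value feature vector for the model.
points = [(0.5, 0.5)] + [(0.5 + 0.01 * i, 0.6) for i in range(20)]
features = [v for pt in normalize_landmarks(points) for v in pt]
print(len(features))  # → 42
```

A fixed-length, normalized feature vector like this is what keeps the classifier small enough for on-device inference.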

Challenges we ran into

  • Balancing feature ambition against the hackathon time limit: deciding which features to mock up and which to leave for future builds.
  • Designing a UI that is simultaneously friendly to deaf, autistic, and elderly users, and to women in distress, without overwhelming the screen.
  • Finding or training a sign-recognition model that is accurate yet lightweight enough to run offline.
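One standard lever for the accurate-yet-lightweight trade-off is post-training quantization: storing weights as 8-bit integers instead of 32-bit floats cuts model size roughly 4x. TensorFlow Lite handles this internally; the sketch below only illustrates the core idea in plain Python, with function names of our own choosing.

```python
# Illustrative per-tensor int8 quantization: the essence of what
# post-training quantization does to shrink a model for offline use.

def quantize(weights):
    """Map float weights to int8 range [-127, 127] plus a per-tensor scale."""
    scale = max(abs(w) for w in weights) / 127.0 or 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

w = [0.81, -0.43, 0.12, -0.97]
q, s = quantize(w)
restored = dequantize(q, s)
# Rounding keeps the round-trip error within half a quantization step.
print(max(abs(a - b) for a, b in zip(w, restored)) <= s / 2)  # → True
```

The accuracy cost of this step is what has to be measured against the size and latency gains when picking the on-device model.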

Accomplishments that we’re proud of

  • Conceptualized a multi-need accessibility app that no other product currently offers.
  • Created clear mockups and user flows showing real-world scenarios.
  • Designed a privacy-first architecture with offline AI.
  • Integrated a safety feature within an accessibility tool, a first-of-its-kind combination.

What we learned

  • Accessibility design requires deep empathy and user testing.
  • Simple visuals communicate better than text-heavy slides.
  • Combining multiple existing technologies (speech recognition, TTS, sign detection) can create new and powerful experiences.
  • Hackathons reward clarity and impact over complexity.

What’s next for BridgeVoice

  • Prototype the sign-recognition model and optimize it for mobile devices.
  • Build a community-driven sign library where users can upload new gestures and regional signs.
  • Integrate with wearables (smartwatches, AR glasses) for hands-free communication.
  • Partner with schools, clinics, and NGOs to pilot the app with real users.
  • Add micro calming tools (breathing guides, vibration cues) to help autistic or anxious users manage stress during interactions.
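The community-driven sign library could be as simple as entries keyed by gloss and region, so regional variants of the same sign can coexist with an international fallback. The class and field names below are assumptions for illustration, not a committed design.

```python
# Sketch of a sign library keyed by (gloss, region), with a fallback
# region so regional variants coexist with an international default.

class SignLibrary:
    def __init__(self):
        self._entries = {}  # (gloss, region) -> gesture template

    def add(self, gloss, region, template):
        """Store a user-contributed gesture template for a regional sign."""
        self._entries[(gloss, region)] = template

    def lookup(self, gloss, region, fallback="intl"):
        """Prefer the regional variant; fall back to the international one."""
        hit = self._entries.get((gloss, region))
        return hit if hit is not None else self._entries.get((gloss, fallback))

lib = SignLibrary()
lib.add("thank_you", "intl", [(0.1, 0.2)])
lib.add("thank_you", "asl", [(0.3, 0.4)])
print(lib.lookup("thank_you", "bsl"))  # → [(0.1, 0.2)]
```

Keying on region from the start avoids retrofitting regional signs later, which matters once users begin uploading their own gestures.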

Built With

  • firebase
  • flutter
  • tensorflow-lite