🧠 Inspiration

We realized that most SOS apps assume users can speak, hear, or use their hands, but that leaves out millions of people with disabilities. What if someone is non-verbal, blind, or paralyzed and needs help? SilentSOS was born to make emergency response accessible to all, regardless of ability. Safety should never depend on physical capability.


💡 What it does

SilentSOS is an inclusive emergency response mobile app built for people with disabilities. It lets users silently send SOS signals to trusted contacts or authorities through voice commands, screen reader support, gestures, eye movement, or single-tap triggers. The app shares real-time location, health details, and emergency type without requiring speech or text input.

Key features:

Emergency trigger via gesture, voice, or a single button (a minimal trigger sketch follows this list)

Auto-send SOS with live location and predefined messages

Screen reader-friendly UI and voice output

Discreet mode with fake interface for unsafe environments

Support for hearing, speech, visual, and mobility impairments
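
To make the single-tap path concrete, here is a minimal sketch of the trigger button: it requests location, attaches the coordinates to a predefined message, and exposes screen reader labels so VoiceOver/TalkBack users can find it. The `sendSOS` helper and the backend URL are illustrative placeholders, not the production code.

```tsx
// Minimal single-tap SOS trigger with screen reader support.
import React, { useState } from 'react';
import { Pressable, Text, StyleSheet } from 'react-native';
import * as Location from 'expo-location';

// Hypothetical helper: grab the current position (if permitted) and post the alert.
async function sendSOS(): Promise<void> {
  const { status } = await Location.requestForegroundPermissionsAsync();
  const position =
    status === 'granted' ? await Location.getCurrentPositionAsync({}) : null;

  // Illustrative endpoint; replace with the real API route.
  await fetch('https://silentsos-api.example.com/sos', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      type: 'general',
      message: 'I need help. This is an automated SOS.',
      location: position
        ? { lat: position.coords.latitude, lng: position.coords.longitude }
        : null,
      sentAt: new Date().toISOString(),
    }),
  });
}

export function SOSButton() {
  const [sent, setSent] = useState(false);

  return (
    <Pressable
      accessibilityRole="button"
      accessibilityLabel="Send emergency SOS"
      accessibilityHint="Double tap to silently alert your trusted contacts"
      onPress={() => sendSOS().then(() => setSent(true))}
      style={styles.button}
    >
      <Text style={styles.label}>{sent ? 'SOS sent' : 'SOS'}</Text>
    </Pressable>
  );
}

const styles = StyleSheet.create({
  button: { padding: 32, borderRadius: 16, backgroundColor: '#c0392b' },
  label: { color: '#fff', fontSize: 28, textAlign: 'center' },
});
```

The oversized hit target and the explicit accessibility hint are deliberate: the same control works for switch access, screen readers, and one-handed use.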


πŸ—οΈ How I built it

We used React Native (Expo) for cross-platform mobile development with a strong focus on accessibility. Here's the stack, with a few illustrative sketches after the list:

Frontend: React Native + TypeScript, with accessibility APIs (VoiceOver, TalkBack)

Backend: Node.js + Express with PostgreSQL (hosted on Render)

Real-time: Socket.IO for live emergency updates

Communication: Twilio API for emergency SMS/calls

Voice tech: Google Cloud Text-to-Speech and Speech-to-Text

Authentication: Clerk for secure login

Hosting: Render (backend), Expo for mobile build and OTA updates
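
As a rough sketch of how the backend pieces fit together (assuming environment variables for the Twilio credentials and a simplified `/sos` route, with contact lookup and auth omitted), an incoming alert is broadcast over Socket.IO and fanned out as SMS via Twilio:

```ts
// Simplified alert path: Express route -> Socket.IO broadcast -> Twilio SMS.
import express from 'express';
import http from 'http';
import { Server } from 'socket.io';
import twilio from 'twilio';

const app = express();
app.use(express.json());

const server = http.createServer(app);
const io = new Server(server, { cors: { origin: '*' } });

const smsClient = twilio(
  process.env.TWILIO_ACCOUNT_SID,
  process.env.TWILIO_AUTH_TOKEN
);

app.post('/sos', async (req, res) => {
  const { message, location, contacts } = req.body;

  // Push the alert to any dashboard or contact app connected over Socket.IO.
  io.emit('sos', { message, location, receivedAt: Date.now() });

  // Fan the same alert out over SMS with a live-location link.
  const mapsLink = location
    ? `https://maps.google.com/?q=${location.lat},${location.lng}`
    : 'location unavailable';

  await Promise.all(
    (contacts ?? []).map((to: string) =>
      smsClient.messages.create({
        from: process.env.TWILIO_FROM_NUMBER,
        to,
        body: `${message} Live location: ${mapsLink}`,
      })
    )
  );

  res.status(202).json({ status: 'dispatched' });
});

server.listen(Number(process.env.PORT ?? 3000));
```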

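For the voice-output side, a small server-side helper along these lines turns a confirmation string into audio with Google Cloud Text-to-Speech. The `speak` helper name and surrounding wiring are illustrative, and it assumes the usual `GOOGLE_APPLICATION_CREDENTIALS` setup:

```ts
// Sketch: synthesize spoken confirmations for users who rely on audio feedback.
import * as textToSpeech from '@google-cloud/text-to-speech';

const ttsClient = new textToSpeech.TextToSpeechClient();

export async function speak(text: string): Promise<Uint8Array> {
  const [response] = await ttsClient.synthesizeSpeech({
    input: { text },
    voice: { languageCode: 'en-US', ssmlGender: 'NEUTRAL' },
    audioConfig: { audioEncoding: 'MP3' },
  });
  // The audio bytes can be streamed back to the app and played on-device.
  return response.audioContent as Uint8Array;
}

// Example: confirm to a blind user that the alert went out.
// const audio = await speak('Your SOS was sent to your trusted contacts.');
```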

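On the Expo side, the Clerk wiring is roughly this shape; the screen contents and the environment-variable name are illustrative, and `SOSButton` refers to the trigger sketch above:

```tsx
// Sketch: only signed-in users reach the SOS screen and their saved contacts.
import React from 'react';
import { Text } from 'react-native';
import { ClerkProvider, SignedIn, SignedOut } from '@clerk/clerk-expo';
import { SOSButton } from './SOSButton'; // hypothetical module from the trigger sketch

export default function App() {
  return (
    <ClerkProvider publishableKey={process.env.EXPO_PUBLIC_CLERK_PUBLISHABLE_KEY!}>
      <SignedIn>
        <SOSButton />
      </SignedIn>
      <SignedOut>
        <Text accessibilityRole="text">Sign in to set up your trusted contacts.</Text>
      </SignedOut>
    </ClerkProvider>
  );
}
```
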
🚧 Challenges we ran into

Designing a UI that’s equally usable for people with visual impairments and those with limited mobility

Integrating gesture recognition and screen reader compatibility in a single app

Ensuring the trigger fires near-instantly while using minimal device resources

Creating a discreet SOS method that looks like another app to avoid detection in unsafe situations

Balancing offline fallback support with cloud sync during emergencies (see the offline queue sketch after this list)
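
A simplified version of that offline fallback looks like this: alerts that can't be sent are queued on-device and flushed when connectivity returns. The queue key, payload shape, and endpoint are illustrative.

```ts
// Sketch: store unsent SOS payloads locally, retry when the network is back.
import AsyncStorage from '@react-native-async-storage/async-storage';
import NetInfo from '@react-native-community/netinfo';

const QUEUE_KEY = 'sos-outbox';

type SOSPayload = {
  message: string;
  location: { lat: number; lng: number } | null;
  sentAt: string;
};

// Hypothetical endpoint; mirrors the trigger sketch earlier in the write-up.
async function postSOS(payload: SOSPayload): Promise<void> {
  const res = await fetch('https://silentsos-api.example.com/sos', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`SOS post failed: ${res.status}`);
}

async function enqueue(payload: SOSPayload): Promise<void> {
  const raw = await AsyncStorage.getItem(QUEUE_KEY);
  const queue: SOSPayload[] = raw ? JSON.parse(raw) : [];
  queue.push(payload);
  await AsyncStorage.setItem(QUEUE_KEY, JSON.stringify(queue));
}

async function flush(): Promise<void> {
  const raw = await AsyncStorage.getItem(QUEUE_KEY);
  if (!raw) return;
  const queue: SOSPayload[] = JSON.parse(raw);
  for (const payload of queue) await postSOS(payload);
  await AsyncStorage.removeItem(QUEUE_KEY);
}

// Send immediately when possible, otherwise keep the alert on-device.
export async function sendOrQueue(payload: SOSPayload): Promise<void> {
  try {
    await postSOS(payload);
  } catch {
    await enqueue(payload);
  }
}

// Drain the queue automatically whenever connectivity comes back.
NetInfo.addEventListener((state) => {
  if (state.isConnected) flush().catch(() => {});
});
```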


πŸ… Accomplishments that I'm proud of

Built a fully functional, disability-inclusive emergency app in limited time

Designed for real accessibility, not just labels and colors

Enabled SOS with zero verbal or written interaction

Created a silent trigger system for high-risk situations

Got great feedback from disability-focused forums and early testers


📚 What we learned

Deep understanding of WCAG and accessibility-first design

How to integrate assistive technologies like screen readers and voice interfaces in mobile apps

Building real-time, secure, and lightweight emergency communication systems

How small UI/UX decisions can massively impact people with disabilities


🚀 What’s next for SilentSOS

Integrate an AI chat assistant to guide users through panic attacks or emergencies

Add support for eye-tracking / head gestures on advanced devices

Work with NGOs and accessibility experts to refine the UX

Launch a community safe-zone feature that shows nearby police stations or volunteers
