Inspiration
In 2024, news stories highlighted tragic incidents where deaf and hard-of-hearing individuals were unable to evacuate during emergencies because they could not hear fire alarms, evacuation orders, or sirens.
One case involved an apartment fire where residents who could not hear were left unaware until it was too late.
Traditional emergency systems rely heavily on sound-based alerts, creating a dangerous communication gap for millions of people worldwide.
We were inspired to create HearMeOut — an app that empowers the deaf community with real-time emergency awareness, using their smartphones as smart sensors.
What it does
HearMeOut listens for emergency-related sounds using the device's microphone.
When it detects critical keywords such as "fire," "earthquake," or "evacuate" through live speech-to-text AI, it immediately vibrates the phone, flashes the flashlight, and displays a clear on-screen alert.
This ensures that deaf and hard-of-hearing users are informed instantly, even without hearing alarms.
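The keyword-spotting step can be sketched in TypeScript. This is an illustrative reconstruction, not the app's actual source: the keyword list and the `detectEmergencyKeyword` function name are assumptions.

```typescript
// Illustrative keyword list; the real app may use a different set.
const EMERGENCY_KEYWORDS = ["fire", "earthquake", "evacuate", "evacuation", "emergency"];

/** Returns the first emergency keyword found in a transcript chunk, or null. */
function detectEmergencyKeyword(transcript: string): string | null {
  const normalized = transcript.toLowerCase();
  for (const keyword of EMERGENCY_KEYWORDS) {
    // Word-boundary match so "fired" or "wildfire" don't trigger, but "Fire!" does.
    if (new RegExp(`\\b${keyword}\\b`).test(normalized)) {
      return keyword;
    }
  }
  return null;
}
```

Matching on whole words rather than substrings keeps everyday speech ("he got fired") from setting off an alert.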
How we built it
We used React Native for cross-platform mobile development, Supabase to securely log emergency events, and Google Cloud Speech-to-Text API for real-time transcription.
The app is designed with accessibility and speed in mind, triggering alerts within seconds of hearing a critical announcement.
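One way the alert path could be wired, sketched with injected effect functions so the platform calls (React Native's `Vibration`, `react-native-torch`) stay swappable and testable. All names below are assumptions for illustration, not the app's actual code.

```typescript
// Effects are injected so platform APIs can be mocked in tests.
interface AlertEffects {
  vibrate: (pattern: number[]) => void;   // e.g. wraps React Native's Vibration.vibrate
  setTorch: (on: boolean) => void;        // e.g. wraps react-native-torch
  showBanner: (message: string) => void;  // on-screen alert UI
}

/** Fires all three alert channels at once when a keyword is detected. */
function triggerEmergencyAlert(keyword: string, effects: AlertEffects): void {
  // Strong, repeating haptic pattern so the alert is felt immediately.
  effects.vibrate([0, 500, 200, 500, 200, 500]);
  effects.setTorch(true); // caller turns the torch off once the user acknowledges
  effects.showBanner(`EMERGENCY DETECTED: "${keyword.toUpperCase()}"`);
}
```

Keeping the trigger logic free of direct platform imports is one way to unit-test the alert path without a device.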
Challenges we ran into
Balancing real-time performance with battery efficiency was challenging.
We had to optimize microphone recording without overwhelming device resources.
Handling variations in speech quality, background noise, and platform-specific flashlight permissions (especially on iOS) also presented technical hurdles.
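One battery-friendly pattern for this kind of problem is a per-keyword cooldown, so a repeated announcement doesn't re-fire the torch and vibration loop every few seconds. This is a hypothetical sketch of that idea, not the app's actual implementation.

```typescript
// Suppresses duplicate alerts for the same keyword within a cooldown window.
class AlertThrottle {
  private lastFired = new Map<string, number>();

  constructor(private cooldownMs: number) {}

  /** Returns true if an alert for this keyword should fire at time `now` (ms). */
  shouldFire(keyword: string, now: number): boolean {
    const last = this.lastFired.get(keyword);
    if (last !== undefined && now - last < this.cooldownMs) {
      return false; // still inside the cooldown window
    }
    this.lastFired.set(keyword, now);
    return true;
  }
}
```

Passing the timestamp in explicitly (rather than calling `Date.now()` inside) keeps the throttle deterministic and easy to test.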
What we learned
Building for accessibility requires real-world testing and user empathy.
We gained deep insights into mobile audio handling, speech processing, and cross-platform device control.
We also learned how to make AI solutions lightweight enough to run reliably on everyday smartphones.
What's next
Future updates could integrate location-based disaster alerts, wearable device support (like smartwatches for better haptic feedback), and smarter AI models to filter false alarms.
We also envision collaborating with official emergency systems to offer verified, real-time alerts city-wide.
Built With
- api
- expo.io
- react-native
- react-native-torch
- speech-to-text
- supabase
- typescript
- vibration-api