Inspiration

In many real emergencies, such as accidents, sudden falls, or medical distress, the person involved may not be able to unlock their phone or press an SOS button. Most safety apps still depend on manual action, which fails in the very moments when help is needed most. SilentSOS was inspired by this gap: I wanted to explore whether a system could detect emergencies automatically, using only the sensors already available on a user's device, without requiring any interaction.

What it does

SilentSOS is a real-time safety web application that continuously monitors live device sensor data to identify potential emergencies. It uses motion patterns to detect sudden impacts or prolonged inactivity, audio amplitude to identify distress sounds or extended silence, and live location data to share accurate positioning. When multiple signals indicate danger, SilentSOS automatically triggers an SOS, alerts trusted contacts, and displays the event on a live dashboard, all without requiring the user to press a button.
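The multi-signal trigger described above could be sketched as a small fusion function. The thresholds, field names, and two-signal rule here are illustrative assumptions, not the project's actual detection values:

```javascript
// Hypothetical multi-signal fusion: trigger an SOS only when at least two
// independent signals indicate danger, so a single noisy sensor cannot
// fire a false alarm on its own. All thresholds are illustrative.
function shouldTriggerSOS({ impactDetected, inactivitySeconds, silenceSeconds }) {
  const signals = [
    impactDetected,            // sudden spike in motion magnitude
    inactivitySeconds > 120,   // prolonged stillness after movement
    silenceSeconds > 180,      // extended silence on the microphone
  ];
  const active = signals.filter(Boolean).length;
  return active >= 2; // require corroboration across sensors
}
```

Requiring corroboration across sensors is one plausible way to balance sensitivity against false positives; the real system's rules may differ.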

How I built it

The project was built as a mobile-first, full-stack web application. On the frontend, I used browser APIs such as DeviceMotion, Web Audio, and Geolocation to collect real sensor data in real time. These signals are processed continuously to extract meaningful features such as motion magnitude, variance, silence duration, and inactivity time. The backend is powered by Node.js, Express, and Firebase, which handle authentication, real-time data storage, and alert propagation. Emergency events are streamed live to dashboards using Firestore listeners, and trusted contacts are notified automatically. A real-time map view was added using Leaflet to visualize active SOS locations clearly and transparently.
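The feature-extraction step mentioned above, computing motion magnitude and variance, might look like the following. This is a minimal sketch that assumes accelerometer samples have already been collected from DeviceMotion events into plain `{x, y, z}` objects; the function name is hypothetical:

```javascript
// Sketch: derive motion features from a window of accelerometer samples
// (e.g. collected from DeviceMotionEvent.accelerationIncludingGravity).
// `samples` is an array of { x, y, z } readings in m/s^2.
function motionFeatures(samples) {
  const magnitudes = samples.map(({ x, y, z }) =>
    Math.sqrt(x * x + y * y + z * z)
  );
  const mean = magnitudes.reduce((a, b) => a + b, 0) / magnitudes.length;
  const variance =
    magnitudes.reduce((a, m) => a + (m - mean) ** 2, 0) / magnitudes.length;
  // High peak suggests a sudden impact; near-zero variance suggests inactivity.
  return { mean, variance, peak: Math.max(...magnitudes) };
}
```

A spike in `peak` can feed impact detection, while a long run of low `variance` can feed the inactivity timer.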

Challenges I ran into

One of the biggest challenges was working within browser limitations. Not all sensors are available on all devices, and permissions must be handled carefully to avoid misleading data. Another challenge was ensuring that the system never displays fake or placeholder values. If real sensor data is unavailable, the app explicitly communicates this instead of simulating numbers. Designing Firestore queries that supported real-time updates also required careful indexing and backend tuning.
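The "no fake values" rule above can be enforced at the display layer. A minimal sketch, with a hypothetical helper name, of surfacing an explicit unavailable state instead of a placeholder number:

```javascript
// Sketch of the "never display fake values" principle: a reading is shown
// only if real sensor data exists; otherwise the UI states that plainly.
function formatReading(value, unit) {
  if (value === null || value === undefined || Number.isNaN(value)) {
    return "Sensor unavailable"; // never simulate a number
  }
  return `${value.toFixed(2)} ${unit}`;
}
```

Keeping this check in one place makes it hard for any dashboard widget to accidentally render a default or simulated value when a permission is denied or a sensor is missing.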

Accomplishments that I'm proud of

- Built a fully functional automatic SOS system without relying on manual triggers
- Used only real device sensor data, with no simulations
- Designed transparent, explainable detection logic suitable for real-world use
- Created a live emergency dashboard with real-time map visualization

Most importantly, the system prioritizes honesty and reliability over exaggerated claims.

What I learned

This project taught me how challenging real-world sensor-based systems can be. I learned the importance of transparency in safety applications, careful permission handling, and designing around platform constraints instead of hiding them. I also gained hands-on experience with real-time systems, Firestore indexing, and building applications that must behave responsibly under uncertainty.

What's next for SilentSOS: AI-Powered Automatic Emergency Detection

Next, I plan to extend SilentSOS into a dedicated mobile application to allow deeper sensor access, background monitoring, and improved reliability. I also aim to refine detection models with on-device learning, add configurable safe zones, and explore integrations with local emergency services for faster response.
