Inspiration

The idea for Guardian hit close to home for both of us. We’ve both seen our grandmothers deal with serious falls down the stairs, and the scariest part wasn't just the injury; it was the helplessness. When an elderly person falls, they often can't reach their phone, let alone unlock it and dial 911. We wanted to build something that bridges that gap, turning the devices people already wear and carry into a safety net that works even when they can't.

What it does

Guardian is a cross-platform safety ecosystem designed for "passive" fall awareness. It's not just an alarm; it's a monitor that knows when something is wrong. The iOS app and the paired Apple Watch app work in tandem, both using high-frequency motion sensors to track impact and stillness. If you fall, the Watch delivers a sharp haptic vibration to get your attention immediately. Instead of just blaring an alarm, the phone talks to you: using ElevenLabs and Apple TTS, it asks if you're okay. You can simply say "I'm fine" or "Help me", and we use Scribe STT to process that speech, so you never have to touch a screen. If the user confirms they need help (or doesn't respond before the timer runs out), Guardian triggers an emergency flow. The Watch can call services directly via cellular; if there's no signal, it hands the request off to the iPhone, which sends out the user's location and distress message. We also built a gait analysis tool that scores how someone walks, helping families spot a high fall risk before an accident occurs.
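The voice confirmation flow above can be sketched as a small decision step. This is a minimal illustration, not our actual implementation: the keyword matching and the "no answer escalates" rule are simplifying assumptions, and the real app works from the full STT transcript.

```swift
import Foundation

// Hedged sketch of the post-fall confirmation step: classify the user's
// transcribed reply, then decide whether to escalate. Keyword choices
// are illustrative assumptions.
enum FallResponse { case okay, needsHelp, noAnswer }

struct FallConfirmation {
    // Map a transcribed utterance (from STT) to an outcome.
    func classify(_ transcript: String?) -> FallResponse {
        guard let text = transcript?.lowercased(), !text.isEmpty else { return .noAnswer }
        if text.contains("help") { return .needsHelp }
        if text.contains("fine") || text.contains("okay") { return .okay }
        return .noAnswer
    }

    // Silence is treated the same as a call for help: only an explicit
    // "I'm okay" cancels the emergency flow.
    func shouldEscalate(_ response: FallResponse) -> Bool {
        response != .okay
    }
}
```

In the app, `classify` would be fed by the speech-to-text result, and `shouldEscalate` gates the handoff to the Watch/iPhone emergency call.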

How we built it

We went all-in on the Apple ecosystem using Swift, UIKit, and WatchKit. We used CoreMotion for the heavy lifting on sensor data and K2 Think V2 for reasoning, essentially making sure the app can tell the difference between a dropped phone and a genuine fall. We used WatchConnectivity to keep the phone and watch in constant communication, ensuring that if one device detects a fall, the other is ready to act as a backup for the 911 call. We used ActivityKit to build Live Activities for the Lock Screen and ElevenLabs for natural-sounding voice AI. Event data from both the Apple Watch and the iPhone is persisted to Firebase (Firestore), so resolved events are saved for caregivers to review later.
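The impact-and-stillness detection described above can be sketched as a simple two-stage heuristic over accelerometer magnitudes. The 2.5 g impact threshold and 0.15 g stillness band here are illustrative assumptions, not our tuned values, and the real pipeline does more reasoning on top:

```swift
import Foundation

// Sketch of an impact-then-stillness heuristic fed by accelerometer
// samples (magnitudes in g, where ~1.0 g means the device is at rest).
struct FallHeuristic {
    private(set) var impactSeen = false
    let impactThreshold = 2.5   // assumed spike level for a hard impact
    let stillnessBand = 0.15    // assumed tolerance around 1 g for "still"

    // Feed one acceleration magnitude; returns true when a fall is suspected.
    mutating func ingest(magnitude: Double) -> Bool {
        if magnitude > impactThreshold {
            impactSeen = true            // sharp spike: possible impact
            return false
        }
        if impactSeen && abs(magnitude - 1.0) < stillnessBand {
            impactSeen = false           // near-1 g afterwards: user lying still
            return true
        }
        return false
    }
}
```

On device, this would be driven by `CMMotionManager.startAccelerometerUpdates`, computing each magnitude as `sqrt(x*x + y*y + z*z)` from `CMAccelerometerData.acceleration`.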

Challenges we ran into

This was our first time ever using Swift, so we spent a lot of time just fighting the syntax. Our biggest headache was definitely CoreML; it refused to play nice for fall predictions, so we pivoted to a custom sensor-reasoning pipeline. We also struggled with the hand-off logic between the Watch and the phone, especially making sure the emergency call actually goes through on a non-cellular Apple Watch; that alone was a massive technical hurdle. Another challenge was figuring out how to connect the Apple Watch to Firebase so all data would be unified. The best we could do was have the Apple Watch report to the database only when the iPhone is reachable from the watch's connectivity stack.
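That reachability-gated sync rule can be sketched as a small queue: hold events while the phone is unreachable, flush them all once it comes back. In the real app the reachability flag comes from `WCSession.default.isReachable`; the string payload here is an illustrative assumption:

```swift
import Foundation

// Sketch of the Watch-side sync rule: events only reach Firebase via the
// iPhone, so they queue locally until the phone is reachable again.
struct ReachabilityGate {
    private(set) var pending: [String] = []

    // Returns the events that should be sent now, given current reachability.
    mutating func route(event: String, phoneReachable: Bool) -> [String] {
        pending.append(event)
        guard phoneReachable else { return [] }  // hold until the phone is back
        let toSend = pending
        pending.removeAll()
        return toSend
    }
}
```

The sent batch would then go over `WCSession.sendMessage`, with the iPhone side writing each event to Firestore.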

What's next for Guardian

We want to take this beyond a hackathon project. The next step is deeper health integration, leveraging the Apple Watch to monitor heart rate in real time. If we see a massive heart rate spike at the exact moment of a high-impact fall, we can bypass the timer and trigger the SOS instantly. We’re also looking at building a web dashboard so families can monitor gait trends and safety status from anywhere.
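The instant-SOS idea above could look something like the rule below. The 25 bpm jump and 10-second coincidence window are purely illustrative assumptions; real values would come from HealthKit data and tuning:

```swift
import Foundation

// Sketch of the proposed rule: bypass the confirmation timer when a
// heart-rate spike coincides with a high-impact fall.
struct InstantSOSRule {
    let bpmJump = 25.0              // assumed spike over resting rate
    let window: TimeInterval = 10   // assumed coincidence window in seconds

    func shouldBypassTimer(restingBPM: Double,
                           spikeBPM: Double,
                           spikeTime: Date,
                           impactTime: Date) -> Bool {
        let coincides = abs(spikeTime.timeIntervalSince(impactTime)) <= window
        return coincides && (spikeBPM - restingBPM) >= bpmJump
    }
}
```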
