Why Halo Matters
In a world where medical emergencies strike without warning - in classrooms, gyms, kitchens, streets - Halo transforms ordinary people into confident first responders by giving them the knowledge, clarity, and courage to save lives.
We built Halo so no one ever has to stand frozen in those critical first minutes. Watch our videos to see how Halo works in real life scenarios!
Inspiration
Last year, our mom - a Type 1 diabetic for over 18 years - suddenly collapsed into a near-fatal episode of Diabetic Ketoacidosis (DKA). The doctors later told us what saved her life wasn't just medicine - it was our quick action in those first 5–10 minutes: recognizing what was happening & knowing exactly how to respond.
We realized that no one should ever feel helpless in a medical emergency - whether it's a family member, a classmate, or a stranger - simply because they don't know what to do or don't feel confident doing it. Consider this:
- 1 in 13 children in the U.S. has a food allergy. Every 3 minutes, someone rushes to the ER for an allergic reaction.
- 350,000+ cardiac arrests happen outside hospitals annually in the U.S. - yet fewer than half of victims receive bystander CPR.
- Strokes and seizures strike in everyday spaces - schools, offices, sports fields - where trained professionals aren't present.
Fast, accurate action saves lives. Hesitation costs them. Halo uses empathetic Voice AI and Spatial Augmented Reality to turn bystanders into lifesavers.
What We Built + How It Works
Halo is a Swift-native iOS application for iPhone and iPad that fuses Augmented Reality (AR) with an empathetic AI voice agent. We didn't build Halo for a distant future - we built it for real-world deployment today, running on the iPhones and iPads that students, teachers, parents, and bystanders already carry in their pockets & bags. No bulky headsets. No specialized hardware. Just immediate, accessible emergency response technology available to anyone, anywhere.
Halo leverages ARKit's body tracking together with Apple's Vision framework and Core ML to detect and map skeletal joints in real time, creating a complete digital skeleton overlay of the person needing help. This isn't approximate positioning: we wanted anatomically precise guidance that adapts dynamically as the patient moves.
We've mapped specific skeletal joints to critical medical interventions. The spine7Joint (seventh thoracic vertebra) becomes the anchor point for optimal CPR chest compression placement, ensuring compressions hit the correct depth and location on the sternum. The rightUpLeg and leftUpLeg joints define the exact EpiPen injection zones on the outer thigh, eliminating the dangerous guesswork that comes with anaphylaxis panic. Limb joints provide precise tourniquet positioning to control severe hemorrhaging without causing unnecessary tissue damage.
Using RealityKit's spatial anchoring system, we've created 3D guidance overlays that remain locked to the patient's body even as they move, breathe, or shift position. These aren't static projections but dynamically anchored AR entities that track skeletal movement frame by frame, maintaining medical accuracy regardless of environmental chaos.
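As a minimal sketch of this joint-to-intervention mapping (type and property names here are illustrative, not Halo's actual source; the raw joint identifiers such as "spine_7_joint" are our assumption about ARKit's tracked-skeleton naming):

```swift
import ARKit
import RealityKit

/// Illustrative mapping of medically relevant skeleton joints to
/// intervention sites. Raw strings follow ARKit body-tracking joint naming.
enum InterventionSite: String, CaseIterable {
    case cprCompression = "spine_7_joint"      // ~T7, anchor for chest compressions
    case epiPenRight    = "right_upLeg_joint"  // outer-thigh injection zone
    case epiPenLeft     = "left_upLeg_joint"
}

final class BodyOverlaySession: NSObject, ARSessionDelegate {
    let arView: ARView
    var markers: [InterventionSite: Entity]

    init(arView: ARView, markers: [InterventionSite: Entity]) {
        self.arView = arView
        self.markers = markers
        super.init()
        arView.session.delegate = self
        arView.session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            for site in InterventionSite.allCases {
                let joint = ARSkeleton.JointName(rawValue: site.rawValue)
                guard let local = body.skeleton.modelTransform(for: joint) else { continue }
                // World-space pose = body-anchor pose composed with the joint's
                // model-space transform, re-applied every frame so each overlay
                // stays locked to the moving patient.
                markers[site]?.setTransformMatrix(body.transform * local, relativeTo: nil)
            }
        }
    }
}
```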
Some of our current protocols include:
- Anaphylactic shock → Halo projects the exact EpiPen injection site onto the outer thigh, eliminating guesswork and ensuring proper administration when the airway is closing.
- Cardiac arrest → AR overlays display precise hand placement on the chest with visual compression rhythm indicators, guiding bystanders through effective CPR.
- Severe bleeding → Halo visualizes tourniquet placement zones and arterial pressure points directly on the injured limb, helping untrained responders apply hemorrhage control.
- Stroke assessment → Halo displays real-time FAST protocol visual cues (Face, Arms, Speech, Time), highlighting facial symmetry indicators and arm position checks to identify strokes in the critical golden hour.
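The compression rhythm indicator above can be driven by something as simple as a fixed-rate timer. A hedged sketch (the class name and callback shape are our illustrative choices; 110/min sits in the commonly taught 100–120 compressions-per-minute range):

```swift
import Foundation

/// Minimal compression-rhythm driver (sketch): fires a callback at
/// 110 beats per minute to pulse a visual or haptic CPR rhythm cue.
final class CompressionMetronome {
    private var timer: Timer?
    let rate: Double = 110.0 // compressions per minute

    func start(onBeat: @escaping () -> Void) {
        timer = Timer.scheduledTimer(withTimeInterval: 60.0 / rate,
                                     repeats: true) { _ in onBeat() }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```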
Why Voice & Vision
AR provides visual precision; AI voice provides emotional grounding. AR alone can't calm someone whose hands are shaking. Voice alone can't show them exactly where to place those hands. Together, they create a human-augmentation system that doesn't replace human judgment - it amplifies it in the moments that matter most.
ElevenLabs
Most emergency apps give you text on a screen. Halo gives you something no other crisis response app has: a voice that becomes your lifeline, like a friend you can count on. We didn't just need text-to-speech. We needed a human connection in the most inhuman moments: a voice that could cut through sheer panic and transform a terrified stranger into a confident first responder. That's why Halo is powered by ElevenLabs Voice AI. Using ElevenLabs' eleven_multilingual_v2 model integrated through the Python SDK, Halo delivers something unique: context-aware, emotionally intelligent voice guidance. This isn't static audio playback. The voice dynamically switches between tones - calm, reassuring, and instructional - to meet users exactly where they are emotionally. Watch our demo videos to hear the Voice AI Agent in action!
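Halo's own integration goes through the Python SDK; purely as an illustration, an equivalent request from Swift against ElevenLabs' public text-to-speech REST endpoint might look like this (the function name and parameters are hypothetical; the endpoint path, xi-api-key header, and eleven_multilingual_v2 model ID come from ElevenLabs' API):

```swift
import Foundation

/// Hedged sketch: synthesize a guidance phrase via ElevenLabs' REST API.
/// `apiKey` and `voiceID` are placeholders for the caller's credentials.
func synthesizeGuidance(_ text: String, apiKey: String, voiceID: String,
                        completion: @escaping (Data?) -> Void) {
    var request = URLRequest(url: URL(string:
        "https://api.elevenlabs.io/v1/text-to-speech/\(voiceID)")!)
    request.httpMethod = "POST"
    request.setValue(apiKey, forHTTPHeaderField: "xi-api-key")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(withJSONObject: [
        "text": text,
        "model_id": "eleven_multilingual_v2"
    ])
    URLSession.shared.dataTask(with: request) { data, _, _ in
        completion(data) // audio bytes on success, nil on failure
    }.resume()
}
```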
Cloudflare AI
Halo Companion doesn’t just respond to emergencies - it knows YOU. Just as Cursor or GitHub Copilot understands the context of your codebase, we wanted Halo Companion to understand your unique health context to deliver personal, life-saving guidance. Built as a full-stack SvelteKit app on Cloudflare Pages, it plugs into your workflows (e.g. Notion) so users can ask, “What was my last meal and how much insulin did I take?” and get instant, relevant answers, also grounded in thousands of indexed medical papers. TypeScript powers the frontend and server routes, which interface with Cloudflare Workers AI. We use LLaVA-1.5-7B-HF for rapid image understanding (e.g., “What are these insulin pens?”) and the FormData API for secure file transfer. Our personal health context engine - a “Cursor for health data” - retrieves critical info with minimal latency.
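From the device side, handing a photo to a backend like this is a standard multipart upload. A hypothetical sketch (the /api/vision path, host, and form-field names are placeholders, not the companion's real API):

```swift
import Foundation

/// Hypothetical sketch: upload a JPEG plus a question to the companion
/// backend, which would forward it to Workers AI (LLaVA) for image
/// understanding. URL and field names are placeholders.
func askAboutImage(_ jpeg: Data, question: String,
                   completion: @escaping (String?) -> Void) {
    let boundary = "Boundary-\(UUID().uuidString)"
    var request = URLRequest(url: URL(string: "https://example.com/api/vision")!)
    request.httpMethod = "POST"
    request.setValue("multipart/form-data; boundary=\(boundary)",
                     forHTTPHeaderField: "Content-Type")

    // Build the multipart body: a text field for the prompt, then the image.
    var body = Data()
    body.append("--\(boundary)\r\nContent-Disposition: form-data; name=\"prompt\"\r\n\r\n\(question)\r\n".data(using: .utf8)!)
    body.append("--\(boundary)\r\nContent-Disposition: form-data; name=\"image\"; filename=\"photo.jpg\"\r\nContent-Type: image/jpeg\r\n\r\n".data(using: .utf8)!)
    body.append(jpeg)
    body.append("\r\n--\(boundary)--\r\n".data(using: .utf8)!)
    request.httpBody = body

    URLSession.shared.dataTask(with: request) { data, _, _ in
        completion(data.flatMap { String(data: $0, encoding: .utf8) })
    }.resume()
}
```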
Challenges we ran into
- AR Calibration for Different Body Types: Aligning EpiPen and CPR overlays precisely required custom anchor tuning and Apple ARKit skeleton-tracking tweaks. Alignment could easily vary with lighting, positioning, and several other factors we had to account for.
- Low-Latency Voice: Delivering voice prompts without hiccups or stalls when interacting with the Halo Voice Agent.
Accomplishments that we're proud of + what we learned
This is just the beginning - a working prototype that shows how accessible AR and Voice AI can bring emergency response training to life. We’re proud to have a proof of concept for using immersive, easily deployable tech to augment knowledge in entirely new ways!
What's next for Halo
School / Corporate Deployment Programs
Bringing Halo into classrooms and workplaces to run AR emergency simulations for CPR, anaphylaxis, and the stroke FAST test - making emergency preparedness engaging and memorable. We can partner with organizations already teaching medically accurate first aid, such as the Red Cross and CPR.heart, to expand impact at scale.
Multilingual & Accessibility Expansion
Adding real-time translations and enhanced accessibility features to support diverse communities and ensure that life-saving guidance is available to everyone.