Inspiration

When someone collapses from cardiac arrest, every second matters. Yet most people freeze because they are unsure how to help. Only 20% of Americans are current with CPR training, and even fewer feel confident enough to act under pressure.

We created Pulse to bridge that gap. Instead of relying on memory or scrambling to find a guide, users receive calm, step-by-step CPR instructions directly in their field of view. With augmented reality through Snap Spectacles, Pulse empowers anyone to take immediate and effective action during an emergency.

What it does

Pulse is an AR application for the Snap Spectacles that provides step-by-step CPR guidance in real time. It has two main components: real-time CPR guidance and a map displaying nearby AEDs for quick retrieval.

The CPR guidance component uses Snap Spectacles’ AR capabilities to walk users through each critical step of administering CPR. It shows users where to place their hands, provides a rhythm to follow, and measures compression depth, helping ensure compressions are accurate and effective.
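As a rough illustration, the pacing and depth logic can be sketched like this (function names are hypothetical, not our actual Lens Studio code; the targets follow the commonly cited adult CPR guidance of 100–120 compressions per minute at roughly 5–6 cm depth):

```typescript
// Sketch of compression pacing and depth feedback (illustrative only).
const TARGET_RATE_BPM = 110; // middle of the 100–120 per-minute range
const MIN_DEPTH_CM = 5;
const MAX_DEPTH_CM = 6;

// Milliseconds between metronome beats for a given compression rate.
function compressionIntervalMs(ratePerMinute: number): number {
  return 60_000 / ratePerMinute;
}

// Classify a measured compression depth so the overlay can give feedback.
function depthFeedback(depthCm: number): "too shallow" | "good" | "too deep" {
  if (depthCm < MIN_DEPTH_CM) return "too shallow";
  if (depthCm > MAX_DEPTH_CM) return "too deep";
  return "good";
}

console.log(compressionIntervalMs(TARGET_RATE_BPM)); // ~545 ms per beat
console.log(depthFeedback(5.5)); // "good"
```

In the app, the beat interval drives the visual and audio rhythm cue, and the depth classification drives on-screen feedback.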

Our AED Map locates nearby registered AED units and displays their locations and approximate distances in the AR world. We have also built voice recognition into the map, letting users open and close it hands-free during hectic emergencies.
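A hypothetical sketch of how such a map can rank AED units by distance, using the haversine great-circle formula (names and coordinates are illustrative, not our actual data):

```typescript
// Illustrative sketch: rank registered AEDs by straight-line distance.
interface Aed { name: string; lat: number; lon: number; }

const EARTH_RADIUS_M = 6_371_000;

// Great-circle (haversine) distance between two lat/lon points, in meters.
function haversineMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
}

// Sort AEDs by distance from the user so the closest is shown first.
function nearestAeds(userLat: number, userLon: number, aeds: Aed[]): Aed[] {
  return [...aeds].sort(
    (a, b) =>
      haversineMeters(userLat, userLon, a.lat, a.lon) -
      haversineMeters(userLat, userLon, b.lat, b.lon)
  );
}
```

Straight-line distance is enough at building or campus scale; actual walking routes would need a routing service.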

How we built it

We built Pulse using Lens Studio for Snap Spectacles, integrating pose detection using [David fill this out]. We used AR overlays to project correct hand placement onto the victim’s chest and created a rhythmic visual and audio system to guide compressions. Sensor data from Spectacles helped us track hand movement and force. We also used the VoiceML module to add voice recognition to our AED locator, enabling hands-free retrieval.
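The hands-free control can be approximated as a small keyword matcher over recognized speech. This is a simplified, hypothetical sketch, not our actual script; in Lens Studio the transcript would come from the VoiceML module's transcription events:

```typescript
// Illustrative sketch: map a recognized transcript to a map action.
type MapAction = "open_map" | "close_map" | null;

const COMMANDS: Array<{ keywords: string[]; action: "open_map" | "close_map" }> = [
  { keywords: ["show map", "open map", "find aed"], action: "open_map" },
  { keywords: ["close map", "hide map"], action: "close_map" },
];

// Normalize the transcript and look for any known command phrase.
function parseCommand(transcript: string): MapAction {
  const text = transcript.toLowerCase().trim();
  for (const { keywords, action } of COMMANDS) {
    if (keywords.some((k) => text.includes(k))) return action;
  }
  return null;
}

console.log(parseCommand("Please show map now")); // "open_map"
```

Keyword matching keeps the command set small and predictable, which matters when the user is under stress and speech recognition may be imperfect.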

Challenges we ran into

GPS Coordinate Precision - We wanted to track AED locations accurately through GPS, but the limited precision of our devices made it difficult to pinpoint exactly where those locations are in the real world. We eventually got the AED locations in Pauley Pavilion to be precise within the augmented world, but locations elsewhere may be less accurate.
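A back-of-the-envelope calculation shows why this is hard: one degree of latitude spans roughly 111 km, so the fifth decimal place of a coordinate resolves only about a meter, and a consumer GPS fix is often off by several meters on top of that (illustrative sketch, not project code):

```typescript
// Rough scale of GPS precision: one degree of latitude ≈ 111,320 m.
const METERS_PER_DEG_LAT = 111_320;

// Convert a latitude difference in degrees to meters.
function latDegreesToMeters(deg: number): number {
  return deg * METERS_PER_DEG_LAT;
}

console.log(latDegreesToMeters(1e-5)); // ≈ 1.1 m — the fifth decimal place
console.log(latDegreesToMeters(5e-5)); // ≈ 5.6 m — comparable to typical GPS error
```

At that error scale, an AR marker anchored purely to raw GPS can land on the wrong side of a room, which is why we had to tune placements for Pauley Pavilion specifically.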

We also faced the challenge of working in a completely new environment: before LA Hacks, none of us had used Lens Studio or Snapchat Spectacles. We had to adapt and work through problems we would never have encountered with a more traditional tech stack.

Accomplishments that we're proud of

Working with Lens Studio and Snap's Spectacles was definitely the highlight of the hackathon. It was exciting to work with new technology (especially AR) and put our product together in such a short amount of time. We're proud of how many of the device's features we managed to use: the AR components, the visual sensors, and the auditory sensors all play a part in our application.

Overall, it was really cool to be able to have worked with Lens Studio and Spectacles and persevere through the challenges. For some of us, this was our first LAHacks, but for all of us, it will be our last :(. This project is the culmination of everything we've learned during our time at UCLA and all the skills we picked up along the way. Regardless of the outcome, this has been a fun event full of innovation, from us and the people around us.

What we learned

This was our first time interacting with Snap Spectacles and Lens Studio. It was a challenge wrapping our heads around the technical components, but it was rewarding when we figured them out and could (literally) see what we created. Choosing which features to include was also a fun challenge, since AR opens up so many possibilities.

What's next for Pulse

We can't really continue development without Spectacles, but we hope this project can inspire others to transcend the limitations of the physical world and interact with the augmented one.
