Inspiration

One of the hardest moments in a person's life is watching a loved one grow old. There's an uneasy feeling knowing that someday they may need us, and we might not be there. Assisted living costs Americans $6,200 a month on average; many families enroll their elders simply for peace of mind. Everyone deserves that peace of mind. It shouldn't cost $74,400 a year.

What it does

LifeLens uses emerging technology to passively and privately monitor seniors for potential emergencies. Seniors simply wear their glasses as they go about their day. LifeLens analyzes the footage locally on-device, and if everything looks safe, it's deleted immediately. If something concerning is detected, it's escalated to their designated emergency contact. No cloud. No storage. No privacy trade-off.
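The core loop is easy to state in code. Here's a minimal Swift sketch of that analyze-then-discard policy; `Assessment`, `EmergencyClassifier`, and `escalate` are illustrative names for explanation, not actual LifeLens or SDK types.

```swift
import Foundation

// What the local model decides about a single frame.
enum Assessment {
    case safe
    case concerning(reason: String)
}

// Stand-in for the on-device model; nothing leaves the device here.
struct EmergencyClassifier {
    func assess(_ frame: Data) -> Assessment {
        // ... model inference would go here ...
        return .safe
    }
}

func process(frame: Data,
             with classifier: EmergencyClassifier,
             escalate: (Data, String) -> Void) {
    switch classifier.assess(frame) {
    case .safe:
        // Safe footage is never written to disk; when this scope ends,
        // the frame's memory is released and the data is gone.
        break
    case .concerning(let reason):
        // Only concerning footage is handed to the emergency contact path.
        escalate(frame, reason)
    }
}
```

The important design choice is that the safe path has no persistence step at all: there is nothing to delete later because nothing is ever written.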

How we built it

LifeLens combines Meta Ray-Ban glasses with the Asus Ascent GX10 server to create a seamless, private monitoring pipeline. Our iOS app is built in Swift on top of the Meta Wearables SDK, integrating with the Meta AI app to livestream footage directly to the GX10. The local model processes the stream in real time; no footage is ever stored. If an emergency is detected, the stream is shared with the designated emergency contact through our web portal, built with React and HTML/CSS. Everything stays local until it matters.
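As a concrete example of the final hop, the escalation can be a plain HTTPS POST from the pipeline to the web portal. This is a hedged sketch using URLSession; the endpoint URL and payload fields are assumptions for illustration, not our actual portal API.

```swift
import Foundation

// Illustrative alert payload handed to the emergency contact's portal.
struct EmergencyAlert: Codable {
    let contactID: String
    let reason: String
    let streamURL: String   // where the contact can view the live stream
}

func notifyEmergencyContact(_ alert: EmergencyAlert) async throws {
    // Hypothetical portal endpoint; the real one is configured server-side.
    var request = URLRequest(url: URL(string: "https://lifelens.example/api/alerts")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(alert)

    let (_, response) = try await URLSession.shared.data(for: request)
    guard let http = response as? HTTPURLResponse, http.statusCode == 200 else {
        throw URLError(.badServerResponse)
    }
}
```

Note that this call fires only after a local detection; routine footage never produces any network traffic beyond the device-to-GX10 stream.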

Challenges we ran into

Developing for the Meta Ray-Ban glasses came with significant SDK limitations. The Meta Wearables Device Access Toolkit (MWDAT) restricts what data can be accessed programmatically, forcing us to architect around its constraints rather than building the most direct solution. On the iOS side, we navigated Swift 6 strict concurrency errors and duplicate Info.plist conflicts.
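The concurrency errors mostly had one shape: shared mutable state touched from multiple tasks, which Swift 6's strict checking rejects at compile time. A representative fix (with an illustrative type name, not an SDK one) was converting such classes into actors so access is serialized:

```swift
import Foundation

// Under Swift 6 strict concurrency, a plain class with mutable state
// accessed from multiple tasks is a compile error. Declaring it as an
// actor serializes all access to its state.
actor StreamSession {
    private var isLive = false

    func start() {
        isLive = true
    }

    func stop() {
        isLive = false
    }
}

// Call sites must now await, which makes cross-task access explicit:
//   let session = StreamSession()
//   Task { await session.start() }
```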

Accomplishments that we're proud of

We're proud to have built something meaningful with hardware that was never designed for it. The Ray-Bans were built for social media, but we reimagined them as a safety tool for the most vulnerable members of our communities. We're also proud that LifeLens solves a problem we've seen firsthand. This wasn't just a hackathon project; it was personal. Building something technically challenging that could genuinely improve lives and preserve dignity made every debugging session worth it.

We also built a fully functional iOS app that performs secure, real-time AI analysis.

What we learned

Building LifeLens taught us how to architect across hardware, mobile, and server environments in ways none of us had done before. We deepened our understanding of Swift, the Meta Wearables SDK, and the tradeoffs that come with local vs. cloud processing. Most importantly, we learned that security isn't just a feature; it's a design philosophy that the people we care about deserve. Every decision we made, from how footage is streamed to how it's deleted, was shaped by that principle.

What's next for LifeLens

Next, we plan to expand the AI's detection capabilities to cover a wider range of medical emergencies, including seizures, heart attacks, choking, asthma attacks, and allergic reactions, so that no critical moment goes unnoticed. We want LifeLens to serve seniors at every stage of vulnerability, not just the most independent. Long term, our goal is for LifeLens to become the go-to platform for non-invasive senior monitoring: a tool that families trust, seniors don't mind wearing, and first responders can count on.

Built With

Swift, the Meta Wearables SDK (MWDAT), React, HTML/CSS, Meta Ray-Ban glasses, and the Asus Ascent GX10.
