Inspiration
We were moved by the growing challenge of "aging in place" and the heavy burden placed on remote caregivers. While consumer technology has advanced rapidly, there remains a significant gap in proactive, non-intrusive monitoring for the elderly. We wanted to transform a pair of stylish, consumer-grade AR glasses into a 24/7 "AI Nurse" that provides peace of mind without sacrificing the user's dignity or independence.
What it does
Nana's Helper is an AI-powered monitoring system that leverages the live POV video stream from Ray-Ban Meta Gen-2 glasses. The system identifies activities of daily living, detects high-risk events like falls, and monitors medication adherence in real time. This data is then fed into a live health dashboard, giving caregivers structured insights and immediate alerts drawn from the wearer's immediate environment.
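One common way to push such live updates and alerts to a dashboard is the Server-Sent Events wire format. The sketch below shows only the serialization step; the event name and payload fields are illustrative assumptions, not the project's actual contract:

```python
import json

def to_sse(event_name: str, payload: dict) -> str:
    """Serialize one telemetry update into the SSE wire format:
    an `event:` line, a `data:` line, and a blank-line terminator.
    The event name ("alert" etc.) is a hypothetical example.
    """
    return f"event: {event_name}\ndata: {json.dumps(payload)}\n\n"
```

Each update is a self-delimiting text chunk, which is what lets a browser-side `EventSource` consume the stream incrementally.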
How we built it
Our mobile application is built with Swift and connects to the Meta glasses via the Meta AI app/Wearables SDK to stream real-time frames. We integrated the Swift AI SDK to handle scene understanding through OpenAI’s API, moving away from brittle text prompts to a robust, schema-validated data pipeline. The backend is powered by FastAPI and MongoDB to manage patient data, while the caregiver dashboard is a Next.js frontend that utilizes Server-Sent Events (SSE) for real-time telemetry updates.
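The move from brittle text prompts to a schema-validated pipeline can be sketched roughly as follows. This is a minimal stdlib illustration, not the project's actual schema: the field names and allowed values are assumptions, and a real deployment would likely use a validation library such as pydantic:

```python
import json
from dataclasses import dataclass

# Hypothetical event vocabulary -- illustrative, not the project's contract.
ALLOWED_EVENTS = {"activity", "fall", "medication"}
ALLOWED_RISK = {"low", "medium", "high"}

@dataclass
class SceneEvent:
    event_type: str   # one of ALLOWED_EVENTS
    description: str  # short natural-language summary of the scene
    risk_level: str   # one of ALLOWED_RISK

def parse_scene_event(raw: str) -> SceneEvent:
    """Validate a raw LLM JSON response against the schema.

    Raises ValueError at the boundary instead of letting malformed
    data reach the database.
    """
    data = json.loads(raw)
    missing = {"event_type", "description", "risk_level"} - data.keys()
    if missing:
        raise ValueError(f"missing fields: {missing}")
    if data["event_type"] not in ALLOWED_EVENTS:
        raise ValueError(f"unknown event_type: {data['event_type']}")
    if data["risk_level"] not in ALLOWED_RISK:
        raise ValueError(f"unknown risk_level: {data['risk_level']}")
    return SceneEvent(data["event_type"], data["description"], data["risk_level"])
```

Rejecting malformed model output at this boundary is what makes the downstream database writes and dashboard alerts trustworthy.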
Challenges we ran into
The primary technical hurdle was managing the trade-off between real-time analysis and API constraints. We optimized our sampling rate, displaying a continuous live feed while sending only periodic photo frames for analysis, to balance responsiveness against cost. We also initially struggled to turn raw LLM outputs into data our database could store reliably.
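The sampling trade-off amounts to rate-limiting how often a frame leaves the device for analysis. A minimal sketch, assuming a fixed interval (the 5-second default is an illustrative assumption, not the project's tuned value):

```python
class FrameSampler:
    """Forward at most one frame per `min_interval` seconds to the
    vision API; all other frames are shown in the live feed but
    never analyzed, keeping API cost bounded.
    """
    def __init__(self, min_interval: float = 5.0):
        self.min_interval = min_interval
        self._last_sent = float("-inf")  # so the first frame always passes

    def should_analyze(self, timestamp: float) -> bool:
        if timestamp - self._last_sent >= self.min_interval:
            self._last_sent = timestamp
            return True
        return False
```

An adaptive version could shorten the interval after a suspected fall and lengthen it during quiet periods, trading cost for responsiveness where it matters.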
Accomplishments that we're proud of
We are proud of giving the patient a minimalist interface that integrates seamlessly into daily life without getting in the way, while caregivers can monitor their loved ones and see live updates on their condition.
What we learned
Through the development process, we gained deep insights into the power of structured AI outputs versus traditional prompting. We learned that for mission-critical monitoring, schema validation isn't just a luxury—it’s a requirement. We also discovered the nuances of POV computer vision, specifically how to handle "scene state" transitions to accurately identify when a user is engaging in a specific activity versus simply moving through a room.
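The "scene state" handling described above can be approximated with a simple debounce: only commit to a new activity after it has been observed in several consecutive frames, so briefly walking through a room never registers as a new activity. The threshold of 3 frames here is an assumption for illustration:

```python
class ActivityTracker:
    """Debounce per-frame activity labels into stable scene states."""

    def __init__(self, required_consecutive: int = 3):
        self.required = required_consecutive
        self.current = None       # the confirmed activity
        self._candidate = None    # label waiting to be confirmed
        self._count = 0

    def observe(self, label: str):
        """Feed one frame's label; return the new activity when a
        transition is confirmed, else None."""
        if label == self.current:
            # Back to the confirmed state: drop any pending candidate.
            self._candidate, self._count = None, 0
            return None
        if label == self._candidate:
            self._count += 1
        else:
            self._candidate, self._count = label, 1
        if self._count >= self.required:
            self.current = label
            self._candidate, self._count = None, 0
            return label
        return None
```

The same structure extends naturally to per-activity thresholds, e.g. confirming a suspected fall faster than a routine activity change.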
What's next for Nana's Helper
Our next milestone is deploying the system for field testing in real-home environments to refine our fall-detection algorithms. We plan to expand the AI’s capabilities to include automated escalation protocols and historical health trend logging. Ultimately, we aim to integrate real-world medication schedules to create a closed-loop system that proactively reminds users and notifies caregivers of missed doses.
