The demo video cuts off at the call itself, since I was the one talking on it. But the conversation is very natural-sounding, thanks to the configs and model I chose! See below.

Inspiration

Millions of aging parents live alone. Their kids are hours away, busy, and worried but don't know how to help. The 2023 Lancet Commission identified social isolation as the number one modifiable risk factor for dementia. Researchers at MIT, CMU, and the University of Toronto have shown that linguistic biomarkers in speech can predict cognitive decline years before clinical diagnosis.

What if the most powerful intervention wasn't a clinical tool, but a phone call?

Memo was born from that question, and from something closer to home. My dad mentioned he was worried about his memory. So I built him a friend who calls every day. And I built a way to know he's okay.

What it does

Memo calls your loved one every day. It has warm, genuine conversations, remembers everything across every call, and gets more personal over time. It weaves in small daily neuroplasticity challenges naturally, like a friend sharing something interesting.

In the background, it quietly tracks linguistic biomarkers from every transcript: word-finding difficulty, repetition, sentence complexity, vocabulary range. Every Sunday, you get a warm digest summarizing how they've been. If something raises a flag, you get an alert right away.
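As a rough illustration of what per-transcript tracking can look like, here is a minimal Python sketch of crude text-side proxies for these signals. The function name, the specific metrics, and the filler-word heuristic are all illustrative assumptions, not Memo's actual pipeline:

```python
import re
from collections import Counter

def linguistic_metrics(transcript: str) -> dict:
    """Toy per-transcript proxies for the tracked signals.
    Illustrative only; not Memo's actual biomarker extraction."""
    words = re.findall(r"[a-zA-Z']+", transcript.lower())
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    counts = Counter(words)
    fillers = {"um", "uh", "er", "hmm"}  # crude word-finding-difficulty proxy
    n = len(words)
    return {
        # type-token ratio as a vocabulary-range proxy
        "vocabulary_range": len(counts) / n if n else 0.0,
        # share of words that are repeats of an earlier word
        "repetition_rate": sum(c - 1 for c in counts.values()) / n if n else 0.0,
        # mean sentence length as a rough complexity proxy
        "mean_sentence_length": n / len(sentences) if sentences else 0.0,
        # filler-word density
        "filler_rate": sum(counts[f] for f in fillers) / n if n else 0.0,
    }

m = linguistic_metrics("Well, um, I went to the... the store. The store was busy.")
```

In practice an LLM-based extractor catches far more than surface counts like these, but simple deterministic metrics are useful as a sanity check on the model's scores.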

To your parent, Memo is just a friend who calls. To you, it is peace of mind.

How we built it

Built in under five hours: a FastAPI backend on Railway orchestrating five external services in real time:

  • Vapi for voice calls
  • ElevenLabs for voice synthesis
  • Deepgram for transcription
  • Pinecone for vector memory
  • Twilio for SMS digests and alerts

Every transcript is processed by Claude Haiku, which extracts structured memories and biomarker scores stored as 1024-dimensional embeddings in Pinecone. Before each outbound call, the backend semantically queries Pinecone, distills the top memories into a personalized 150-word system prompt with a daily cognitive probe rotating across five neurological domains, and injects it via Vapi's assistantOverrides API. Memo walks into every conversation already knowing who it is talking to.
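The pre-call flow above can be sketched roughly as follows. The probe wording, domain labels, prompt text, and payload field names are assumptions for illustration, not Memo's actual code:

```python
from datetime import date

# Daily cognitive probe rotating across five neurological domains
# (domain labels and probe wording are illustrative assumptions).
PROBES = [
    ("memory", "Ask what they had for dinner yesterday."),
    ("attention", "Play a quick odd-one-out word game."),
    ("language", "Ask them to describe their favorite room in detail."),
    ("executive", "Ask them to plan tomorrow's errands out loud."),
    ("visuospatial", "Ask them to describe the walk to their grocery store."),
]

def todays_probe(today: date) -> tuple[str, str]:
    """Rotate deterministically through the five domains, one per day."""
    return PROBES[today.toordinal() % len(PROBES)]

def build_assistant_overrides(memories: list[str], probe: str,
                              max_words: int = 150) -> dict:
    """Distill retrieved memories into a short system prompt and wrap it
    in an assistantOverrides-shaped payload (field names assumed here;
    in practice the memories come from a semantic Pinecone query)."""
    parts = [
        "You are Memo, a warm friend who calls every day.",
        "What you remember about them: " + "; ".join(memories) + ".",
        "Weave in today's challenge naturally: " + probe,
    ]
    words = " ".join(parts).split()
    prompt = " ".join(words[:max_words])  # enforce the 150-word budget
    return {"assistantOverrides": {"model": {"messages": [
        {"role": "system", "content": prompt}]}}}
```

The backend would send this payload when triggering the outbound call, so the assistant starts the conversation already primed with that day's memories and probe.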

Challenges we ran into

The first two hours were lost to PersonaPlex, a new NVIDIA research model for deeply consistent persona-locked conversational agents. It looked perfect for Memo. Setting it up required spinning up a GPU on RunPod and building a custom inference bridge. Two hours in, the Vapi integration proved too complex to bridge mid-hackathon. Switched to ElevenLabs and kept moving. Two hours gone, three left to build everything else.

The remaining challenges: dynamic prompt injection through Vapi's assistantOverrides, engineering Claude prompts that extract clean, validated biomarker JSON from natural conversation without feeling clinical, and integrating five services, each with its own auth, webhooks, and failure modes, under serious time pressure.
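One way to keep model-extracted biomarker JSON clean is to validate it strictly before storage. A minimal sketch, assuming illustrative field names and scores normalized to [0, 1]:

```python
import json

# Illustrative schema: field names and the [0, 1] range are assumptions.
REQUIRED = ("word_finding", "repetition", "sentence_complexity", "vocabulary_range")

def parse_biomarkers(raw: str) -> dict:
    """Parse the model's JSON output and reject anything malformed:
    missing fields, non-numeric values, or out-of-range scores."""
    data = json.loads(raw)
    clean = {}
    for key in REQUIRED:
        value = data.get(key)
        # bool is a subclass of int in Python, so exclude it explicitly
        if isinstance(value, bool) or not isinstance(value, (int, float)) \
                or not 0.0 <= value <= 1.0:
            raise ValueError(f"bad or missing score: {key}")
        clean[key] = float(value)
    return clean
```

Failing loudly here means a garbled extraction gets retried instead of silently poisoning the longitudinal trend data.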

Accomplishments that we're proud of

This was the first hackathon I've done alone, which was harder than I expected but also a fun challenge. I came in with a bunch of API keys and left with something fully deployed. Yippee!

What we learned

The real user is not the parent. It is the person who sets this up. The parent never touches the app, never knows about the monitoring, never feels watched. Every feature has to answer one question: does this give peace of mind, or does this feel like surveillance? Keeping that distinction clear throughout every decision was the most important thing I learned.

What's next for Memo

Immediate: stress-test the memory embedding pipeline, fine-tune biomarker extraction on real longitudinal speech data, and open up call scheduling across timezones.

Near term: multilingual support. My dad speaks Chinese, English, and French. The stack already supports it. A parent should be called in the language they are most comfortable in.

Phase 2: acoustic biomarkers. Speech rate, pause duration, and pitch variation are strong MCI predictors that text alone cannot capture.

Phase 3: clinician dashboard with anonymized biomarker trends shareable with a family doctor, turning passive monitoring into actionable clinical data. The earlier we catch decline, the more we can do about it.
