Inspiration

TreeHacks fell on Valentine's Day. One of our teammates spent the entire week fighting with his girlfriend about it. The same argument kept resurfacing in different forms. Over text, at dinner, in the car. By Friday he couldn't even remember what they were actually fighting about. Just a heavy sense of foreboding and the certainty that it was going to happen again.

That's the problem. People fight and can't even pinpoint why they're fighting. You can't fix a pattern you can't see. And you can't see it because you're always inside the conversation, never above it. There's a gap between what you felt and what actually happened. We built ThirdParty to fill that gap.

What it does

ThirdParty runs in the background on your phone or Meta glasses, capturing conversations and turning your day into a private relationship timeline. It transcribes each interaction, breaks it into moments, tags tone shifts, and flags important points by aligning them with wearable stress-proxy spikes. It also saves who you were with, using facial recognition to link moments to contacts, so you can see patterns per relationship, not just per day.
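The core of the timeline is aligning transcript moments with stress-proxy spikes. A minimal sketch of that alignment, with a hypothetical `Moment` type and assumed spike timestamps from a wearable (the names and the padding window are illustrative, not our production code):

```python
from dataclasses import dataclass

@dataclass
class Moment:
    start_s: float  # moment start, seconds into the recording
    end_s: float    # moment end
    text: str       # transcript slice for this moment

def flag_moments(moments, spike_times_s, pad_s=10.0):
    """Mark a moment as important if any stress-proxy spike
    (e.g. a heart-rate jump from a wearable) falls inside it,
    padded by a small window on either side to absorb clock skew
    between the audio stream and the wearable."""
    flagged = []
    for m in moments:
        hit = any(m.start_s - pad_s <= t <= m.end_s + pad_s
                  for t in spike_times_s)
        flagged.append((m, hit))
    return flagged
```

A spike is only an attention marker here: the function surfaces which moments to look at, and the journaling layer decides what, if anything, to say about them.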

Why always-on recording?

We know always-on recording sounds uncomfortable. But every generation resists the next layer of self-awareness, then can't live without it. Your phone already tracks every step you take and every place you go, but the most important part of your life, your relationships, has zero data. The next generation won't think twice about having a conversation archive the same way nobody thinks twice about a photo library today. We're just building it first.

How we built it

Always-on relationship capture is sensitive, so the biggest design constraint was trust: clear consent, minimizing what gets shared, and keeping the “shared session” partner-safe. On the technical side, segmenting messy real conversations into moments that feel accurate, then keeping the journaling prompts calm and nonjudgmental, took a lot of iteration.

Challenges we ran into

Always-on capture is sensitive, so we focused on trust: consent, minimizing what gets shared, and keeping shared sessions partner-safe. Signals are also noisy, because wearable stress proxies spike for lots of reasons, so we treat spikes as places to pay attention, not as proof that one thing caused another.

Our biggest technical blocker was the people tab. It took a while to get facial recognition and speaker diarization working together reliably so moments attach to the right person, especially with imperfect audio, interruptions, and overlapping speech.
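One way to frame the people-tab problem is as interval matching: diarization yields speaker turns, face recognition yields sightings of contacts, and each speaker gets the contact whose on-camera time overlaps their turns most. This is a hedged sketch of that idea; the function names and tuple formats are hypothetical, and real audio (interruptions, overlapping speech) needs far more care:

```python
def overlap(a_start, a_end, b_start, b_end):
    """Length of overlap between two time intervals, in seconds."""
    return max(0.0, min(a_end, b_end) - max(a_start, b_start))

def attach_speakers_to_contacts(turns, sightings):
    """turns: [(speaker_id, start_s, end_s)] from diarization.
    sightings: [(contact_name, start_s, end_s)] from face recognition.
    Assigns each diarized speaker the contact whose on-camera time
    overlaps their speech most; None when nothing overlaps."""
    totals = {}
    for spk, ts, te in turns:
        scores = totals.setdefault(spk, {})
        for name, ss, se in sightings:
            scores[name] = scores.get(name, 0.0) + overlap(ts, te, ss, se)
    return {
        spk: (max(scores, key=scores.get) if any(scores.values()) else None)
        for spk, scores in totals.items()
    }
```

Accumulating overlap across all of a speaker's turns, rather than deciding per turn, is what makes the attachment tolerate brief mismatches like someone walking out of frame mid-sentence.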

Accomplishments that we're proud of

  • Continuous audio capture + image recognition + live conversation parsing that structures your day automatically, no manual journaling required.
  • A visual, scrollable map of your day where each interaction becomes a “relational bubble.”
  • Bubble size & temperature reflect conversation duration, discourse intensity (mock cortisol / heart-rate signals), AI-scored meaningfulness, and emotional energy inferred from tone.
  • Every person has a living profile with a full archive of past conversations, typical topics & emotional patterns, interaction frequency & tone trends, editable labels, and custom photos/notes.
  • Simulated physiological overlays (cortisol/heart-rate proxies) demonstrate how real biometric hardware could quantify escalation and calm over time.
  • A UI that feels quiet and human, not clinical.
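The bubble mapping above can be sketched as a small pure function. The specific constants, saturation point, and hue range here are illustrative assumptions, not the exact values in our UI:

```python
def bubble_style(duration_min, intensity, meaningfulness):
    """Map a conversation to a bubble's radius and color temperature.
    intensity and meaningfulness are assumed normalized to [0, 1];
    intensity stands in for the mock cortisol / heart-rate signal,
    meaningfulness for the AI score. Returns (radius_px, hue_deg)."""
    # Size grows with duration but saturates at 90 minutes, so one
    # long dinner doesn't dwarf everything else on the timeline.
    radius = 20 + 60 * min(duration_min, 90) / 90
    # Temperature blends intensity and meaningfulness: hue 220
    # (cool blue) for calm small talk down to hue 10 (hot red)
    # for heated, high-stakes moments.
    heat = 0.6 * intensity + 0.4 * meaningfulness
    hue = 220 - 210 * heat
    return radius, hue
```

Keeping the mapping a pure function of the moment's scores is what lets the timeline re-render instantly when a user edits a label or a score is recomputed.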

What we learned

Most people do not need a verdict after a conflict. They need structure. When you can slow down, separate facts from stories, and name the underlying need, repair gets easier. We also learned that patterns matter more than single arguments: one tense moment feels random, but repeated triggers become actionable. Finally, for something this personal, trust is not a feature, it is the product.

What's next for ThirdParty

  • The system automatically identifies pivotal moments (laughter spikes, escalation shifts, reconciliations, deep exchanges) and surfaces them as highlights.
  • Two users can merge perspectives into a partner-safe shared summary that removes adversarial framing and highlights mutual ground.
  • Image recognition adds environmental metadata (location, setting, group context) to conversation logs, giving relational context, not just transcripts.
