Inspiration

Most apps track how you feel, but not who caused it and why it keeps repeating. In real life, the biggest drivers of anxiety, burnout, and emotional confusion are often specific interactions with specific people. Cloud AI also feels wrong here: the data is intimate, and the moments that matter happen offline (subway, travel, dead zones). I wanted a “relationship flight recorder” that works anywhere and keeps everything private by default.

What it does

Emome BlackBox Mode is an offline-first relationship recorder. After a call/meeting/date/argument you tap one button and speak a 10-second debrief (or type). In seconds it generates:

  • a structured Interaction Event (strict JSON)
  • a Reasoned Explanation of what likely triggered your reaction
  • an updated Relationship Pulse (time-weighted signal per person)
  • a one-sentence Micro-Script (a boundary or question for next time)

Everything stays on-device, with export + wipe controls.
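To make "strict JSON" concrete, here is a minimal sketch of what an Interaction Event might look like. The field names and value ranges are illustrative assumptions, not the project's actual schema:

```typescript
// Illustrative shape for a structured Interaction Event.
// All field names and ranges are assumptions, not Emome's real schema.
interface InteractionEvent {
  id: string;            // UUID used as the local storage key
  person: string;        // who the interaction was with
  timestamp: string;     // ISO 8601
  tags: string[];        // e.g. ["dismissed", "raised-voice"]
  eventScore: number;    // -1 (very negative) .. +1 (very positive)
  intensity: number;     // 0..1 weight inferred from the debrief
  durationMin: number;   // how long the interaction lasted
  microScript: string;   // one-sentence boundary / question for next time
}

const example: InteractionEvent = {
  id: "evt-001",         // would be a uuid() in practice
  person: "Alex",
  timestamp: new Date(0).toISOString(),
  tags: ["dismissed", "raised-voice"],
  eventScore: -0.6,
  intensity: 0.8,
  durationMin: 15,
  microScript: "Next time, ask: 'Can we pause and restart this calmly?'",
};
```

A strict schema like this is what lets the Pulse update and the reasoning trace stay auditable: every downstream number can be traced back to a field in the event.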

How we built it

We built a privacy-first prototype optimized for “zero cloud dependency”:

  • Next.js + React + TypeScript UI for fast capture and instant feedback
  • Local parsing + tagging + scoring to convert raw debriefs into strict event JSON
  • A Relationship Pulse update loop (time-weighted smoothing) with per-person history
  • A Reasoning Trace panel (tags, eventScore, intensity/duration weights, pulse delta before→after) to make outputs auditable
  • IndexedDB storage for offline persistence, plus export/wipe tooling
  • PWA-ready setup so the experience remains usable offline after first load
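The Pulse update loop above is time-weighted smoothing; one plausible sketch in TypeScript (the half-life constant and the blending scheme are assumptions, not the shipped formula):

```typescript
// Time-weighted smoothing for a per-person Relationship Pulse.
// HALF_LIFE_DAYS and the blend rule are illustrative assumptions.
const HALF_LIFE_DAYS = 14;
const MS_PER_DAY = 86_400_000;

function updatePulse(
  prevPulse: number,   // previous pulse in [-1, 1]
  prevTimeMs: number,  // timestamp of the previous update
  eventScore: number,  // new event's score in [-1, 1]
  eventWeight: number, // intensity/duration weight in (0, 1]
  nowMs: number
): number {
  // Older signal counts for less: decay toward neutral (0) by elapsed half-lives.
  const elapsedDays = (nowMs - prevTimeMs) / MS_PER_DAY;
  const decay = Math.pow(0.5, elapsedDays / HALF_LIFE_DAYS);
  const carried = prevPulse * decay;
  // Blend the decayed pulse with the new event, weighted by the event.
  return carried * (1 - eventWeight) + eventScore * eventWeight;
}
```

For example, a pulse of 1.0 that is 14 days old decays to 0.5 before a neutral event (score 0, weight 0.5) pulls it to 0.25. The "pulse delta before→after" shown in the Reasoning Trace is just the difference between the input and output of a function like this.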

Challenges we ran into

  • Offline voice capture: browser speech recognition isn’t consistently available offline on every device, so we implemented a typed fallback + demo mode to guarantee the wow moment.
  • TypeScript + web platform quirks: getting build-safe typing for speech APIs and keeping deployment stable on Vercel required careful shims/config.
  • Trust & explainability: “AI said so” isn’t enough for sensitive relationship data, so we designed the reasoning trace and pulse delta to show why a conclusion was made.
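The typed fallback boils down to feature-detecting the Web Speech API (which is vendor-prefixed in some browsers) and degrading gracefully when it is missing. A minimal sketch, with the mode names assumed for illustration:

```typescript
// Feature-detect the Web Speech API and fall back to typed input
// when it's unavailable (unsupported browser, offline, Node, etc.).
// The captureMode() names are illustrative, not the app's real API.
type SpeechRecognitionCtor = new () => { start(): void; stop(): void };

function getSpeechRecognition(): SpeechRecognitionCtor | null {
  const g = globalThis as Record<string, unknown>;
  // Chrome exposes webkitSpeechRecognition; the unprefixed name is rarer.
  const ctor = g["SpeechRecognition"] ?? g["webkitSpeechRecognition"];
  return typeof ctor === "function" ? (ctor as SpeechRecognitionCtor) : null;
}

function captureMode(): "voice" | "typed" {
  return getSpeechRecognition() ? "voice" : "typed";
}
```

Detecting the constructor at runtime (rather than at build time) also sidesteps the TypeScript issue: the speech APIs never need to appear in the ambient type declarations, so the build stays green on Vercel.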

Accomplishments that we're proud of

  • Delivered a working “flight recorder” loop: record → auto-analyze → save in a single flow.
  • Built an offline-first experience with local storage, export, and wipe—no accounts, no tracking.
  • Added transparent reasoning (trace + pulse delta) so users can verify outputs instead of blindly trusting them.

What we learned

Edge-first products need architecture that enforces privacy, not just promises it. Also, explainability isn’t a luxury in emotional tools—it’s the difference between “cool demo” and something people actually trust enough to use daily.

What's next for Emome BlackBox Mode

  • Integrate full on-device inference using RunAnywhere orchestration with a quantized DeepSeek reasoning model (two-tier routing: fast tagger/formatter always-on, reasoning model on-demand).
  • Add optional local encryption (passcode) for BlackBox logs.
  • Improve offline voice support with an embedded on-device STT option where possible, plus richer trend visualizations (weekly summaries, trigger recurrence, and repair patterns).

Built With

  • IndexedDB (idb)
  • Next.js (App Router)
  • PWA (service worker + web app manifest)
  • React
  • Tailwind CSS
  • TypeScript
  • uuid
  • Vercel