MindMate AI

🧠 Inspiration

Mental health remains one of the most pressing yet underserved aspects of human well-being. While text-based support exists, emotional states are often hidden in voice tone and facial expressions, cues that most AI apps miss. We wanted to build a multi-modal mental health companion that truly listens, sees, and understands, using cutting-edge AI to deliver empathetic support and actionable insights.


💡 What it does

MindMate AI is a multi-modal emotional wellness assistant that allows users to check in via text, audio, or video. It uses AI to analyze:

  • Mood (emotion classification)
  • Voice tone (prosody analysis)
  • Facial expressions (emotion detection)
  • Stress score and confidence level

Based on this, it generates:

  • Personalized mental health suggestions
  • Real-time alerts (e.g. "You sound highly stressed")
  • Visual dashboards showing emotional trends
  • A developer API to embed emotional intelligence into any wellness or productivity app
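
The real-time alert behavior described above can be sketched as a simple threshold check. This is only an illustration: the function name, message text, and threshold value are assumptions, not the production implementation.

```python
def stress_alert(stress_score, threshold=0.75):
    """Return an alert message when the stress score crosses a threshold.

    stress_score: float in [0, 1] produced by the analysis pipeline.
    threshold: illustrative cutoff; the real system may tune this per user.
    """
    if stress_score >= threshold:
        return "You sound highly stressed: consider taking a short breathing break."
    return None  # no alert needed
```

In practice, a check like this would run after each check-in is scored, so alerts surface immediately rather than waiting for the dashboard view.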

🛠 How we built it

  • Frontend: Next.js, TailwindCSS, React Webcam, React Audio Recorder
  • Backend: FastAPI (Python)
  • AI Integration:
    • OpenAI GPT-4o for text + tone-based emotion classification
    • Whisper for audio transcription
    • Parselmouth/Librosa for pitch/tempo/tone analysis
    • Face-api.js for facial emotion recognition
  • Storage: Bolt.dev storage for media files
  • Database: Supabase for check-in data and user sessions
  • APIs: REST endpoints for /checkin, /insights, /export-csv
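
To give a feel for the pitch-analysis step, here is a simplified, dependency-free autocorrelation pitch estimator. The actual pipeline uses Parselmouth/Librosa; this stand-in only demonstrates the underlying idea of finding the lag at which a voice signal best correlates with itself.

```python
import math

def estimate_pitch(samples, sample_rate, fmin=80.0, fmax=400.0):
    """Estimate fundamental frequency (Hz) via autocorrelation.

    A toy stand-in for the Parselmouth/Librosa pitch tracking used in the
    real pipeline. Searches lags corresponding to the fmin..fmax range and
    returns the frequency of the best-correlating lag.
    """
    lag_min = int(sample_rate / fmax)
    lag_max = int(sample_rate / fmin)
    best_lag, best_corr = lag_min, float("-inf")
    for lag in range(lag_min, lag_max + 1):
        # Correlate the signal with a lagged copy of itself.
        corr = sum(samples[i] * samples[i - lag] for i in range(lag, len(samples)))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag

# Synthetic 160 Hz tone sampled at 8 kHz, 0.25 s long.
sr = 8000
tone = [math.sin(2 * math.pi * 160 * t / sr) for t in range(sr // 4)]
```

On this clean synthetic tone the estimator recovers 160 Hz; real speech needs the more robust tracking (and tempo/intensity features) that Parselmouth and Librosa provide.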

๐Ÿง—โ€โ™‚๏ธ Challenges we ran into

  • Extracting reliable voice tone metrics (prosody) and mapping them to emotions
  • Combining multi-modal inputs (text + voice + facial data) into a unified mood score
  • Processing media in real time while keeping latency and server load low
  • Ensuring user privacy and ethical handling of sensitive emotional data
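
One way to combine the three modalities into a unified mood score is confidence-weighted averaging: modalities the models are more certain about contribute more. This weighting scheme is an assumption for illustration; the production fusion logic may differ.

```python
def fuse_mood_scores(modality_scores):
    """Fuse per-modality (score, confidence) pairs into one mood score.

    modality_scores: dict mapping modality name -> (score, confidence),
    both floats in [0, 1]. Returns the confidence-weighted mean, or None
    if no modality reported any confidence.
    """
    total_weight = sum(conf for _, conf in modality_scores.values())
    if total_weight == 0:
        return None  # nothing usable to fuse
    weighted = sum(score * conf for score, conf in modality_scores.values())
    return weighted / total_weight

# Example readings (values are illustrative):
scores = {
    "text":  (0.8, 0.9),   # GPT-4o emotion classification
    "voice": (0.6, 0.7),   # prosody analysis
    "face":  (0.7, 0.5),   # facial emotion detection
}
```

The upside of this design is graceful degradation: if a user only types text, the other modalities simply carry zero weight instead of breaking the score.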

๐Ÿ† Accomplishments that we're proud of

  • Seamlessly integrated audio and video input in a polished, production-ready interface
  • Built a unified emotional analysis pipeline from 3 different modalities
  • Delivered a functional developer API that could power other wellness apps
  • Created real-time alerts and personalized mental health suggestions based on AI insights

📚 What we learned

  • How to apply GPT-4o's new multi-modal capabilities effectively
  • Techniques for extracting meaningful emotional cues from raw audio and video
  • The importance of emotional nuance in mental health tools, and how hard it is to get right
  • Best practices in privacy-first mental health AI design

🔮 What's next for MindMate AI

  • 🧪 Test with real users, therapists, and wellness coaches
  • 🔐 Add end-to-end encryption and deeper privacy controls
  • 🌐 Launch API access for partner platforms (therapy apps, productivity tools)
  • 🧠 Fine-tune models for specific user states (e.g., burnout, panic, fatigue)
  • 📱 Build a mobile app for seamless, on-the-go check-ins

Built With

  • bolt