Inspiration
Mental health care is a critical need, yet therapists often lack tools to capture emotional cues like facial expressions and tone of voice. We wanted to create something that enhances therapy without replacing it. Inspired by the idea of blending emotional intelligence with AI, we built Baymax, a smart assistant that empowers both therapists and patients.
What it does
Baymax is a two-sided platform.
- For therapists, it offers a dashboard to manage patients, track emotional trends, and record personalized notes.
- For patients, it provides an empathetic AI-driven chatbot that adapts to their emotional state, detected through facial expressions and voice tone.
The goal is to enhance the human connection between therapist and patient with supportive technology.
How we built it
- Frontend: React and TailwindCSS for responsive, user-friendly interfaces.
- Backend: Node.js with Express.js to manage authentication, conversations, and therapist-patient data securely.
- AI and API Integration:
  - Real-time facial emotion recognition using external APIs
  - Voice tone analysis
  - A large language model API to power empathetic chatbot responses
- Hosting: [Insert hosting platform, for example Railway, GoDaddy, or Netlify]
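As a sketch of how the pieces above can fit together, the detected emotion label can be folded into the prompt sent to the LLM API. This is a minimal illustration, not our exact production code: `buildEmpathicPrompt` and the tone hints are hypothetical names, and the real system combines facial and voice signals before calling the model.

```javascript
// Sketch: steering the chatbot's tone with a detected emotion label.
// `buildEmpathicPrompt` is a hypothetical helper; the string it returns
// would be sent as the prompt to the LLM API.
function buildEmpathicPrompt(emotion, userMessage) {
  const toneHints = {
    sad: "Respond gently and validate the user's feelings.",
    angry: "Stay calm, acknowledge frustration, and avoid arguing.",
    happy: "Match the upbeat tone while staying supportive.",
    neutral: "Respond warmly and ask open questions.",
  };
  // Fall back to a neutral hint for emotions we don't recognize.
  const hint = toneHints[emotion] || toneHints.neutral;
  return [
    "You are an empathetic mental-health support assistant.",
    `Detected emotional state: ${emotion}. ${hint}`,
    `User: ${userMessage}`,
  ].join("\n");
}
```

Keeping this mapping in one pure function makes it easy to tweak tone guidance without touching the API-calling code.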
Challenges we ran into
- One major challenge was getting MongoDB integrated in time. We successfully set up the cluster, created collections, and loaded documents, but when we merged the team's code we hit a "mongo variable not found" error at runtime.
- Integrating emotion detection APIs into a real-time app was complex, especially syncing video capture with emotion classification.
- Designing smooth workflows for both therapists and patients without making the experience feel cluttered.
- Managing API rate limits while keeping response times fast enough for natural conversation.
- Prioritizing features under time pressure during a hackathon environment.
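A common cause of errors like the "mongo variable not found" one above is each merged module creating (or assuming) its own database client instead of sharing a single connection. One fix is a lazily initialized shared client. The sketch below shows the pattern generically; `createClient` stands in for the real `MongoClient.connect(uri)` call, and the names are assumptions for illustration.

```javascript
// Sketch: a shared, lazily-initialized client. Every module that imports
// the returned getter reuses one connection instead of creating its own.
// `createClient` is a stand-in for the real connect call (e.g. MongoDB's).
function makeSharedClient(createClient) {
  let clientPromise = null; // cached across all callers
  return function getClient() {
    if (!clientPromise) {
      clientPromise = createClient(); // connect exactly once
    }
    return clientPromise;
  };
}
```

In the real app, this helper would live in one module that exports `getClient`, so route handlers never touch an undefined `mongo` variable.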
Accomplishments that we're proud of
- Built a full-stack platform that detects and responds to user emotions in real time.
- Designed a tool that feels meaningful for both therapists and patients rather than just being technology for its own sake.
- Effectively integrated facial and voice emotion data into a conversational AI system.
- Worked efficiently as a team to deliver a functioning and polished demo within the deadline.
What we learned
- Deep integration between frontend and backend is crucial when building real-time emotion-based applications.
- Good user experience design is essential, especially when working in sensitive areas like mental health.
- AI tools become more empathetic when they are grounded in real human emotions rather than just text inputs.
- Always plan for fallback paths, like being able to demo core features even if one service (such as MongoDB) hits issues.
What's next for Baymax!
- MongoDB Full Integration: We'll properly rewire database access after the hackathon to store conversations, mood histories, therapy notes, and more.
- Therapist Analytics: Adding deeper insights such as emotion trend graphs and patient progress over time.
- Live Session Support: Enabling therapists to track emotions during live virtual therapy sessions.
- Expanded Emotion Detection: Capturing a wider range of subtle human emotions.
- Security and Compliance: Moving toward HIPAA compliance for use in real-world clinical settings.
- Mobile Version: Developing a mobile-friendly app to expand accessibility for therapists and patients.