Inspiration

The inspiration for this project came from a simple but powerful question: What if everyone had a non-judgmental, medically informed companion available 24/7? We were heavily inspired by the character Baymax from Big Hero 6. In a world where mental health resources are often expensive, waitlisted, or stigmatized, we wanted to build a "digital first responder." We realized that during a panic attack or a moment of loneliness, people don't always need a doctor immediately; they need presence. They need something that says, "I am here," and helps them regulate their physiology before things spiral out of control.

How We Built It

We built Baymax as a responsive web application using a modern tech stack designed for speed and visual fluidity.

The Tech Stack

- Core: React (TypeScript) + Vite
- Styling: Tailwind CSS for the clean, clinical-yet-warm aesthetic.
- Animation: Framer Motion was critical. We didn't want a static chatbot, so we built a complex animation system where the avatar's "breathing," color, and body language (head tilts, arm rotation) react dynamically to the user's emotional state.

- Intelligence: Google Gemini API. We prompt-engineered Gemini not just to answer text, but to output emotional metadata. When the AI detects "sadness," it doesn't just write a comforting message; it signals the frontend to change the avatar's aura to a warm green and lean in.
- Voice: Web Speech API for hands-free accessibility.
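The metadata hand-off can be sketched roughly as follows. This is an illustrative parser, not the production schema: the `<meta>` delimiter, field names, and emotion labels here are assumptions about how a model reply might carry structured signals alongside its text.

```typescript
// Hypothetical shape of the emotional metadata the prompt asks the model
// to append to each reply (field names are illustrative).
interface EmotionMeta {
  emotion: "sad" | "anxious" | "curious" | "neutral";
  auraColor: string; // hex color the frontend applies to the avatar's aura
  lean: "in" | "back" | "none";
}

// Extract a JSON metadata block delimited by <meta>...</meta> from the reply.
function parseEmotionMeta(reply: string): EmotionMeta | null {
  const match = reply.match(/<meta>([\s\S]*?)<\/meta>/);
  if (!match) return null;
  try {
    return JSON.parse(match[1]) as EmotionMeta;
  } catch {
    return null; // malformed metadata: fall back to a neutral avatar
  }
}

const reply =
  'I\'m here with you.\n<meta>{"emotion":"sad","auraColor":"#34d399","lean":"in"}</meta>';
const meta = parseEmotionMeta(reply);
// When meta.emotion is "sad", the frontend shifts the aura toward warm green
// and plays the lean-in animation, independent of the message text.
```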

The "Panic Mode" Algorithm

The core feature is the Dynamic Breathing Protocol. We implemented a mathematical pacing system to guide users out of hyperventilation. A dynamic variable sets the duration of one breath phase, and the application first matches the user's panicked pace, then slows each cycle:

- Cycle 0: fast phases, matching high anxiety
- Cycle 1: slowing down
- Cycle 2+: the target "Box Breathing" state

This creates a physiological "entrainment" effect, leading the user to calmness rather than just telling them to "calm down."
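The pacing curve above can be sketched as a small function. The endpoints (2 s starting phases, 4 s box-breathing target) come from the synchronization discussion later in this writeup; the linear ramp between them is an assumption for illustration, not the exact production formula.

```typescript
// Sketch of the Dynamic Breathing Protocol pacing curve.
// Assumed constants: phases start at 2 s (cycle 0, high anxiety) and
// settle at the 4 s box-breathing target by cycle 2.
const START_S = 2;
const TARGET_S = 4;

// Duration (in seconds) of one breath phase during cycle n.
function phaseDuration(cycle: number): number {
  return Math.min(START_S + cycle, TARGET_S);
}

// phaseDuration(0) → 2, phaseDuration(1) → 3, phaseDuration(2) → 4,
// and every later cycle stays at the 4 s target.
```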

🚧 Challenges We Faced

The "Uncanny Valley" of Empathy: Early versions felt robotic. If the AI said "I understand" but looked static, it felt fake. We overcame this by mapping specific emotional states (Concerned, Supportive, Curious) to specific color palettes and animation spring physics: a "Concerned" state has faster, sharper movements (red/pink), while a "Supportive" state has slow, fluid damping (green/teal).

Animation Synchronization: Syncing the React state (text prompts) with the CSS/Framer Motion animations (visual expansion) was difficult. We had to refactor our useEffect hooks to ensure that when the text said "Inhale," the visual ring was actually expanding, even as the duration of the breath changed dynamically from 2 seconds to 4 seconds.
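The state-to-physics mapping can be sketched as a lookup table. The hex colors and spring numbers below are representative values chosen to show the contrast described above, not the shipped constants.

```typescript
// Illustrative mapping from emotional state to aura palette plus
// Framer Motion-style spring parameters (values are representative).
type Mood = "Concerned" | "Supportive" | "Curious";

interface MoodStyle {
  palette: string[]; // aura colors applied to the avatar
  stiffness: number; // higher = faster, sharper movement
  damping: number;   // higher = slower, more fluid settling
}

const MOOD_STYLES: Record<Mood, MoodStyle> = {
  // Concerned: fast, sharp motion in reds/pinks.
  Concerned:  { palette: ["#f87171", "#f472b6"], stiffness: 300, damping: 12 },
  // Supportive: slow, heavily damped motion in greens/teals.
  Supportive: { palette: ["#34d399", "#2dd4bf"], stiffness: 80,  damping: 28 },
  // Curious: somewhere in between.
  Curious:    { palette: ["#60a5fa", "#a78bfa"], stiffness: 160, damping: 18 },
};
```

Keeping this in one table means the text response and the avatar's motion can never disagree: both read from the same mood key.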

Accessibility in Crisis:

We realized that a user having a panic attack might not be able to type. We added the "Wind Icon" (Panic Button) as a single-tap solution that bypasses the chat and immediately launches the breathing intervention.

🧠 What We Learned

- UX for Crisis: Designing for high-stress situations requires removing friction. Big buttons, soothing colors, and minimal text are vital.
- Prompt Engineering as UI: We learned that we can use the LLM to control the interface, not just the text. By asking Gemini to categorize the user's intent (e.g., interaction mode: 'breathing exercise'), we turned a text generator into a state machine controller.
- The Power of "Presence": Even a simple animated abstract shape can feel "alive" if it reacts instantly to your voice.
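The "text generator as state machine controller" idea can be sketched in a few lines. The mode names here are assumptions for illustration; the key point is that the app transitions on the model's categorized label, never on raw free text.

```typescript
// Interaction modes the app can be in (labels are illustrative).
type InteractionMode = "chat" | "breathing_exercise" | "crisis_resources";

const KNOWN_MODES: InteractionMode[] = [
  "chat",
  "breathing_exercise",
  "crisis_resources",
];

// Validate the intent label the LLM returned; anything unrecognized
// falls back to plain chat instead of breaking the UI.
function toMode(label: string): InteractionMode {
  return (KNOWN_MODES as string[]).includes(label)
    ? (label as InteractionMode)
    : "chat";
}

// The UI then switches screens on toMode(label), so the model drives
// the interface through a small, validated vocabulary.
```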

What's Next?

We plan to integrate biometric feedback (via smartwatches) to automatically trigger the "Concerned" state when a user's heart rate spikes, making Baymax truly proactive rather than reactive.

"I am satisfied with my care." The ultimate goal for every interaction.
