Inspiration
Our inspiration came from a pattern we all recognize but rarely solve: the 3 AM motivation spike. There are moments when everything feels clear and possible, when we decide to change our habits, pursue a goal, or become a better version of ourselves. In that moment, motivation feels powerful and permanent. But within days, that clarity weakens, routines take over, and we slip back into familiar habits. Some people are fortunate enough to have friends or family who reinforce their motivation daily. Others try to self-motivate, but consistency is difficult without reinforcement. We realized the problem is not ambition; it is sustainability. So we asked ourselves: what if everyone had a consistent, emotionally intelligent companion that showed up every day, remembered their goals, and supported them without judgment? That question became the foundation of our project.
What it does
Our project is an AI-powered emotional companion designed to sustain motivation through consistent, personalized reinforcement. Users can interact with it through a dashboard chat interface, a browser-based voice check-in, or even a real scheduled phone call. Regardless of the mode, the system connects to a shared emotional reasoning engine that understands the user’s goals, habits, emotional history, and long-term patterns. When a user checks in, the AI analyzes their current mood and context, suggests a small and realistic micro-action, and asks a completion question to reinforce accountability. It never shames or overwhelms. Instead, it focuses on emotional grounding and small behavioral momentum. Over time, the system builds a memory of the user’s emotional patterns and progress toward goals, making conversations feel continuous rather than repetitive. The app essentially acts as the supportive friend that many people do not consistently have, someone who remembers what you care about and gently nudges you forward.
How we built it
We designed MindMate as a full-stack AI emotional companion that blends conversational AI, voice technology, and behavioral tracking into one cohesive experience.
On the frontend, we built a responsive, mobile-first interface using HTML, CSS, JavaScript, and React (Vite). The dashboard allows users to track moods, log check-ins, view insights, and switch between standard chat mode and “Future Self” mode. We also integrated the Web Speech API for browser-based voice check-ins.
On the backend, we used Node.js and Express to create an /analyze endpoint that handles user input from chat, voice check-ins, and phone calls. This endpoint connects to the GPT-4 API, which analyzes emotional tone and returns structured JSON including mood, supportive responses, micro-actions, and reflection prompts.
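The core of that endpoint can be sketched as follows. This is a simplified reconstruction, not our exact code: the framework wiring (Express app setup, CORS, auth) is omitted, `callModel` stands in for the real GPT client call, and the exact system prompt wording is an assumption.

```javascript
// Build the chat messages sent to the model. The JSON shape in the system
// prompt mirrors the fields our frontend expects (assumed schema).
function buildMessages(userText) {
  return [
    {
      role: "system",
      content:
        "You are a supportive companion. Reply ONLY with JSON shaped as " +
        '{"mood": string, "response": string, "microAction": string, "reflection": string}.',
    },
    { role: "user", content: userText },
  ];
}

// Handler logic behind POST /analyze. `callModel` is injected so the same
// logic serves chat, browser voice check-ins, and phone-call transcripts.
async function handleAnalyze(userText, callModel) {
  const raw = await callModel(buildMessages(userText)); // raw string reply
  return JSON.parse(raw); // { mood, response, microAction, reflection }
}
```

Injecting `callModel` also makes the handler easy to test without hitting the real API.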
For live voice interaction, we integrated Twilio to manage phone calls and capture user speech. The transcript is sent to GPT-4 for analysis, and the AI's response is converted into natural-sounding audio using the ElevenLabs API before being played back to the caller.
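The Twilio side of this flow boils down to returning TwiML from our webhooks. The sketch below hand-builds the XML rather than using the Twilio helper library; the `<Gather>` and `<Play>` verbs are real TwiML, but the webhook path and audio URL are placeholders.

```javascript
// TwiML asking Twilio to transcribe the caller's speech and POST the
// transcript back to our server (the action URL is a placeholder).
function gatherSpeechTwiML(promptText) {
  return `<?xml version="1.0" encoding="UTF-8"?>
<Response>
  <Gather input="speech" action="/voice/respond" method="POST">
    <Say>${promptText}</Say>
  </Gather>
</Response>`;
}

// After ElevenLabs synthesizes the AI's reply, we point Twilio at the
// generated audio file so it plays back to the caller.
function playReplyTwiML(audioUrl) {
  return `<?xml version="1.0" encoding="UTF-8"?>
<Response>
  <Play>${audioUrl}</Play>
</Response>`;
}
```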
User data — including profiles, check-ins, and emotional summaries — is stored in Firebase Firestore, enabling real-time updates and scalable data management.
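The check-in documents we store look roughly like this. The field names are our assumed schema, and the Firestore write itself (via firebase-admin) is shown only in the trailing comment.

```javascript
// Shape one check-in record before persisting it to Firestore.
function buildCheckInDoc(userId, analysis, now = new Date()) {
  return {
    userId,
    mood: analysis.mood,
    microAction: analysis.microAction,
    reflection: analysis.reflection,
    createdAt: now.toISOString(), // a server timestamp would also work here
  };
}
// Usage (not run here):
// await db.collection("checkins").add(buildCheckInDoc(uid, analysis));
```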
By combining conversational AI, voice technology, and structured emotional insights, we created an experience that feels supportive, reflective, and growth-oriented rather than just another chatbot.
Challenges we ran into
One of our biggest challenges was integrating multiple APIs into a seamless experience. Coordinating GPT-4, Twilio, ElevenLabs, and Firebase required careful handling of asynchronous requests, structured responses, and error management. Ensuring that GPT returned consistent, structured JSON (mood, micro-actions, reflection prompts) was especially tricky and required prompt refinement and response validation.
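The validation step can be sketched like this: parse the raw reply, check the required fields, and fall back to a safe default when the model drifts off-format. The fallback wording and the markdown-fence stripping are illustrative assumptions, not our exact production logic.

```javascript
const REQUIRED = ["mood", "response", "microAction", "reflection"];

// Validate the model's raw reply; never let a malformed response crash the UX.
function parseAnalysis(raw) {
  try {
    // Models sometimes wrap JSON in markdown fences; strip them first.
    const cleaned = raw.replace(/^```(?:json)?\s*/i, "").replace(/\s*```$/, "");
    const obj = JSON.parse(cleaned);
    if (REQUIRED.every((k) => typeof obj[k] === "string")) return obj;
  } catch (_) {
    // fall through to the safe default below
  }
  return {
    mood: "unknown",
    response: "Thanks for checking in. Tell me a bit more?",
    microAction: "Take three slow breaths.",
    reflection: "What felt hardest today?",
  };
}
```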
Connecting the frontend and backend was another major hurdle. We had to make sure the React frontend correctly communicated with our Node.js/Express server through the /analyze endpoint, while managing loading states, handling errors gracefully, and preserving UI responsiveness. Debugging CORS issues, environment variables, and API authentication tokens also took time during integration.
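A helper along these lines is what our React components would call. This is a hedged sketch: the base URL is a placeholder, and `fetchFn` is injectable so the error path can be exercised without a running server.

```javascript
// POST a check-in to the backend and normalize success/failure into one shape,
// so components can render a friendly message instead of crashing or hanging.
async function analyzeCheckIn(text, { baseUrl = "", fetchFn = fetch } = {}) {
  try {
    const res = await fetchFn(`${baseUrl}/analyze`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ text }),
    });
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return { ok: true, data: await res.json() };
  } catch (err) {
    return { ok: false, error: String(err.message || err) };
  }
}
```

Returning `{ ok, data | error }` instead of throwing keeps loading-state handling in the components simple.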
The voice call flow added another layer of complexity. We needed Twilio to capture speech, pass it to the backend, send it to GPT for analysis, convert the AI's response into audio using ElevenLabs, and then stream it back to the user, all within seconds to keep the experience natural.
Despite these challenges, solving them helped us better understand full-stack architecture, API orchestration, and how to build a reliable AI-powered user experience under hackathon time constraints.
Accomplishments that we're proud of
We’re most proud of bringing our original idea to life in such a short time. MindMate started as a concept for an AI emotional companion that goes beyond simple chat — and we successfully implemented most of the core features we envisioned, including mood tracking, structured AI responses, future-self mode, and a responsive dashboard.
One of our biggest accomplishments was making the phone call feature work. Integrating Twilio for live voice calls, connecting it to GPT-4 for emotional analysis, and using ElevenLabs to convert responses back into natural-sounding speech was a complex pipeline, and seeing it function end-to-end was a huge milestone for us.
What we learned
Throughout building MindMate, we learned how complex it is to orchestrate multiple APIs into one seamless user experience. Integrating GPT-4, Twilio, ElevenLabs, and Firebase taught us the importance of handling asynchronous workflows, managing structured AI outputs, and designing systems that are both scalable and resilient.
We gained hands-on experience connecting a React frontend to a Node.js/Express backend, debugging CORS issues, securing environment variables, and ensuring smooth communication between components. Building the phone call feature especially deepened our understanding of real-time voice processing pipelines — from capturing speech to generating AI responses and converting them back into audio.
Beyond technical skills, we also learned how critical UX design is when building emotional-support tools. Tone, responsiveness, and clarity matter just as much as functionality. Hackathon time constraints pushed us to prioritize, iterate quickly, and collaborate efficiently as a team.
Most importantly, we learned how to transform an ambitious idea into a working full-stack AI product under pressure — and that building meaningful technology requires both strong engineering and thoughtful human-centered design.
What's next for MindMate
Our next step is turning MindMate from a powerful prototype into a fully polished, scalable product. We plan to strengthen backend stability, improve real-time data syncing, and refine the emotional analysis to become more personalized over time.
We want to implement full user authentication, persistent chat history, and deeper long-term memory so MindMate can better understand each user’s growth journey. Expanding the insights dashboard with richer analytics and trend visualization is also a priority.
On the voice side, we aim to improve real-time call responsiveness and explore proactive check-in calls based on mood patterns and user behavior. We also plan to introduce reminders and smart nudges to help users stay consistent with their goals and habits.
Long term, we envision MindMate becoming a daily growth companion — available across web and mobile — that combines emotional intelligence, habit science, and AI into a supportive, always-accessible experience.
Built With
- css
- elevenlabs
- express.js
- firebase
- firestore
- gemini
- gpt-4
- html
- javascript
- node.js
- react
- twilio