Inspiration
We were inspired by the growing need for accessible mental health resources and by how AI can provide real-time, personalized support. As Stanford students immersed in a stressful college environment, we realized that many people, young and old, face barriers to traditional therapy, such as time, cost, and availability, and we wanted to create a solution that addresses these issues. By combining AI-driven emotional analysis with schedule integration, we saw an opportunity to build a tool that offers meaningful, timely support in a flexible, tech-driven world.
What it does
Aurora is an AI-powered therapist that listens to your voice and analyzes your facial expressions to understand your emotional state. It adjusts its responses based on your mood and volume, offering a personalized therapy experience. Aurora also integrates with your schedule through the Google Calendar API, providing therapy sessions tailored to your day-to-day activities. It ensures your mental well-being is always prioritized, even when you're short on time.
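As an illustration of the calendar integration, here is a minimal sketch (our own helper names, not the Google Calendar API's types) of how a day's events could be condensed into a context string the therapist prompt can reference:

```typescript
// Hypothetical helper: summarize the remaining Google Calendar events for the
// day so the session can reference the user's actual schedule.
interface CalendarEvent {
  summary: string;
  start: Date;
  end: Date;
}

function buildScheduleContext(events: CalendarEvent[], now: Date): string {
  // Keep only events that have not finished yet, in chronological order.
  const upcoming = events
    .filter((e) => e.end > now)
    .sort((a, b) => a.start.getTime() - b.start.getTime());
  if (upcoming.length === 0) {
    return "The user has no more events today.";
  }
  const lines = upcoming.map(
    (e) => `- ${e.summary} at ${e.start.toISOString().slice(11, 16)} UTC`
  );
  return `The user's remaining schedule today:\n${lines.join("\n")}`;
}
```

In practice the events would come from the Calendar API's event listing, and the resulting string would be folded into the prompt sent to the language model.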
How we built it
Aurora was built as a web app using TypeScript, React, and Tailwind CSS to create an adaptive, polished user experience. Hume AI powers the emotional analysis, allowing us to capture nuanced emotional cues from users' voices and facial expressions. Deepgram handles speech transcription, while the Google Calendar API integrates scheduling information. We use Firebase to store user data and Gemini 1.5 Flash for generating thoughtful, personalized responses. Our Flask servers manage mic status, voice decibel tracking, and audio playback to make the interaction feel seamless and responsive.
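To make the data flow concrete, here is a hedged sketch of the per-turn shape each service contributes before a response is generated. All type and function names are illustrative, not SDK types:

```typescript
// Illustrative per-turn data: transcript from Deepgram, emotion scores from
// Hume, and the decibel reading tracked by our Flask server.
interface EmotionScores {
  [emotion: string]: number;
}

interface Turn {
  transcript: string;      // Deepgram speech-to-text output
  emotions: EmotionScores; // Hume voice/face analysis
  volumeDb: number;        // mic decibel tracking
}

// Combine the signals into a single prompt for the response model.
function buildPrompt(turn: Turn): string {
  const top = Object.entries(turn.emotions)
    .sort(([, a], [, b]) => b - a)
    .slice(0, 3) // keep only the three strongest emotions
    .map(([name, score]) => `${name} (${score.toFixed(2)})`)
    .join(", ");
  return [
    `User said: "${turn.transcript}"`,
    `Dominant emotions: ${top || "none detected"}`,
    `Speaking volume: ${turn.volumeDb.toFixed(0)} dB`,
    "Respond as a supportive therapist, matching the user's mood.",
  ].join("\n");
}
```

The real pipeline streams these signals over WebSockets rather than assembling them synchronously, but the merge step looks conceptually like this.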
Challenges we ran into
One of the toughest challenges we faced was integrating the emotional and semantic analysis models from Hume with Deepgram's speech-to-text model. Synchronizing these models in real time was critical for ensuring that Aurora's responses were both emotionally intelligent and contextually accurate. Since we relied heavily on multiple API-based models via WebSockets, it was crucial to maintain seamless synchronization between them, especially when one model didn't return optimal results. Robust error handling became a key focus to ensure that the user experience wasn't compromised.
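The graceful-degradation idea can be sketched minimally (our own helper, not part of any SDK): when the emotion model returns nothing usable for a turn, substitute a neutral baseline rather than dropping the turn and stalling the session.

```typescript
// Assumed baseline used when the emotion model fails or returns nothing.
const NEUTRAL_BASELINE: Record<string, number> = { calm: 0.5 };

// If a turn's emotion result is missing or empty, fall back to the baseline
// so downstream prompt-building always has something to work with.
function withEmotionFallback(
  result: Record<string, number> | null | undefined
): Record<string, number> {
  if (!result || Object.keys(result).length === 0) {
    return NEUTRAL_BASELINE;
  }
  return result;
}
```

The same pattern applies to any of the API-based models in the chain: a timed-out or empty result is replaced with a safe default instead of propagating a failure to the UI.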
We also faced hurdles with the specific requirements of certain text-to-speech APIs. These APIs often had rigid input specifications that clashed with the outputs from our emotional and semantic models, leading to some tricky data flow issues. Finally, our desire to use streaming versions of the models for optimized performance posed additional difficulties. Streaming added complexity when transferring data between models, the backend, the database, and the UI, leading to bottlenecks that we had to troubleshoot to ensure smooth, real-time interactions.
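As one example of reshaping model output for a rigid input spec, here is a sketch of sanitizing and chunking generated text before sending it to a text-to-speech endpoint. The character limit and stripped characters are assumptions for illustration, not any specific API's actual rules:

```typescript
// Clean model output and split it into chunks a strict (hypothetical) TTS
// endpoint will accept: no markup characters, normalized whitespace, and
// no chunk longer than maxChars.
function prepareForTts(text: string, maxChars = 300): string[] {
  const clean = text.replace(/[*_#`]/g, "").replace(/\s+/g, " ").trim();
  const chunks: string[] = [];
  let rest = clean;
  while (rest.length > maxChars) {
    // Prefer to split at a word boundary within the limit.
    let cut = rest.lastIndexOf(" ", maxChars);
    if (cut <= 0) cut = maxChars; // no space found: hard split
    chunks.push(rest.slice(0, cut).trim());
    rest = rest.slice(cut).trim();
  }
  if (rest.length > 0) chunks.push(rest);
  return chunks;
}
```

Chunking like this also helps with streaming playback: shorter segments can be synthesized and played while later ones are still being generated.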
Accomplishments that we're proud of
We’re proud of successfully integrating a wide range of complex systems—from emotional analysis to real-time mic monitoring—and making them work together in a seamless user experience. We set out to tackle a highly ambitious task, knowing that the number of models, the amount of synchronization, and the level of communication required between various processes would be a significant technical challenge. On top of that, we aimed to optimize performance using streaming models for faster, real-time inference, which added even more complexity to the project.
Despite these challenges, we managed to bring all of these elements together into a coherent and fully functional product. We’re proud not only of the technical achievements but also of the impact our solution could have in making mental health support more accessible. The fact that we could take on such a difficult task and deliver a polished, responsive web app that feels intuitive and natural is something we’re excited to showcase.
What we learned
Through Aurora’s development, we learned the incredible potential that arises when cutting-edge AI meets deep human empathy. Working with tools like Hume AI and Deepgram taught us how to harness real-time emotional data to craft experiences that resonate on a personal level. We also gained invaluable insights into designing systems that are not only technically robust but also user-centric, ensuring that every interaction feels fluid, meaningful, and intuitive. Most importantly, we learned the value of balancing technical innovation with emotional intelligence, a combination that can turn technology into a source of comfort and support.
What's next for Aurora
In the near future, we plan to introduce new features that enhance both functionality and user experience. One of our top priorities is to add journaling capabilities, allowing users to log their thoughts and emotions after each session, creating a personal mental health record they can revisit. We're also looking into building in-session note-taking, where Aurora can automatically summarize key insights and actions from each therapy session, providing users with clear takeaways. Another exciting feature is push notifications for personalized check-ins—Aurora will remind users to take time for their mental well-being, especially during stressful periods, by analyzing their calendar. Additionally, we plan to implement a multi-language feature to make Aurora accessible to a wider, global audience. And, to make sessions even more engaging, we’ll explore the integration of guided meditations and breathing exercises directly within the app.
Built With
- deepgram
- firebase
- flask
- gemini
- google-calendar
- hume
- react
- tailwind
- typescript