Inspiration
Watching friends struggle to access mental health support during tough times hit hard. Months-long wait lists, expensive sessions, and the loneliness of 3am anxiety spirals with no one to talk to. We realized AI could help—not as a replacement for therapy, but as a bridge. The problem? Most chatbots are emotionally tone-deaf. We wanted to build something that actually understands how you're feeling before it tries to help.
What it does
EmoWise is an AI therapy assistant that reads your emotions and responds with real empathy. Here's the flow:

1. You type how you're feeling.
2. Our ML model detects your emotion (sadness, anxiety, anger, joy, etc.) in real time.
3. That emotional context gets passed to LLaMA along with CBT therapy frameworks.
4. You get a personalized response that validates your feelings and offers evidence-based coping strategies.
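The flow above can be sketched in a few lines of Python. Everything here is illustrative: the keyword table stands in for the real ML classifier, and the function names are not the project's actual API.

```python
# Minimal sketch of the EmoWise flow: detect emotion -> build a
# context-enhanced prompt -> hand off to the language model.
# The keyword lookup is a placeholder for the real classifier.

def detect_emotion(text: str) -> str:
    """Stand-in for the ML emotion classifier: naive keyword lookup."""
    keywords = {
        "anxious": "anxiety", "worried": "anxiety",
        "sad": "sadness", "angry": "anger", "happy": "joy",
    }
    for word, label in keywords.items():
        if word in text.lower():
            return label
    return "neutral"

def build_prompt(user_text: str, emotion: str) -> str:
    """Fold the detected emotion and a CBT framing into the system prompt."""
    return (
        f"The user appears to be feeling {emotion}. "
        "Validate that feeling first, then offer one CBT-based coping step.\n"
        f"User: {user_text}"
    )

message = "I'm so worried about my exams"
prompt = build_prompt(message, detect_emotion(message))
# `prompt` would then be sent to LLaMA for response generation.
```

The key design point is that the emotion label is injected before generation, so the model responds to how the user feels, not just what they said.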
How we built it
- Emotion Detection: Fine-tuned DistilRoBERTa on the GoEmotions dataset to classify 7 core emotions with high accuracy.
- LLaMA Integration: Engineered therapeutic system prompts based on CBT/DBT principles that guide the AI to respond empathetically.
- Pipeline: Built a Python backend connecting emotion analysis → context-enhanced prompting → response generation.
- Crisis Safety: Implemented keyword + ML-based crisis detection with immediate resource provision.
- Frontend: React interface with real-time emotion visualization and conversation history.
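The keyword + ML crisis gate can be sketched like this. The phrase list, the source of the risk score, and the threshold are all illustrative assumptions, not the project's actual values:

```python
# Hypothetical crisis gate: a message is flagged if EITHER a hard keyword
# match fires OR the ML risk score crosses a threshold. Keywords act as a
# fail-safe the model can't miss; the score catches paraphrased signals.

CRISIS_PHRASES = ("hurt myself", "end it all", "no reason to live")

def is_crisis(text: str, model_risk_score: float, threshold: float = 0.8) -> bool:
    """Combine keyword matching with an ML risk score (placeholder logic)."""
    keyword_hit = any(phrase in text.lower() for phrase in CRISIS_PHRASES)
    return keyword_hit or model_risk_score >= threshold

def respond(text: str, model_risk_score: float) -> str:
    """Route to crisis resources before any normal chat reply."""
    if is_crisis(text, model_risk_score):
        return "CRISIS_RESOURCES"  # e.g. hotline numbers shown immediately
    return "NORMAL_FLOW"
```

Running both checks in an OR means a false negative requires both detectors to miss, which is the "can't miss high-risk signals" property the section describes.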
Challenges we ran into
- Teaching AI empathy: Getting LLaMA to validate emotions without being patronizing was tough. We iterated through dozens of prompt variations.
- Emotion nuances: Distinguishing between similar emotions (anxiety vs. fear, sadness vs. grief) required careful dataset selection and model tuning.
- Safety first: Balancing supportive responses with crisis detection was critical. We couldn't miss high-risk signals, but also couldn't over-trigger on normal sadness.
- Avoiding therapeutic harm: Making sure the AI doesn't reinforce negative thinking patterns or give bad advice took careful prompt engineering and safety guidelines.
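One common way to handle near-tie emotions like anxiety vs. fear is to check the margin between the top two classifier probabilities. This sketch assumes the classifier returns a score per label; the function name and threshold are hypothetical, not the project's tuning:

```python
# Illustrative ambiguity check: when the top two emotion scores are too
# close, treat the result as ambiguous rather than committing to one label.

def resolve_emotion(scores: dict[str, float], margin: float = 0.15) -> str:
    """Return the top emotion, or 'ambiguous' when the top two nearly tie."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    top, runner_up = ranked[0], ranked[1]
    if top[1] - runner_up[1] < margin:
        return "ambiguous"
    return top[0]

# Anxiety vs. fear scores within the margin get flagged instead of guessed.
resolve_emotion({"anxiety": 0.48, "fear": 0.44, "sadness": 0.08})
```

An ambiguous result could then trigger a gentler, more general validation rather than a response tailored to the wrong emotion.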
Accomplishments that we're proud of
- Built a working emotion detection model with 85%+ accuracy
- Created therapeutic prompts that generate genuinely empathetic responses
- Integrated crisis detection that could save lives
- Made something that actually feels like talking to someone who cares
- Bridged machine learning and psychology in a meaningful way

The biggest win? When we tested it ourselves during stressful moments, it actually helped. That's when we knew we had something real.
What we learned
- Technical: Prompt engineering is an art. Small changes in how you frame the system prompt dramatically affect response quality. Also, emotion classification is harder than sentiment analysis: context matters hugely.
- Human side: We dove deep into CBT frameworks, therapeutic communication, and crisis intervention protocols. Building for mental health isn't just about cool tech; it's about responsibility and ethics.
- The balance: AI can provide support, but we learned where to draw lines. It's a tool to complement human care, not replace it. That perspective shaped every design decision.
What's next for EmoWise
- Add voice tone analysis for multimodal emotion detection
- Build a mood-tracking dashboard to visualize emotional patterns over time
- Implement personalized intervention learning (what coping strategies work best for each user)