India has fewer than 1 psychiatrist per 100,000 people, and mental health stigma means most people never seek help. We realized the gap isn't just access to professionals — it's that people lack the vocabulary to even name what they're feeling. We wanted to build something that meets people where they are: a non-judgmental space to explore emotions, learn evidence-based coping skills, and know exactly when and where to reach out for professional help. Not therapy — literacy.
MindMap is an AI-powered mental health literacy companion. Users have a natural conversation that helps them articulate and understand their emotions. It offers interactive coping tools — box breathing and 5-4-3-2-1 grounding exercises — directly in the interface. A dual-layer crisis detection system (client-side regex + server-side check) instantly surfaces Indian crisis helplines (iCall, Vandrevala, AASRA, NIMHANS) when distress signals are detected. It stores nothing — no accounts, no chat history, no analytics — conversations exist only in your browser tab.
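The client-side layer of that dual-layer check can be sketched as a small regex scan that runs on every message before it ever reaches the server. This is a minimal illustration, not MindMap's actual pattern list, which we'd expect to be far more extensive and carefully tuned:

```typescript
// Client-side layer of the dual-layer crisis check: a fast regex pass that can
// surface helplines immediately, with no server round-trip. The patterns below
// are illustrative examples only, not the app's real list.
const CRISIS_PATTERNS: RegExp[] = [
  /\b(?:suicid\w*|kill(?:ing)? myself|end(?:ing)? (?:my|it) (?:all|life))\b/i,
  /\b(?:self[- ]harm|hurt(?:ing)? myself|don'?t want to (?:live|be here))\b/i,
];

function detectCrisis(message: string): boolean {
  return CRISIS_PATTERNS.some((pattern) => pattern.test(message));
}
```

Because this runs in the browser, the helpline alert can render the moment a match is found, while the server-side check independently confirms in parallel.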
We built MindMap with Next.js 16 (App Router), React 19, TypeScript, and Tailwind CSS v4. The backend is a single streaming API route that connects to OpenAI's GPT-4o with a carefully crafted system prompt that enforces hard safety boundaries — no diagnoses, no medication advice, culturally sensitive language. The chat uses server-sent events for real-time token streaming. Crisis detection runs in parallel on both client and server so the alert fires instantly, even before the model responds. The whole thing is containerized with Docker (standalone Next.js output) and reverse-proxied through nginx.
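The SSE mechanics described above boil down to emitting `data:` frames terminated by a blank line as tokens arrive. A sketch of that framing, assuming a generic async token source rather than MindMap's actual route code:

```typescript
// Server-sent events deliver each model token as a "data:" frame followed by a
// blank line. sseFrame encodes one such frame.
function sseFrame(payload: unknown): string {
  return `data: ${JSON.stringify(payload)}\n\n`;
}

// Wrap any async token source into a web ReadableStream suitable for a
// streaming route response (the real route would pipe OpenAI deltas through).
function tokensToSSE(tokens: AsyncIterable<string>): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    async start(controller) {
      for await (const token of tokens) {
        controller.enqueue(encoder.encode(sseFrame({ token })));
      }
      controller.enqueue(encoder.encode("data: [DONE]\n\n"));
      controller.close();
    },
  });
}
```

Returning this stream from a route handler with a `Content-Type: text/event-stream` header is what lets the browser render tokens as they arrive.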
Getting the safety boundaries right was the hardest part. We had to ensure the model never crosses into diagnosis or treatment territory while still being genuinely helpful. Balancing the crisis detection sensitivity was tricky — too aggressive and every mention of sadness triggers an alert, too loose and real distress slips through. On the technical side, streaming SSE through Docker + nginx required careful proxy configuration (disabling buffering, upgrading connections), and we hit build-time failures because the OpenAI client was being instantiated at module scope during Next.js page collection.
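The nginx side of the SSE fix typically amounts to a location block like the following. This is a hedged sketch, with the upstream name, port, and path assumed rather than taken from our actual config:

```nginx
# SSE-friendly proxy block (upstream name/port and route path are assumptions).
location /api/chat {
    proxy_pass http://app:3000;
    proxy_http_version 1.1;
    proxy_set_header Connection '';   # keep the upstream connection open
    proxy_buffering off;              # flush each token frame immediately
    proxy_cache off;
    proxy_read_timeout 3600s;         # allow long-lived streams
}
```

Without `proxy_buffering off`, nginx holds the response until its buffer fills, which turns a token stream into one delayed burst.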
We're proudest of the dual-layer crisis detection: regex on the client fires the alert immediately while the server confirms it in parallel, so there's no delay in showing helpline numbers when someone is in distress. We're also proud that the ethics document isn't an afterthought: it drove every design decision, from "no accounts" to "no streaks or notifications" to the disclaimer that's always visible. The app does one thing and does it responsibly.
Building AI products for mental health forced us to think about harm before features. We learned that what you prevent the model from doing matters more than what you let it do. We also learned the practical challenges of deploying Next.js in containers — standalone output, environment variable handling at build vs. runtime, and SSE streaming through reverse proxies. And we gained a deeper understanding of India's mental health landscape and the cultural nuances that matter in this space.
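The build-vs-runtime lesson above has a concrete shape: construct the API client lazily inside the request path, so importing the route module during Next.js's build-time page collection never reads the key. A minimal sketch of the pattern, with `FakeClient` standing in for the real OpenAI client so the example is self-contained:

```typescript
// FakeClient is a stand-in for the real OpenAI client; the pattern is the point.
class FakeClient {
  constructor(public apiKey: string | undefined) {}
}

let client: FakeClient | null = null;

// Called from inside the route handler, so process.env is read at request
// time, not at module scope during build-time page collection.
function getClient(): FakeClient {
  if (client === null) {
    client = new FakeClient(process.env.OPENAI_API_KEY);
  }
  return client;
}
```

Creating the client at module scope instead (`const client = new OpenAI(...)` at the top of the file) is what caused our build failures: the build environment has no API key, and the constructor ran before any request existed.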
Multi-language support — crisis detection and conversations in Hindi and other regional languages, since the people who need this most may not be comfortable in English. Adding more interactive coping tools like progressive muscle relaxation and cognitive reframing exercises. Exploring on-device/local models to eliminate any data leaving the user's device entirely. And partnering with mental health organizations in India to validate the approach and ensure we're genuinely helping, not just building tech.
Built With
- ai
- api
- azurevm
- docker
- express.js
- nginx