MindBridge - Devpost Submission Answers


Inspiration

People who are struggling are already in your group chats. They just go quiet. We kept asking why there was no mechanism to notice that, and MindBridge came out of that question. We were also inspired by Dario Amodei's essay Machines of Loving Grace. If AI can compress decades of mental health progress, one answer is simply catching the moments humans miss and getting the right people talking to each other.


What it does

MindBridge sits in any Discord server and privately messages members who seem to be struggling. No public flag, no awkwardness in the group. In that private conversation they can vent to an AI that actually listens, get connected with a peer from their own community who volunteered to help, or get helpline numbers. If someone says something that signals a crisis, the bot immediately shares emergency resources and stops the conversation. There is also a weekly mood check where members anonymously rate their week and the group gets back only the average, so communities can look out for each other collectively.
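The anonymous mood check can be sketched in a few lines. This is a minimal illustration, not the actual MindBridge code; the `MoodCheck` class and its method names are our own, and it assumes ratings are 1-10 and everything lives in memory:

```python
# Minimal in-memory sketch of the anonymous weekly mood check.
# Class and method names are illustrative assumptions, not MindBridge's API.

class MoodCheck:
    def __init__(self):
        self._responded: set[int] = set()  # user IDs, kept only to block double votes
        self._ratings: list[int] = []      # ratings stored separately, unlinked to users

    def submit(self, user_id: int, rating: int) -> bool:
        """Record one 1-10 rating per member; reject repeats and out-of-range values."""
        if user_id in self._responded or not 1 <= rating <= 10:
            return False
        self._responded.add(user_id)
        self._ratings.append(rating)  # no mapping back to user_id is ever kept
        return True

    def summary(self) -> str:
        """The only thing shared back with the group: the average."""
        if not self._ratings:
            return "No responses this week."
        avg = sum(self._ratings) / len(self._ratings)
        return f"This week's average mood: {avg:.1f}/10 ({len(self._ratings)} responses)"
```

Keeping the who-responded set and the ratings list as separate structures means the bot can dedupe votes without ever being able to tie a rating back to a member.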


How we built it

We used Python and discord.py for the bot itself. The Anthropic Claude API does two things: it scores every server message for distress in the background, and it powers the full conversation when someone wants to vent. We wrote a detailed system prompt to keep the bot warm and helpful without ever crossing into territory it shouldn't. Everything runs in memory with no database, so no personal data is ever persisted.


Challenges we ran into

The biggest one was getting the distress detection sensitive enough to catch real signals without constantly messaging people who were just venting about an assignment. Then writing a system prompt that felt like talking to a real person and not a wellness app. Then building the peer connector in a way where neither person feels put on the spot. Basically the whole challenge was making something that feels human without overclaiming what it can do.


Accomplishments that we're proud of

Honestly, the peer supporter connector is what we are most proud of. The whole point is that AI should find the human, not replace them. We are also proud of how we handled crisis escalation: the bot just stops and hands off to real resources immediately. And we built all of this in under an hour, which felt pretty good.


What we learned

The hardest thing in mental health tech is knowing when to stop. Every feature had a version that went too far and felt invasive. The real work was restraint: figuring out the minimum the bot needs to do to get two people talking, and then getting out of the way.


What's next for MindBridge

We want to bring this to WhatsApp and Telegram so it works for groups that are not on Discord. Slack integration for workplaces is a big one too, since burnout there is just as invisible. Longer term we want to work with college counselling centres that need a way to understand how students are doing without collecting any individual data.

Built With

Python, discord.py, Anthropic Claude API
