💡 Inspiration

The 988 Suicide & Crisis Lifeline receives millions of calls annually, but volunteer burnout is at an all-time high. Why? Because training is static. Volunteers often learn from PDFs or low-fidelity roleplay. When they face their first real, screaming, terrified caller, they freeze.

We realized that to save lives, volunteers need a "Flight Simulator" for empathy: a safe place to fail, freeze, and learn before they pick up a real phone.

🤖 What it does

CrisisLink AI is a hyper-realistic voice simulator where the user plays the role of a crisis counselor.

  • The Persona: You are connected to "Alex," a 22-year-old student standing on a bridge. Alex is terrified, shivering, and resistant to help.
  • The Stakes: If you are dismissive, robotic, or rude, Alex will scream "YOU DON'T GET IT!" and hang up on you (the simulation ends in failure).
  • The Win Condition: If you use active listening and validate Alex's feelings, the AI's voice softens, it begins to cry, and eventually agrees to seek safety.

โš™๏ธ How we built it

We built CrisisLink by fusing Google Cloud's reasoning with ElevenLabs' emotion.

  1. The Brain (Google Vertex AI / Gemini 2.5 Flash): We used Gemini 2.5 Flash (via the ElevenLabs integration) as the "Director." It doesn't just generate text; it analyzes how psychologically safe the user's responses make the caller feel. We engineered a specific System Prompt that forces Gemini to format text phonetically (e.g., "I... I d-don't know...") to simulate hyperventilation.
  2. The Voice (ElevenLabs Conversational AI): We utilized the ElevenLabs Turbo v2.5 model with the "Expressive" voice profile (Stability set to 30%). This allows the Agent to whisper, stutter, and scream in real-time with <300ms latency.
  3. The "Kill Switch" (Agency): Most chatbots are passive. We gave Alex agency. We built a custom Client Tool (endCall) that the Agent triggers autonomously if its "Hostility Threshold" is crossed. This disconnects the WebSocket session immediately, simulating a hung-up call.
  4. The Frontend (Next.js 14): We built a reactive UI that visualizes the "Stress Level" of the call. It listens to the isSpeaking state of the agent to pulse a red warning light when the caller is panicking.
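A minimal sketch of the "Kill Switch" logic from step 3. The threshold value, the `scoreHostility` heuristic, and the function names here are illustrative assumptions; in the real app the hang-up decision comes from the Gemini system prompt, and the disconnect goes through the ElevenLabs client-tools mechanism rather than a plain callback:

```typescript
// Hypothetical stand-ins for the prompt-driven "Hostility Threshold" logic.
const HOSTILITY_THRESHOLD = 0.7;

// Toy heuristic: count dismissive phrases in the counselor's last turn.
function scoreHostility(userTurn: string): number {
  const dismissive = ["calm down", "get over it", "not a big deal", "stop crying"];
  const hits = dismissive.filter((p) => userTurn.toLowerCase().includes(p)).length;
  return Math.min(1, hits * 0.4);
}

// Director decision: does Alex hang up on this turn?
function shouldHangUp(userTurn: string): boolean {
  return scoreHostility(userTurn) >= HOSTILITY_THRESHOLD;
}

// Client tool the agent triggers autonomously; in production this
// closes the WebSocket session, simulating a hung-up call.
function makeEndCallTool(disconnect: () => void) {
  return async () => {
    disconnect(); // Alex slams the phone down
    return "call_ended";
  };
}
```

The key design choice is that the tool is registered client-side, so the agent itself (not the UI) decides when the session dies.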

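The "Stress Level" visualization from step 4 boils down to a tiny state machine. This is a hypothetical sketch (the `updateStress` name, the rise/decay rates, and `PANIC_THRESHOLD` are made-up values); in the frontend it would run on each tick of the SDK's `isSpeaking` state and drive the pulsing red warning light:

```typescript
// Illustrative model of the call's "Stress Level" indicator.
interface StressState {
  level: number;    // 0 (calm) .. 1 (panicking)
  pulsing: boolean; // whether the red warning light should pulse
}

const PANIC_THRESHOLD = 0.6; // assumed cutoff for the warning light

// Stress rises while the agent (Alex) is speaking and decays
// while the counselor has the floor.
function updateStress(state: StressState, agentIsSpeaking: boolean): StressState {
  const level = agentIsSpeaking
    ? Math.min(1, state.level + 0.1)
    : Math.max(0, state.level - 0.05);
  return { level, pulsing: level >= PANIC_THRESHOLD };
}
```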
🧠 Challenges we ran into

  • The "Robotic Stutter" Problem: Initially, when we told the AI to [shiver], it just said the word "shiver." We solved this by using Phonetic Prompting, teaching Gemini to use ellipses and repetition ("I... I can't...") which the ElevenLabs engine naturally interprets as fear.
  • Giving the AI Agency: It was difficult to make the AI "rude" enough to hang up. We had to lower the temperature to 0.9 and explicitly instruct it not to be helpful, flipping the standard "Helpful Assistant" paradigm on its head.
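To make the Phonetic Prompting idea concrete, here is a toy transform (`stutterize` is an illustrative stand-in, not our actual pipeline) showing the shape of the fix: strip the bracketed stage directions the TTS would otherwise read aloud, and rewrite sentence openings into the ellipses-and-repetition style the voice engine renders as fear:

```typescript
// Toy version of the Phonetic Prompting fix. In practice Gemini is
// instructed to emit this formatting directly via the system prompt.
function stutterize(text: string): string {
  // Drop literal stage directions like [shiver] that TTS reads aloud.
  const cleaned = text.replace(/\[[^\]]*\]\s*/g, "");
  // Stutter the first word of each sentence: "I can't" -> "I... I can't".
  return cleaned.replace(
    /(^|[.!?]\s+)(\w+)/g,
    (_m, pre, word) => `${pre}${word}... ${word}`,
  );
}

// stutterize("[shiver] I can't do this.") -> "I... I can't do this."
```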

๐Ÿ… Accomplishments that we're proud of

  • Real-Time Latency: The conversation feels instantaneous. There is no awkward "thinking pause," which maintains the illusion of a high-stakes crisis.
  • The Emotional Range: We successfully got the AI to transition from Screaming Anger to Soft Crying in a single session based purely on the user's voice input.

🚀 What's next for CrisisLink AI

  • Post-Call Analytics: Using Gemini Pro to generate a "Scorecard" after the call, grading the volunteer on specific de-escalation techniques.
  • Multi-Scenario Support: Adding scenarios for "Domestic Violence," "Panic Attacks," and "Substance Abuse."
  • Enterprise Integration: Deploying this as a training module for 911 dispatch centers.

Built With

  • elevenlabs
  • gemini
  • google-cloud
  • next.js
  • react
  • tailwind
  • typescript
  • vertex-ai