💙 Anchor: The AI Companion That Sees You 💙
Ever poured your heart out to an AI, only to get a cold wall of text back? 🤖💔
Standard chatbots are emotionally blind. They treat you the same whether you’re curious, stressed, or actively spiraling.
Enter Anchor. ⚓️
Anchor is the first anxiety-aware AI that uses computer vision to "see" your nervous system state in real-time. It doesn't just read your text; it reads you.
🛠️ How we built it
- The Eyes: Presage 📸 We don't just guess how you feel—we measure it. Presage analyzes the webcam stream to decode biometric signals like stress and engagement, feeding us a live "Nervous System Score."
- The Brain: Google Cloud (Gemini) 🧠 This isn't a static prompt. We built a state-machine engine using Gemini (see the sketch after this list):
  - Calm Mode: "Let's explore this topic in depth." 📚
  - Rising Mode: "Let's keep it simple and structured." 📝
  - Spiking Mode: "Let's pause. Breathe with me." 🌬️
- The Voice: ElevenLabs 🗣️ Text isn't enough. We use ElevenLabs to modulate the AI's actual voice—slowing down the cadence and softening the pitch when it detects your stress levels rising.
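For the curious, here's a minimal sketch of how that mode switching could work: a biometric score comes in, a state machine picks a mode, and the mode selects both a system prompt for Gemini and voice settings for ElevenLabs. The thresholds, the 0-100 score scale, and names like `Mode` and `VoiceSettings` are illustrative assumptions, not the exact values or code from our build.

```python
from dataclasses import dataclass
from enum import Enum


class Mode(Enum):
    CALM = "calm"
    RISING = "rising"
    SPIKING = "spiking"


@dataclass
class VoiceSettings:
    speaking_rate: float  # 1.0 = normal cadence, lower = slower
    pitch_shift: float    # semitones relative to the default voice


def classify(score: float) -> Mode:
    """Map a 0-100 "Nervous System Score" to a mode (thresholds are hypothetical)."""
    if score < 40:
        return Mode.CALM
    if score < 70:
        return Mode.RISING
    return Mode.SPIKING


# Mode-specific instructions that would be sent to Gemini as the system prompt.
SYSTEM_PROMPTS = {
    Mode.CALM: "The user is relaxed. Explore the topic in depth.",
    Mode.RISING: "The user is getting stressed. Keep answers simple and structured.",
    Mode.SPIKING: "The user is overwhelmed. Pause and guide a short breathing exercise.",
}

# Mode-specific deliveries that would be passed to the ElevenLabs TTS step.
VOICE_PRESETS = {
    Mode.CALM: VoiceSettings(speaking_rate=1.0, pitch_shift=0.0),
    Mode.RISING: VoiceSettings(speaking_rate=0.9, pitch_shift=-1.0),
    Mode.SPIKING: VoiceSettings(speaking_rate=0.75, pitch_shift=-2.0),
}


def respond_to(score: float) -> tuple[str, VoiceSettings]:
    """Turn the latest biometric score into a prompt and a voice preset."""
    mode = classify(score)
    return SYSTEM_PROMPTS[mode], VOICE_PRESETS[mode]


if __name__ == "__main__":
    prompt, voice = respond_to(82.0)  # e.g. a stress spike from the webcam stream
    print(prompt, voice)
```

Keeping the modes as an explicit state machine (rather than just pasting the raw score into the prompt) is what makes the behavior predictable: each mode has one clear job for the text and one clear delivery for the voice.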
🏆 Our biggest challenge
Apparently, ElevenLabs thought we were a bot farm! 🤖 Because we were all jamming on the same school Wi-Fi, the API saw too many requests coming from one IP address and laid down the ban hammer, locking us out completely in the middle of the night. The fix? We disconnected from the school network and finished the build tethered to our personal phone hotspots. We drained our monthly data caps to get this demo working, but hey... anything for that perfect AI voice, right?
💡 Inspiration
While we were each having our own anxiety attacks trying to pick a project, we turned to our AI chatbots and realized that while they're getting smarter, they aren't necessarily getting kinder.
We wanted to move beyond "intelligent" responses and build empathetic ones. We built Anchor to prove that technology can respect your emotional bandwidth instead of overwhelming it.