💡 Inspiration
We often wonder how others perceive us — our tone, our reactions, our presence in conversations. But what if we could experience ourselves the way others do?
That’s the idea behind MirrorMind: a tool for self-reflection powered by AI, voice cloning, and personality analysis.
When we discovered Bolt.new, it gave us the perfect foundation to bring this vision to life. From project setup to voice-AI integrations, Bolt.new made complex architecture feel intuitive — so we could focus on the emotion, not the boilerplate.
🧠 What it does
MirrorMind allows users to:
- Take a quick psychological quiz to assess their current personality traits.
- Upload a voice sample (or use a default one).
- Chat with an AI version of themselves — shaped by their own traits.
- Hear responses spoken back in their own voice, with tone and attitude aligned to their weakest trait (if any).
It’s a voice-powered mirror of “you right now” — raw, confronting, and eye-opening.
🛠 How we built it
This entire project was initialized on Bolt.new, which handled:
- Project scaffolding (React + Vite + Tailwind).
- Secure OpenAI and ElevenLabs integrations via Supabase Edge Functions.
- Authentication setup and routing.
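The key-handling pattern behind those integrations can be sketched as follows. This is a hypothetical illustration, not our exact function: Supabase Edge Functions run on Deno and wrap logic like this in a request handler, so the OpenAI key lives only in the server environment and never reaches the browser.

```typescript
// Hypothetical sketch of the Edge Function proxy pattern: the client sends
// only chat messages; the server attaches the secret key before forwarding.
type ChatMessage = { role: string; content: string };
type ChatRequest = { messages: ChatMessage[] };
type UpstreamCall = {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
};

// Validates the client payload and builds the upstream OpenAI request.
// `apiKey` comes from the server environment (e.g. Deno.env.get), never the client.
export function buildUpstreamRequest(body: ChatRequest, apiKey: string): UpstreamCall {
  if (!body.messages || body.messages.length === 0) {
    throw new Error("messages required");
  }
  return {
    url: "https://api.openai.com/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ model: "gpt-4o-mini", messages: body.messages }),
    },
  };
}
```

The real handler would `fetch(call.url, call.init)` and stream the result back; the point is that key exposure is structurally impossible on the client.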
We then expanded it with:
- A custom quiz engine using React's Context API to track user traits.
- Dynamic radar-chart rendering to visualize quiz results.
- An interactive chat interface where the AI leads the conversation based on personality weaknesses.
- Voice cloning and text-to-speech (TTS) with ElevenLabs (based on Bolt's examples).
- Secure backend calls via Supabase, avoiding any client-side key exposure.
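The quiz engine's core idea is simple enough to sketch. Trait names and the 1–5 Likert scale below are illustrative assumptions, not our exact question bank: answers are averaged per trait, and the lowest mean becomes the "weakest trait" that shapes the AI persona.

```typescript
// Hypothetical sketch of quiz scoring: each answer is a 1-5 Likert value
// tagged with the trait it measures; the weakest trait (lowest mean score)
// later drives the tone of the AI mirror.
type Trait = "openness" | "conscientiousness" | "extraversion" | "agreeableness" | "neuroticism";
interface Answer { trait: Trait; value: number } // value in 1..5

export function scoreTraits(answers: Answer[]): { scores: Record<Trait, number>; weakest: Trait } {
  const sums = {} as Record<Trait, { total: number; n: number }>;
  for (const a of answers) {
    let s = sums[a.trait];
    if (!s) { s = { total: 0, n: 0 }; sums[a.trait] = s; }
    s.total += a.value;
    s.n += 1;
  }
  const scores = {} as Record<Trait, number>;
  let weakest: Trait | null = null;
  for (const t of Object.keys(sums) as Trait[]) {
    scores[t] = sums[t].total / sums[t].n;
    if (weakest === null || scores[t] < scores[weakest]) weakest = t;
  }
  if (weakest === null) throw new Error("no answers to score");
  return { scores, weakest };
}
```

The per-trait means also feed the radar chart directly, which keeps the quiz, chart, and chat all reading from one source of truth in context.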
Bolt.new’s examples and documentation gave us the confidence to push boundaries in multi-modal interaction and emotional UX — without getting stuck in low-level setup.
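To make the "AI leads the conversation" behavior concrete, here is a hedged sketch of how a weakness-aware system prompt could be assembled. The trait hints and wording are hypothetical stand-ins, not the prompts we actually shipped.

```typescript
// Hypothetical sketch of persona prompt assembly: the user's weakest trait
// selects a coaching hint that steers how the AI mirror leads the chat.
const TRAIT_HINTS: Record<string, string> = {
  extraversion: "gently push the user to reflect on social situations they avoid",
  neuroticism: "acknowledge stress openly and model calm, steady phrasing",
};

export function buildPersonaPrompt(weakestTrait: string, userName: string): string {
  const hint = TRAIT_HINTS[weakestTrait] ?? "mirror the user's tone neutrally";
  return [
    `You are an AI mirror of ${userName}. Speak in first person, as them.`,
    `Their weakest personality trait right now is ${weakestTrait}.`,
    `Lead the conversation: ask one probing question per turn and ${hint}.`,
  ].join("\n");
}
```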
🧩 Challenges we ran into
- Managing async voice cloning and playback smoothly across browsers.
- Designing emotionally intelligent AI prompts to reflect traits in a believable way.
- Persisting user state (traits, voice, chat context) without bloating the frontend.
- Balancing performance and emotional depth within a short hackathon window.
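The async playback problem in particular came down to ordering: ElevenLabs clips can finish generating out of order, but they must play in order. A minimal sketch of the queue pattern that solves this, assuming an injected `play` function (in the browser it would wrap an `HTMLAudioElement`):

```typescript
// Hypothetical sketch of a sequential playback queue: each clip's playback
// is chained onto the previous clip's completion, so clips that arrive out
// of order still play in the order they were enqueued.
export class PlaybackQueue {
  private tail: Promise<void> = Promise.resolve();

  // `play` resolves when one clip has finished playing; injected so the
  // queue logic itself is browser-independent and testable.
  constructor(private play: (clipUrl: string) => Promise<void>) {}

  // Enqueue a clip; the returned promise resolves after this clip plays.
  enqueue(clipUrl: string): Promise<void> {
    this.tail = this.tail.then(() => this.play(clipUrl));
    return this.tail;
  }
}
```

Chaining on a single `tail` promise also means a slow clip simply delays its successors instead of overlapping them, which is the cross-browser behavior we wanted.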
🏆 Accomplishments that we're proud of
- A fully functional prototype combining OpenAI + ElevenLabs + Supabase — all in one flow.
- A chat experience that feels personal and confronting, yet empathetic.
- A voice interaction that lets people hear how they emotionally affect others.
- Staying true to the goal: building something deeply human with the power of AI.
📚 What we learned
- Building emotional AI is about tone and delivery, not just raw intelligence.
- Voice adds a layer of reality that text alone cannot match.
- Bolt.new dramatically speeds up secure, real-world AI app development — without sacrificing flexibility.
- The best user experiences often come from emotion-first, tech-second thinking.
🚀 What's next for MirrorMind (Talk to yourself. Know yourself.)
We plan to explore:
- Multiple AI personalities (e.g. “your younger self” or “your confident self”).
- Long-term journaling and personality tracking.
- Voice tone modulation based on emotional context.
- A toolset for coaches and therapists to help clients hear how they come across.
Ultimately, MirrorMind aims to become a mirror for your soul — not just your voice.
Built With
- bolt.new
- context-api
- elevenlabs-api
- framer-motion
- lucide-icon
- openai-api
- react
- supabase
- supabase-edge-functions
- tailwind-css
- typescript
- vite