Inspiration
Interview anxiety is one of the biggest barriers to career success. You can know all the right answers, but if your nerves take over, your performance suffers. Traditional mock interviews don't address this because they can't see what's happening inside you. I wanted to build an AI coach that could actually read your stress levels in real-time and help you learn to stay calm under pressure.
What it does
Neurocoach is an AI interview coach that monitors your biometrics while you practice. Using your webcam, it tracks your heart rate, stress level, and attention in real time. The AI adapts its coaching to your physiological state: calming you down when stress spikes and re-engaging you when focus drops. You can upload your resume and a job description for tailored mock interviews with voice interaction.
How I built it
- Frontend: Next.js 14, TypeScript, Tailwind CSS, Framer Motion
- AI: Google Gemini 3.0 Flash for conversations, ElevenLabs for voice responses
- Biometrics: I targeted Presage but simulated its vital-sign data with alternative libraries (face-api.js, TensorFlow.js), detecting heart rate via rPPG, stress via facial expressions, and attention via eye tracking
- Auth: Auth0 for user authentication
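To make the rPPG idea above concrete, here is a minimal sketch (not the project's actual code) of the first step: averaging the green channel over a face region each frame. Blood flow subtly modulates skin color, and the green channel carries the strongest pulse signal; the frame data format is assumed to be RGBA bytes as returned by a canvas `getImageData()` call.

```typescript
// Mean green-channel intensity over a face ROI, one sample per video frame.
// Collecting these samples over time builds the raw rPPG signal whose
// dominant frequency (after filtering) corresponds to the heart rate.
function meanGreen(rgba: Uint8ClampedArray): number {
  const pixels = rgba.length / 4; // each pixel is an RGBA quad of 4 bytes
  let sum = 0;
  for (let i = 1; i < rgba.length; i += 4) {
    sum += rgba[i]; // byte 1 of each quad is the green channel
  }
  return sum / pixels;
}
```

In practice you would crop the ROI to the forehead or cheeks from the face-api.js landmarks before averaging, since hair and background pixels dilute the pulse signal.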
Challenges I ran into
- Heart rate stability - Raw rPPG output was noisy (jumping 40-100 BPM). I implemented median filtering and rate-limited changes to max 5 BPM/second for smoother readings.
- API method chain order - face-api.js requires calling `withFaceLandmarks()` before `withFaceExpressions()`. A subtle ordering bug that took real debugging to find.
- Real-time performance - Running face detection, rPPG analysis, and AI responses simultaneously required optimization to keep the video at 30fps.
Accomplishments that I'm proud of
- Built a working rPPG heart rate detector that runs entirely in the browser
- Created an AI that genuinely adapts its responses based on physiological feedback
- Achieved smooth real-time biometric monitoring with no external hardware
- Full voice interaction for an immersive coaching experience
What I learned
- Remote Photoplethysmography (rPPG) - Heart rate can be detected from subtle facial color changes caused by blood flow
- Signal processing - Bandpass filters, detrending, and frequency analysis for extracting clean signals from noisy data
- Multimodal AI - How to combine physiological data with conversational AI for adaptive experiences
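One of the signal-processing steps mentioned above, detrending, can be sketched simply: subtract a moving average from the raw rPPG series to remove slow drift from lighting and head motion, leaving the periodic pulse component. This is a minimal illustration; the window length is an assumption, and a real pipeline would follow it with a bandpass filter over the plausible heart-rate band (roughly 0.7-4 Hz).

```typescript
// Detrend a signal by subtracting a trailing moving average.
// Slow baseline drift (lighting, motion) is removed; the faster
// periodic pulse component is preserved.
function detrend(signal: number[], window: number): number[] {
  return signal.map((value, i) => {
    const start = Math.max(0, i - window + 1);
    const slice = signal.slice(start, i + 1);
    const mean = slice.reduce((a, b) => a + b, 0) / slice.length;
    return value - mean;
  });
}
```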
What's next for Neurocoach
- Integration with the official Presage SDK for clinical-grade biometric accuracy
- Session history and progress tracking over time
- Expanded coaching modes (presentations, difficult conversations, sales pitches)
- Mobile app for on-the-go practice
Built With
- auth0
- css
- elevenlabs
- face-api.js
- framer-motion
- gemini
- next.js
- presage
- react
- tailwind
- tensorflow.js
- typescript

