Inspiration
Every high-stakes interview has two sides. Candidates need a realistic, high-pressure arena to practice and build confidence. Meanwhile, employers are facing a terrifying new reality: the rise of real-time AI cheating, deepfake avatars, and script-reading bots. We realized that solving only one side of the equation wasn't enough. We built TrueFace to be a dual-sided platform: the ultimate sparring partner for job seekers, and the ultimate truth-tester for recruiters.
What it does
TrueFace is a comprehensive, dual-sided mock interview and integrity-testing platform. We built a complete suite of tools to serve both candidates preparing for the real world and recruiters protecting their hiring pipelines.
For the Interviewee (The Training Arena):
1) Real-Time AI Live Avatar: A hyper-realistic, low-latency conversational partner powered by Gemini and HeyGen that simulates the intense face-to-face pressure of a real interview.
2) Live Speech Analytics: A custom-built, in-browser metrics dashboard that intercepts raw audio transcripts to actively track and penalize filler words ("um," "like," "uh") the millisecond they happen.
3) Performance Review Dashboard: A full CRUD interface where candidates can save, review, and delete recorded video responses, displayed right next to the exact behavioral or technical prompt they were asked, so they can review their body language and delivery.
For the Interviewer (The Integrity Engine):
1) Real-Time Deepfake Detection: An active monitoring suite that evaluates the candidate's video stream to detect synthetic manipulation and facial overlays, ensuring the person on screen is authentic.
2) Voice & Latency Tracking: Measures unnatural network latency spikes and audio anomalies that typically indicate a candidate is secretly passing audio through a third-party AI transcription/generation tool.
3) Cognitive & Reasoning Evaluation: Employs a dedicated Speech Analyzer and Gemini Analyzer to measure the candidate's live reasoning score against natural human baselines, catching suspiciously perfect or "bot-like" scripted answers.
4) Live Risk Aggregation: Consolidates all of the above data points (deepfake probability, latency, voice anomalies, and reasoning scores) into a single, real-time Risk Aggregation Score so recruiters know instantly if a session is compromised.
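The aggregation step above can be sketched as a weighted sum over normalized signals. TrueFace's actual weights and signal definitions are not public; the interface, weight values, and `riskScore` function below are illustrative assumptions only.

```typescript
// Hypothetical signal bundle -- each value normalized to [0, 1].
interface IntegritySignals {
  deepfakeProb: number;   // probability of synthetic video from the deepfake detector
  latencyAnomaly: number; // normalized latency-spike score
  voiceAnomaly: number;   // normalized audio-anomaly score
  reasoningDelta: number; // distance of reasoning score from human baselines
}

// Assumed weights for the sketch; a real system would tune these empirically.
const WEIGHTS = {
  deepfakeProb: 0.4,
  latencyAnomaly: 0.2,
  voiceAnomaly: 0.2,
  reasoningDelta: 0.2,
};

// Weighted sum clamped to [0, 1]; higher means more likely compromised.
function riskScore(s: IntegritySignals): number {
  const raw =
    WEIGHTS.deepfakeProb * s.deepfakeProb +
    WEIGHTS.latencyAnomaly * s.latencyAnomaly +
    WEIGHTS.voiceAnomaly * s.voiceAnomaly +
    WEIGHTS.reasoningDelta * s.reasoningDelta;
  return Math.min(1, Math.max(0, raw));
}
```

A weighted linear score keeps the dashboard explainable: a recruiter can see which signal pushed the score up, rather than trusting an opaque classifier.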
How we built it
We built the frontend dashboard using Next.js and Tailwind CSS. The conversational "Brain" is powered by Google Cloud Vertex AI (Gemini 2.5 Flash), connected via strict API routing. The "Face and Voice" utilize the HeyGen LiveAvatar Web SDK, reacting dynamically to Gemini's outputs. For the Integrity Engine, we built a custom evaluation pipeline that ingests the audio and video streams to calculate deepfake heuristics, voice analysis, and latency tracking. We tied everything together with a MongoDB backend to handle full CRUD functionality for user sessions and saved video clips.
Challenges we ran into
1) The Real-Time Tug-of-War: Processing live audio/video feeds for deepfake and latency detection while simultaneously rendering a hyper-realistic streaming AI avatar required intense optimization to prevent the browser from crashing.
2) The "Too Smart" Browser: Chrome's Speech API automatically deletes hesitation markers when finalizing sentences. We had to build a custom interceptor to catch raw, interim transcripts to accurately track filler words.
3) API Routing & Context Bugs: We accidentally triggered a "History Double-Dip" that crashed the Gemini API by breaking its strict conversational alternating rules.
4) Version Control Chaos: At the 11th hour, we hit massive Git merge conflicts while trying to combine our UI polish branch with the Integrity Engine and the main codebase.
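The interim-transcript interceptor from challenge 2 boils down to diffing each interim hypothesis against the previous one and counting fillers only in the newly appended text, since Chrome tends to scrub hesitation markers from *final* results. The filler list, function name, and diff strategy below are a simplified sketch of that idea, not TrueFace's production code; the browser wiring is shown as comments because `SpeechRecognition` only exists in the browser.

```typescript
// Filler words to penalize; real systems would use a larger, configurable set.
const FILLERS = new Set(["um", "uh", "like"]);

// Count fillers that appeared since the last interim result.
function countNewFillers(prevInterim: string, nextInterim: string): number {
  // Usually the recognizer only appends text, so diff the new tail.
  const added = nextInterim.startsWith(prevInterim)
    ? nextInterim.slice(prevInterim.length)
    : nextInterim; // recognizer rewrote its hypothesis; rescan it all (may overcount)
  return added
    .toLowerCase()
    .split(/[^a-z']+/)
    .filter((w) => FILLERS.has(w)).length;
}

// Browser wiring sketch (Web Speech API, Chrome-prefixed):
// const rec = new (window as any).webkitSpeechRecognition();
// rec.interimResults = true; // the key flag: we need raw interim hypotheses
// let last = "", fillerCount = 0;
// rec.onresult = (e: any) => {
//   const interim = Array.from(e.results)
//     .map((r: any) => r[0].transcript)
//     .join("");
//   fillerCount += countNewFillers(last, interim);
//   last = interim;
// };
```

Keeping the counting logic as a pure function of two strings makes it unit-testable outside the browser, which matters when the event stream itself is this flaky.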
Accomplishments that we're proud of
We didn't just build a prototype; we built a dual-sided marketplace tool with massive B2B and B2C value. Successfully wiring a live microphone, a Gemini thinking engine, a real-time speaking avatar, and a concurrent AI-detection tracking suite all together without massive latency is a huge win. We are incredibly proud of overcoming React Ref race-conditions to get the local webcam, avatar, and live metrics all running smoothly on one screen.
What we learned
We learned the hard way that you should always check the backend server logs instead of trusting the browser console. We deepened our understanding of React state management and hardware access streams. Crucially, we dove deep into the wild world of AI detection, learning how latency spikes and reasoning heuristics can successfully expose synthetic interactions and deepfakes.
What's next for TrueFace
We plan to expand our Integrity Engine into a standalone enterprise API, allowing existing platforms (like Zoom or Google Meet) to plug in our deepfake and AI-assistance detection. For the candidate side, we want to integrate Google MediaPipe's lightweight, in-browser vision models to track user eye contact, posture, and fidgeting, offering even deeper behavioral feedback.
Built With
- css
- gemini
- geminiapi
- liveavatar
- mongodb
- next
- pinecone
- python
- typescript