Inspiration
Technical interviews are often stressful and unpredictable. We wanted to create a platform that mimics the real-world interview environment while giving candidates actionable, AI-powered feedback. This helps job seekers not just practice but actually improve with each attempt.
What it does
The AI Mock Interview Simulator lets users:
Browse job-specific coding challenges
Solve them in a LeetCode-style editor
Submit solutions and receive AI-powered code review
View performance analysis, time/space complexity, suggestions, and optimized solutions
It bridges the gap between endless prep and meaningful practice.
How we built it
Backend Database: MongoDB is used to persist job listings and coding questions.
Schemas:
Job: Contains job metadata and embedded coding questions.
InterviewSession: Stores each user's code submission, feedback, and performance metrics (can be extended for history/tracking).
Mongoose Models:
JobSchema includes fields like title, company, location, description, jobType, postedDate, logoUrl, and a nested codingQuestion sub-document (with fields like title, description, difficulty, and examples).
InterviewSessionSchema includes jobId, submittedCode, language, feedback, and timestamps.
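The two schemas above could be typed roughly as follows. Field names come from the writeup; the exact types, the example shape, and the sample values are assumptions for illustration:

```typescript
// Sketch of the two data models described above; field names follow the
// writeup, concrete types and sample values are assumptions.
interface CodingQuestion {
  title: string;
  description: string;
  difficulty: "Easy" | "Medium" | "Hard";
  examples: { input: string; output: string }[];
}

interface Job {
  title: string;
  company: string;
  location: string;
  description: string;
  jobType: string;
  postedDate: string;
  logoUrl: string;
  codingQuestion: CodingQuestion; // embedded sub-document
}

interface InterviewSession {
  jobId: string;
  submittedCode: string;
  language: string;
  feedback: string;
  createdAt: string; // from timestamps
}

// Example document matching the Job shape
const sampleJob: Job = {
  title: "Frontend Engineer",
  company: "Acme",
  location: "Remote",
  description: "Build UI features.",
  jobType: "Full-time",
  postedDate: "2024-01-01",
  logoUrl: "https://example.com/logo.png",
  codingQuestion: {
    title: "Two Sum",
    description: "Return indices of two numbers that add to target.",
    difficulty: "Easy",
    examples: [{ input: "[2,7,11,15], 9", output: "[0,1]" }],
  },
};

console.log(sampleJob.codingQuestion.title); // → Two Sum
```

Embedding the coding question inside the job document keeps a single read per interview at the cost of duplicating questions shared across jobs.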
Endpoints:
GET /api/jobs: Fetch all jobs from MongoDB.
POST /api/jobs: Add a new job (with validation).
GET /api/jobs/:id: Fetch a specific job using its Mongo _id.
GET /api/interview/:jobId: Load the question from a job’s record.
POST /api/code/submit: Receives user code, calls AI service, stores the result in InterviewSession.
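The submit flow in the last endpoint can be sketched as a pure handler. The Express wiring, persistence, and AI call are omitted, and every name here (`handleSubmit`, `mockAiReview`) is illustrative rather than the project's actual code:

```typescript
// Illustrative sketch of the POST /api/code/submit flow: validate the body,
// run a stand-in AI review, and return the record that would be persisted.
interface SubmitRequest {
  jobId: string;
  code: string;
  language: string;
}
interface SubmitResponse {
  jobId: string;
  submittedCode: string;
  language: string;
  feedback: string;
}

// Stand-in for the AI service call (assumption: the real version would
// delegate to an AI provider such as OpenAI).
function mockAiReview(code: string, language: string): string {
  return code.length > 0
    ? `Reviewed ${code.split("\n").length} line(s) of ${language}.`
    : "No code submitted.";
}

function handleSubmit(body: SubmitRequest): SubmitResponse {
  if (!body.jobId || !body.code) {
    throw new Error("jobId and code are required");
  }
  const feedback = mockAiReview(body.code, body.language);
  // A real handler would save this object as an InterviewSession document.
  return {
    jobId: body.jobId,
    submittedCode: body.code,
    language: body.language,
    feedback,
  };
}

const res = handleSubmit({
  jobId: "abc123",
  code: "function f(){}",
  language: "javascript",
});
console.log(res.feedback);
```

Keeping validation and review logic out of the route callback makes the flow unit-testable without spinning up the server.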
⚙️ Tech Stack
Frontend: React + TypeScript, Tailwind, ShadCN UI
Code Editor: Integrated Monaco or CodeMirror for syntax-aware editing
AI Integration: Simulates code review using mock AI logic (can be swapped for OpenAI)
MongoDB: Main backend data store
Node.js + Express: RESTful API layer
Challenges we ran into
Designing flexible MongoDB schemas that support nested coding questions while staying scalable.
Handling code execution sandboxing (mocked for now).
Structuring feedback data models so that future analysis (e.g. improvement over time) is possible.
Creating a clean UX that doesn’t overwhelm users.
Accomplishments that we're proud of
A complete full-stack workflow: from job listing → coding interface → AI feedback.
Seamless MongoDB integration with reusable and normalized schemas.
A job-based approach instead of random problems, making it feel closer to real interviews.
Feedback that includes complexity analysis and improved versions of user code (twoSum, reverse, etc.).
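As an example of the optimized solutions the feedback surfaces, here is the standard O(n) hash-map twoSum (a textbook version, not necessarily the project's exact output):

```typescript
// Standard single-pass hash-map twoSum: trades O(n) extra space for
// O(n) time, the kind of improvement the complexity analysis points out.
function twoSum(nums: number[], target: number): [number, number] | null {
  const seen = new Map<number, number>(); // value -> index
  for (let i = 0; i < nums.length; i++) {
    const complement = target - nums[i];
    const j = seen.get(complement);
    if (j !== undefined) return [j, i];
    seen.set(nums[i], i);
  }
  return null; // no pair sums to target
}

console.log(twoSum([2, 7, 11, 15], 9)); // → [0, 1]
```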
What we learned
How to model job data and interviews in MongoDB effectively.
How to simulate code analysis using structured AI-like response objects.
Creating a modular React UI that adapts to job types and question formats.
Managing state and navigation in a multi-step coding environment.
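The "structured AI-like response objects" mentioned above could take a shape like the following. Every field and heuristic here is an assumption about the structure, not the project's actual schema:

```typescript
// Illustrative shape for a simulated code-review response; the fields and
// the nested-loop heuristic are assumptions, not the project's real logic.
interface CodeFeedback {
  score: number;          // 0-100 overall rating
  timeComplexity: string; // e.g. "O(n)"
  spaceComplexity: string;
  suggestions: string[];
  optimizedSolution: string;
}

function buildMockFeedback(code: string): CodeFeedback {
  // Crude stand-in for real analysis: two "for" keywords hint at O(n^2).
  const usesNestedLoops = /for[\s\S]*for/.test(code);
  return {
    score: usesNestedLoops ? 60 : 85,
    timeComplexity: usesNestedLoops ? "O(n^2)" : "O(n)",
    spaceComplexity: "O(n)",
    suggestions: usesNestedLoops
      ? ["Replace the nested loop with a hash-map lookup."]
      : ["Looks efficient; consider adding input validation."],
    optimizedSolution: "// optimized solution text would go here",
  };
}

const fb = buildMockFeedback("for (a) { for (b) {} }");
console.log(fb.timeComplexity); // → O(n^2)
```

Returning a typed object rather than free-form text is what makes later analysis, such as tracking improvement over time, straightforward.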
What's next for AI Mock Interview Simulator
Leaderboard & badges based on job roles and performance.
Interview transcripts and coaching suggestions.
Voice-based mock interviews for behavioral rounds.
Finer-grained AI feedback, e.g., line-by-line comments and style suggestions.