Inspiration
At the University of Maryland, opportunities for undergraduate and graduate students to work with professors on research or in academic roles like TA and grader are everywhere — but the process of finding the right match is fragmented, informal, and often based on word of mouth. We wanted to solve this using AI.
What if we could help students intelligently discover professors whose research and open roles truly align with their interests and skills? That’s how TerpMatch was born — a matchmaking platform tailored for UMD that uses cutting-edge LLMs to bridge the gap between students and faculty.
What it does
TerpMatch matches students with professors based on:
- Desired role (RA, TA, Grader)
- Research interests
- Resume-based technical skills
- Gemini LLM-inferred skill requirements based on each professor’s research areas
Students fill out a quick profile and upload their resume. TerpMatch intelligently recommends professors with high compatibility scores. Professors, on the other side, can see students who applied to them, ranked by match score.
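The scoring described above can be sketched roughly as a weighted blend of the three signals. The field names and weights below are illustrative assumptions, not the actual TerpMatch implementation:

```python
# Hypothetical sketch of TerpMatch-style compatibility scoring.
# Field names ("role", "interests", "inferred_skills") and the
# 0.3/0.35/0.35 weights are assumptions for illustration only.

def compatibility(student: dict, professor: dict) -> float:
    """Score a student/professor pair on role, research, and skills (0.0-1.0)."""
    # Role match: 1.0 if the desired role (RA/TA/Grader) is offered.
    role = 1.0 if student["role"] in professor["open_roles"] else 0.0

    # Research overlap: Jaccard similarity of interest sets.
    s_int = set(student["interests"])
    p_int = set(professor["research_areas"])
    research = len(s_int & p_int) / len(s_int | p_int) if s_int | p_int else 0.0

    # Skill alignment: fraction of the LLM-inferred required skills
    # that the student's resume-derived skills actually cover.
    required = set(professor["inferred_skills"])
    skills = len(required & set(student["skills"])) / len(required) if required else 0.0

    # Weighted blend; weights would be tuned so both sides find the ranking fair.
    return 0.3 * role + 0.35 * research + 0.35 * skills
```

Ranking applicants for a professor is then just sorting their applicant list by this score, descending.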
It’s like LinkedIn meets AI — but built specifically for Terps 🎯
How we built it
- Frontend: React.js with Axios for REST API integration and resume upload
- Backend: Flask (Python) handles user data, resume management, and scoring logic
- Database: MongoDB stores student and professor profiles
- LLM Integration: Google Gemini API was used to infer technical skills from research interest strings
- Matching Logic: Compatibility is calculated based on role match, research overlap, and skill alignment (LLM-enhanced)
- Resume Handling: Uploaded resumes were stored and dynamically converted to clean .txt and .pdf formats
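The Gemini step above — turning a professor's research-interest string into a list of technical skills — might look something like the sketch below. The prompt wording and response parsing are assumptions, not the exact TerpMatch code; the SDK import is deferred so the pure helpers work without the package installed:

```python
# Sketch of inferring required skills from research interests with the
# Gemini API (google-generativeai SDK). Prompt text, model name, and
# parsing logic are illustrative assumptions.

def build_prompt(research_interests: str) -> str:
    """Ask for a machine-parseable, comma-separated skill list."""
    return (
        "List the technical skills a student would need to contribute to "
        f"research in: {research_interests}. "
        "Respond with a comma-separated list only, no extra text."
    )

def parse_skills(text: str) -> list[str]:
    """Normalize the model's comma-separated reply into a clean skill list."""
    return [s.strip().lower() for s in text.split(",") if s.strip()]

def infer_skills(research_interests: str, model_name: str = "gemini-1.5-flash") -> list[str]:
    # Lazy import: only needed when actually calling the API
    # (assumes genai.configure(api_key=...) was done at startup).
    import google.generativeai as genai
    model = genai.GenerativeModel(model_name)
    response = model.generate_content(build_prompt(research_interests))
    return parse_skills(response.text)
```

Caching inferred skills per professor in MongoDB would avoid re-calling the API on every match computation.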
Challenges we ran into
- Setting up and integrating the Gemini API for large-scale skill inference was tricky due to rate limits and output parsing.
- Designing a fair and balanced compatibility scoring system that made sense for both students and professors.
- Making sure resume uploads and LLM-powered inference remained performant and didn't overload the backend.
- Merging frontend/backend cleanly while keeping the app lightweight and responsive.
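One common way to soften the rate-limit problem mentioned above is to wrap each LLM call in a retry loop with exponential backoff. This is a generic sketch of that pattern, not the exact handling used in TerpMatch:

```python
# Generic retry-with-exponential-backoff wrapper for rate-limited API calls.
import time

def with_backoff(call, max_retries: int = 4, base_delay: float = 1.0):
    """Invoke call(); on failure, wait base_delay * 2**attempt and retry."""
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))
```

Used as `with_backoff(lambda: infer_skills(interests))`, this keeps a burst of skill-inference requests from failing outright when the API throttles them.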
Accomplishments that we're proud of
- Built a full-stack AI-powered matchmaking engine from scratch within Bitcamp
- Successfully integrated Gemini LLM to infer skills from raw research interests
- Designed a dual-view system for both students and professors
- Created polished resume generation, scoring, and filtering capabilities — all hosted and demo-ready
- Delivered something that could genuinely benefit the UMD academic community
What we learned
- How to structure a full-stack application using React + Flask + MongoDB
- How to integrate LLM APIs into traditional software systems
- How to handle resume parsing and formatting at scale
- The power of semantic matching vs. keyword-based filtering
- The potential of AI in making academia more accessible and collaborative
What's next for TerpMatch: LLM-Powered Research Connector
- Add student-facing "Top Matches" dashboard with live updates
- Let professors customize skill requirements and use Gemini for custom role analysis
- Deploy to a public cloud platform (e.g., Streamlit Cloud, Render, or Railway)
- Integrate UMD authentication (CAS or SSO)
- Add a Gemini-powered chatbot to help students ask questions like “Which profs do ML in healthcare?”
We're excited to see where TerpMatch goes — and how it can help make UMD's vibrant academic network even stronger. 🐢