Inspiration

Job search platforms are good at listing opportunities, but weak at helping users decide what to do next.

From our own experiences as students and early-career job seekers, the hardest part of job hunting is not finding jobs, but understanding:

  • Why a role is or isn’t suitable
  • What exactly is blocking eligibility
  • Which skill is worth learning first
  • When it actually makes sense to apply

Most platforms rely on keyword matching or opaque AI scores. We wanted to build a system that turns job search into a clear, explainable, and actionable decision process, grounded in how real hiring actually works.


What it does

Our platform is a constraint-aware AI career readiness system that connects:

  • resume understanding
  • eligibility-aware job matching
  • explainable skill gaps
  • and a market-grounded upskilling roadmap

into a single workflow.

Instead of asking users to trust a black box, the system shows why a job fits, what is blocking eligibility, and how learning a specific skill changes real job outcomes.


How we built it

Resume Understanding with Privacy-First AI

Users upload their resume, which is processed using OCR to extract raw text.
Before any AI analysis, we apply PII redaction using regex (emails, phone numbers, IDs, addresses), ensuring that only redacted text is sent to the LLM.
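A minimal sketch of this redaction step, assuming a regex-per-category approach; the exact patterns and placeholder labels here are illustrative, not the ones used in the project:

```python
import re

# Illustrative PII patterns: each category maps to a regex and is replaced
# with a placeholder tag before the text is sent to the LLM.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def redact(text: str) -> str:
    """Replace detected PII with placeholder tags; only the result leaves the machine."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Contact: jane.doe@example.com or +65 9123 4567"))
# → Contact: [EMAIL] or [PHONE]
```

Because redaction happens before any model call, the LLM only ever sees placeholder tags, not the underlying identifiers.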

Rather than treating skills as binary signals, we generate a resume summary.
This captures nuance such as familiarity, hands-on exposure, and proficiency, making job matching more accurate than simple skill presence.


Normalised Job Database with Constraint-Aware Matching

We ingested and normalised all six CSV job datasets into a single master job database.

Because requirements and good-to-have skills are often buried inside job descriptions, we extracted them from the JD text and augmented the database with two new columns:

  • Requirements
  • Good-to-have skills

This separation serves two purposes:

  1. It grounds AI recommendations in explicit job requirements, preventing hallucinated skill suggestions.
  2. It enables constraint-aware matching. Non-negotiable requirements such as citizenship, degree qualifications, or work authorisation act as eligibility gates.

Even high semantic similarity from good-to-have skills cannot override missing requirements, mirroring real hiring logic.
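The gating rule can be sketched as follows. This is a simplified illustration, not our production scoring code; the field names and the 0-to-1 similarity scale are assumptions:

```python
# A missing hard requirement zeroes out the semantic similarity score, so
# good-to-have overlap can never make an ineligible role outrank an eligible one.
def gated_score(similarity: float, requirements: set, user_skills: set) -> float:
    return similarity if requirements <= user_skills else 0.0

user = {"python", "sql"}
jobs = [
    ("Quant Dev (citizenship required)", 0.91, {"python", "sg_citizen"}),
    ("Data Analyst",                     0.70, {"sql"}),
]
ranked = sorted(jobs, key=lambda j: gated_score(j[1], j[2], user), reverse=True)
print([title for title, *_ in ranked])
# → ['Data Analyst', 'Quant Dev (citizenship required)']
```

Despite a much higher raw similarity, the role with an unmet citizenship requirement drops below the eligible one.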


Skill Catalog with Verified Learning Resources

We built a comprehensive skill catalog mapped to legitimate, up-to-date learning resources.

Course recommendations are generated using GPT-OSS with browser search, acting as a lightweight RAG system. This ensures accurate links, current content, and no fabricated sources.


Job Search with Explainable Analytics

For each job search, the system retrieves the top-k relevant roles and provides job-specific explanations, including:

  • Strengths: required skills the user already has
  • Blocking skills: unmet requirements preventing eligibility
  • An LLM-generated role summary for quick understanding

This replaces static readiness labels with transparent, job-level reasoning.
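The explanation itself reduces to simple set operations over the requirements column, which is what keeps it transparent. A sketch with illustrative names:

```python
# For each retrieved role, split its requirements into strengths the user
# already has and blocking skills that prevent eligibility.
def explain(job_requirements: set, user_skills: set) -> dict:
    return {
        "strengths": sorted(job_requirements & user_skills),
        "blocking":  sorted(job_requirements - user_skills),
    }

print(explain({"sql", "excel", "python"}, {"python", "sql"}))
# → {'strengths': ['python', 'sql'], 'blocking': ['excel']}
```

Because both lists come directly from the job's extracted Requirements column, the explanation cannot cite a skill the posting never asked for.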


Upskilling Roadmap with Market Signals

Saved jobs become a live career readiness tracker.

The system:

  • compares user skills against saved job requirements
  • surfaces missing skills
  • links each missing skill to courses from the skill catalog

Each skill includes market insights, such as the percentage of saved jobs that require it (e.g. 66%).

Although implemented with simple count-based queries, these signals strongly influence learning prioritisation and decision-making.
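The count-based query behind the market signal is straightforward; the job records below are hypothetical examples of the saved-jobs data:

```python
from collections import Counter

# What fraction of the user's saved jobs list each skill as a requirement?
def skill_demand(jobs: list) -> dict:
    counts = Counter(skill for job in jobs for skill in job["requirements"])
    return {skill: round(100 * n / len(jobs)) for skill, n in counts.items()}

saved_jobs = [
    {"title": "Data Analyst", "requirements": {"sql", "excel"}},
    {"title": "BI Developer", "requirements": {"sql", "tableau"}},
    {"title": "ML Intern",    "requirements": {"python", "sql"}},
]

demand = skill_demand(saved_jobs)
print(demand["sql"])   # → 100 (required by all three saved jobs)
```

A skill demanded by every saved job (here, sql at 100%) naturally jumps to the top of the upskilling roadmap, while a one-off requirement can wait.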


Challenges we ran into

  • Balancing AI reasoning with hard eligibility constraints
  • Normalising inconsistent job description formats
  • Avoiding misleading readiness signals in semantic search
  • Handling personal resumes responsibly within hackathon constraints

Accomplishments that we're proud of

  • Built an explainable alternative to black-box job matching
  • Anchored AI outputs to explicit job requirements
  • Designed a system where eligibility cannot be overridden by similarity scores
  • Turned job saving into a feedback loop between learning and opportunity

What we learned

  • Explainability builds trust more effectively than raw accuracy
  • Hard constraints must coexist with semantic AI
  • Market context significantly changes user behaviour
  • Responsible AI can be lightweight yet impactful
