What Inspired Us

We’re two juniors at UCF juggling 18-credit semesters, part-time jobs, and extracurriculars—and we realized we weren’t alone. In late spring 2025, between study sessions at Memory Mall and late-night group projects in the Engineering II building, we kept hearing the same frustrations:

  • “I can’t find the learning objectives buried in this 20-page PDF.”
  • “I don’t know how to break this syllabus into study chunks.”
  • “I wish I could quiz myself on the exact content my professor cares about.”

One evening, over a stack of takeout boxes in the SPARK Makerspace, we sketched out an idea: What if we could turn any syllabus into an interactive study guide—complete with quizzes and an AI tutor—without re-typing a single word? That moment sparked Syllab.AI.


What We Learned

Building Syllab.AI was as much about personal growth as it was about code:

  1. Natural Language Processing Basics
    • We dove into PDF parsing libraries and discovered how to extract headings, bullet lists, and learning objectives reliably.
  2. End-to-End Web Architecture
    • From designing MongoDB schemas for “Chapter” and “QuizQuestion” documents to wiring up React-based pages, we solidified our full-stack chops.
  3. AI Prompt Engineering
    • Crafting prompts that generate high-quality, varied quiz questions taught us the nuances of temperature, system vs. user messages, and few-shot examples.
  4. DevOps & Deployment on GCP
    • Configuring App Engine, tuning PM2 for zero-downtime restarts, and setting up CI/CD pipelines with GitHub Actions gave us hands-on experience with production workflows.
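To make the prompt-engineering lessons concrete, here is a rough TypeScript sketch of how a few-shot, system-vs-user message structure for quiz generation can be assembled. The function name, wording, and example question are illustrative stand-ins, not our exact production prompts:

```typescript
// Sketch of a few-shot quiz-generation prompt. The system message pins down
// the role and output format; one example exchange anchors style; the final
// user message carries the real chapter summary.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildQuizPrompt(chapterSummary: string, count = 5): ChatMessage[] {
  return [
    {
      role: "system",
      content:
        "You are a quiz writer. Return exactly " + count +
        " multiple-choice questions as JSON: " +
        '[{"question": "...", "choices": ["..."], "answer": 0}]',
    },
    // Few-shot example: one input/output pair to anchor format.
    { role: "user", content: "Chapter summary: Mitochondria produce ATP." },
    {
      role: "assistant",
      content:
        '[{"question": "What do mitochondria produce?", ' +
        '"choices": ["ATP", "DNA", "Chlorophyll", "Insulin"], "answer": 0}]',
    },
    // The real request.
    { role: "user", content: "Chapter summary: " + chapterSummary },
  ];
}
```

At call time, a lower temperature then trades question variety against format reliability.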

How We Built Syllab.AI

  1. Syllabus Uploader & Parser
    • Used Multer to handle file uploads and pdf-parse + custom regex heuristics to detect chapters, objectives, and key terms.
  2. Backend API
    • Built on Node.js + Express, with Mongoose models for Users, Courses, Chapters, and QuizQuestions.
    • JWT authentication secured every endpoint; Nodemailer handled verification emails.
  3. Quiz Engine
    • Integrated OpenAI’s GPT-4o mini: we batch-sent chapter summaries and “Generate 5 multiple-choice questions” prompts, then post-processed the output into our schema.
  4. Frontend Interface
    • A React/TypeScript SPA using React Router for page navigation and React-Bootstrap for mobile-responsive layouts.
    • The interactive dashboard visualizes progress and quiz scores using simple SVG charts.
  5. AI Chat Assistant
    • Leveraged a context-window architecture: every user message is sent alongside the current chapter summary, so the assistant “knows” your course.
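The heuristic pass over the pdf-parse output (step 1 above) can be sketched roughly like this. The two regex patterns are simplified stand-ins for our actual rules, which need many more fallbacks:

```typescript
// Simplified sketch of the regex heuristics run over extracted PDF text:
// detect chapter-style headings, then attach objective-style lines to the
// most recent heading. Patterns here are illustrative, not exhaustive.

interface ParsedChapter {
  title: string;
  objectives: string[];
}

const CHAPTER_RE = /^(?:chapter|week|module|unit)\s+\d+[:.\-\s]*(.*)$/i;
const OBJECTIVE_RE = /^(?:[-•*]\s*)?(?:students will|be able to|understand|explain|describe)\b.*/i;

function parseSyllabusText(text: string): ParsedChapter[] {
  const chapters: ParsedChapter[] = [];
  for (const raw of text.split(/\r?\n/)) {
    const line = raw.trim();
    const heading = line.match(CHAPTER_RE);
    if (heading) {
      chapters.push({ title: heading[1] || line, objectives: [] });
    } else if (chapters.length > 0 && OBJECTIVE_RE.test(line)) {
      // Strip the bullet marker and file the objective under the last heading.
      chapters[chapters.length - 1].objectives.push(line.replace(/^[-•*]\s*/, ""));
    }
  }
  return chapters;
}
```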
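The post-processing step from the quiz engine (step 3 above) can be sketched as a small validator that turns raw model output into QuizQuestion-shaped objects. The fence-stripping and validation rules here are simplified assumptions, not our exact production checks:

```typescript
// Sketch of post-processing model output into QuizQuestion-shaped objects.
// Malformed questions are dropped rather than failing the whole batch.

interface QuizQuestion {
  question: string;
  choices: string[];
  answerIndex: number;
}

function parseQuizOutput(raw: string): QuizQuestion[] {
  // Models sometimes wrap JSON in markdown fences; strip them first.
  const cleaned = raw.replace(/`{3}(?:json)?/g, "").trim();
  let data: unknown;
  try {
    data = JSON.parse(cleaned);
  } catch {
    return []; // Unparseable batch: the caller re-requests instead of guessing.
  }
  if (!Array.isArray(data)) return [];
  const out: QuizQuestion[] = [];
  for (const item of data) {
    const q = item as { question?: unknown; choices?: unknown; answer?: unknown };
    if (
      typeof q.question === "string" &&
      Array.isArray(q.choices) &&
      q.choices.length === 4 &&
      q.choices.every((c) => typeof c === "string") &&
      typeof q.answer === "number" &&
      q.answer >= 0 &&
      q.answer < 4
    ) {
      out.push({ question: q.question, choices: q.choices, answerIndex: q.answer });
    }
  }
  return out;
}
```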

Challenges We Faced

  • Parsing Diverse Syllabus Formats
    • Professors upload everything from Word-exported PDFs to scans of hand-annotated pages. We combined NLP techniques with manual fallback rules to handle edge cases.
  • Maintaining Chat Context at Scale
    • Early on, our chatbot responses would lose the thread after 3–4 messages. We solved it by batching recent messages into a single prompt and trimming older context.
  • Optimizing Quiz Generation Latency
    • Generating questions one-by-one led to unacceptable delays. We refactored to send chapter batches in parallel and cache results in Redis.
  • Authentication & Email Deliverability
    • Setting up SPF/DKIM for our Nodemailer-driven verification emails took more trial and error than we’d like to admit—sorry, UCF inboxes!
  • Time Constraints
    • Balancing midterms, project deadlines, and hackathon sprints meant we had to prioritize an MVP feature set and iterate quickly every evening.
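The chat-context fix above can be sketched as a simple character-budget window: always keep the chapter summary, then include only the most recent turns that fit. The budget value and shapes here are assumptions, not our tuned production settings:

```typescript
// Sketch of context trimming: keep the chapter summary, then walk the
// history newest-first so older turns are the ones that get dropped.

interface Turn {
  role: "user" | "assistant";
  content: string;
}

function buildChatContext(
  chapterSummary: string,
  history: Turn[],
  charBudget = 4000 // assumed budget, not a tuned value
): Array<{ role: string; content: string }> {
  const kept: Turn[] = [];
  let used = 0;
  for (let i = history.length - 1; i >= 0; i--) {
    if (used + history[i].content.length > charBudget) break;
    used += history[i].content.length;
    kept.unshift(history[i]); // restore chronological order
  }
  return [
    { role: "system", content: "Course context: " + chapterSummary },
    ...kept,
  ];
}
```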
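The latency fix—parallel chapter batches plus caching—can be sketched like this, with an in-memory Map standing in for Redis and an injected function standing in for the OpenAI call:

```typescript
// Sketch of the latency fix: fire one generation request per uncached
// chapter concurrently instead of one-by-one, caching results by chapter id.
// A Map stands in for Redis here.

const quizCache = new Map<string, string>();

async function generateForChapters(
  chapterIds: string[],
  generate: (id: string) => Promise<string> // stand-in for the model call
): Promise<Map<string, string>> {
  await Promise.all(
    chapterIds
      .filter((id) => !quizCache.has(id))
      .map(async (id) => {
        quizCache.set(id, await generate(id));
      })
  );
  return new Map(chapterIds.map((id) => [id, quizCache.get(id)!]));
}
```

Repeat requests for an already-generated chapter then return instantly from the cache instead of hitting the model again.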

Through it all, we learned that real-world problem solving is about people as much as technology. Syllab.AI isn’t just code—it’s our way of helping Knights study smarter, not harder. In future semesters, we hope to see it lighten the load on countless UCF students just like us.
