Inspiration

Most students don’t fail because they aren’t smart — they fail because learning isn’t built for how the brain remembers. We watched classmates scroll through endless notes, reread PDFs five times, and still feel unprepared. Some faced language barriers. Some faced accessibility barriers. Some simply didn’t know how to study effectively.

We wanted to fix that, not with another study tool, but with something that makes learning active, accessible, and personalized. That spark became Quillium.

What it does

Quillium turns any study material into smart quizzes, adaptive flashcards, and multilingual learning experiences. It helps users practice until concepts stick — using spaced repetition, active recall, and cognitive science-backed learning patterns.
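Quillium's exact scheduler isn't shown here, but the spaced-repetition idea can be sketched with a simplified SM-2-style update: answering correctly pushes the next review further out, a lapse resets the card to tomorrow. All names below are ours, for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Card:
    interval_days: float = 1.0  # days until the next review
    ease: float = 2.5           # growth factor for the interval

def review(card: Card, correct: bool) -> Card:
    """Update a flashcard's schedule after one active-recall attempt."""
    if correct:
        card.interval_days *= card.ease          # space the next review further out
        card.ease = min(card.ease + 0.1, 3.0)    # the card is getting easier
    else:
        card.interval_days = 1.0                 # lapse: see it again tomorrow
        card.ease = max(card.ease - 0.2, 1.3)    # the card is harder than we thought
    return card
```

Real schedulers (SM-2, FSRS) grade recall on a scale rather than pass/fail, but the shape is the same: intervals grow geometrically while a concept sticks.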

It supports 40+ languages, offers text-to-speech accessibility, and includes progress analytics so learners can see measurable growth — not guess it.

In short: 📄 Upload → 🔍 Understand → 🎓 Master.

How we built it

We combined:

PyMuPDF for extracting clean text

HuggingFace T5 for conceptual question generation

NLTK WordNet for context-aware distractors

MarianMT (Helsinki NLP) for multilingual translation

pyttsx3 for audio accessibility

Streamlit + Plotly for a simple, interactive UI and dashboards
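End to end, the components above form a pipeline: extract text, generate questions, build distractors, translate, present. A minimal sketch of that flow, with each stage stubbed so only the structure is shown; in the real app the stages would wrap PyMuPDF, the T5 question generator, WordNet distractors, and MarianMT, and all function names here are ours, not Quillium's.

```python
from typing import Callable

def build_quiz(pages: list[str],
               make_question: Callable[[str], str],
               make_distractors: Callable[[str], list[str]],
               translate: Callable[[str], str]) -> list[dict]:
    """Turn extracted pages into translated quiz items with distractors."""
    quiz = []
    for page in pages:
        question = make_question(page)  # e.g. T5 seq2seq generation
        quiz.append({
            "question": translate(question),  # e.g. MarianMT en -> target language
            "distractors": [translate(d) for d in make_distractors(question)],
        })
    return quiz

# Stubbed stages, just to show the flow:
demo = build_quiz(
    ["Photosynthesis converts light energy into chemical energy."],
    make_question=lambda text: f"What process does this describe? {text}",
    make_distractors=lambda q: ["respiration", "fermentation", "osmosis"],
    translate=lambda s: s,  # identity stub: no translation
)
```

Keeping each stage behind a plain callable is one way to swap models (or skip translation) without touching the pipeline itself.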

Every component was built with one guiding rule:

If a student can’t use it easily, it doesn’t belong.

Challenges we ran into

Making AI-generated questions feel human and meaningful, not random

Ensuring translations preserved accuracy, tone, and academic intent

Designing accessibility in the core, not as a “later” feature

Balancing performance with model complexity without sacrificing UX

We broke things, rebuilt them, and repeated until it felt right.

Accomplishments that we're proud of

Building a tool that works for different learners, languages, and abilities

Turning a passive study process into a personalized learning cycle

Blending AI + cognitive psychology + inclusive design into a single product

Watching testers go from overwhelmed → confident

What we learned

We learned how to bring NLP, multilingual AI, and accessibility principles together into a cohesive learning experience. We explored the science behind retention and how deliberate learning techniques improve memory. More importantly, we learned that thoughtful design matters — students need clarity, not complexity. This project taught us how technology becomes meaningful when it empowers every type of learner.

What's next for Quillium: AI-Powered Inclusive Learning Assistant

We’re expanding Quillium into a fully adaptive and accessibility-driven learning ecosystem. Next, learners will be able to scan handwritten notes using OCR, study offline through a dedicated mobile app, and experience a personalized AI tutor that adapts difficulty and pacing to their learning style.

We’re also enhancing accessibility—with voice-response quizzes, screen-reader-friendly navigation, and sign-language-ready caption templates for deaf and hard-of-hearing learners. At the same time, text-to-speech will evolve into natural voice AI reading support for blind and visually impaired students.

Finally, we’re integrating seamless export into platforms learners already use—like Notion, Anki, Google Forms, and LMS systems.
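Of these targets, Anki export is the simplest to sketch: Anki imports plain tab-separated text where each line holds a card's front and back. A minimal, hypothetical exporter (the two-field layout is an assumption; Quillium's actual export format may differ):

```python
import csv
import io

def to_anki_tsv(cards: list[tuple[str, str]]) -> str:
    """Serialize (front, back) pairs as tab-separated text Anki can import."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
    for front, back in cards:
        writer.writerow([front, back])
    return buf.getvalue()

tsv = to_anki_tsv([("What is active recall?",
                    "Retrieving information from memory rather than rereading.")])
```

Using `csv.writer` rather than joining strings by hand keeps fields with embedded tabs or quotes safe.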

Built With

PyMuPDF · HuggingFace T5 · NLTK WordNet · MarianMT · pyttsx3 · Streamlit · Plotly, + 26 more