Inspiration
The inspiration came from our own experience as students — constantly struggling to keep up with fast-talking professors while trying to stay focused and actually learn.
What it does
With NoteLab, you can stay present in the lecture while our tool records and transcribes the audio in real-time. After class, we enhance your own typed notes with AI, adding images, references, and smart organization — all while preserving your original notes with precise timestamps tied to the lecture audio. NoteLab isn't just a smarter notebook — it's your personal learning assistant that keeps you engaged and helps you learn better.
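To make the idea of timestamp-tied notes concrete, here is a minimal sketch of how a typed note's timestamp could be mapped back to the lecture audio. The function and field names are our illustrative assumptions, not NoteLab's actual implementation:

```python
from bisect import bisect_right

def segment_for_note(note_time_s, transcript_segments):
    """Map a note's typing time (seconds into the lecture) to the
    transcript segment that was being spoken at that moment.

    transcript_segments: list of (start_seconds, text) tuples,
    sorted by start_seconds.
    """
    starts = [start for start, _ in transcript_segments]
    # Index of the last segment that started at or before the note's time.
    i = bisect_right(starts, note_time_s) - 1
    return transcript_segments[max(i, 0)]

segments = [
    (0.0, "Today we cover graphs."),
    (12.5, "A graph is a set of vertices and edges."),
    (30.0, "Let's look at BFS."),
]
segment_for_note(14.0, segments)  # -> (12.5, "A graph is a set of vertices and edges.")
```

With a mapping like this, each preserved note can link straight to the moment in the recording it was written.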
How we built it
We developed NoteLab through three main stages: Frontend Development, Backend Development, and AI/ML Integration.
For the frontend, we started by designing visual mockups to map the desired UI/UX flow. We then built the interface using Angular, integrating packages for the notes editor, slide viewer, and real-time audio transcription powered by Deepgram.
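Live speech APIs such as Deepgram's typically stream interim hypotheses that are later replaced by a final result. The sketch below (in Python for brevity; the real client lives in our Angular/TypeScript frontend) shows that accumulate-and-commit pattern in simplified form, not Deepgram's actual SDK interface:

```python
class LiveTranscript:
    """Accumulates streaming transcription results.

    An interim result overwrites the pending hypothesis; a final
    result commits it. Simplified sketch of the interim/final
    pattern used by live transcription services.
    """
    def __init__(self):
        self.committed = []  # finalized phrases, in order
        self.pending = ""    # latest interim hypothesis

    def on_result(self, text, is_final):
        if is_final:
            self.committed.append(text)
            self.pending = ""
        else:
            self.pending = text

    def current_text(self):
        parts = self.committed + ([self.pending] if self.pending else [])
        return " ".join(parts)
```

Rendering `current_text()` on each event is what lets the transcript update word-by-word on screen while the lecture is still in progress.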
At the same time, we developed the backend using FastAPI and MongoDB, creating endpoints and models for Users, Courses, and Lecture Materials — including lecture titles, transcripts, Google Slides, user-typed notes, audio recordings, and AI-enhanced notes.
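As an illustration of the lecture-material shape described above, here is a minimal sketch using stdlib dataclasses. Field names are our guesses; in the real backend these would be Pydantic models served by FastAPI and persisted to MongoDB:

```python
from dataclasses import dataclass, field

@dataclass
class LectureMaterial:
    # Illustrative shape only; field names are assumptions.
    title: str
    transcript: str = ""
    slides_url: str = ""          # link to the Google Slides deck
    user_notes: str = ""
    audio_recording_id: str = ""  # reference to the stored audio file
    enhanced_notes: str = ""      # AI-enhanced version of user_notes

@dataclass
class Course:
    name: str
    lectures: list = field(default_factory=list)

course = Course(name="CS 101")
course.lectures.append(LectureMaterial(title="Lecture 1: Graphs"))
```

Keeping the raw `user_notes` and the `enhanced_notes` as separate fields is what allows the original notes to be preserved untouched.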
For the AI enhancement, we leveraged Google Gemini to generate enriched user notes. We refined our prompts using techniques like prompt chaining, few-shot prompting, and chain-of-thought reasoning, allowing us to create contextually aware notes that integrate both the lecture transcript and the user's original notes.
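The prompt-chaining idea can be sketched as two stages: a first prompt condenses the transcript, and its output feeds a second prompt that enriches the student's notes. The wording below is illustrative, not our production prompts, and the actual calls go through the Google Gemini API:

```python
def build_enhancement_prompts(transcript, user_notes):
    """Two-stage prompt chain (sketch): summarize, then enhance."""
    # Stage 1: distill the raw transcript (chain-of-thought style cue).
    summarize_prompt = (
        "Summarize the key concepts in this lecture transcript, "
        "reasoning step by step:\n" + transcript
    )

    # Stage 2: built from stage 1's model output plus the user's notes.
    def make_enhance_prompt(summary):
        return (
            "Using the lecture summary below, enhance the student's notes "
            "with definitions, references, and suggested images. "
            "Preserve the student's original wording.\n"
            f"Summary:\n{summary}\n"
            f"Student notes:\n{user_notes}"
        )

    return summarize_prompt, make_enhance_prompt
```

Chaining this way keeps each prompt focused on one task, which we found easier to refine than a single monolithic prompt.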
Challenges we ran into
Some key challenges we faced included learning and adapting to new frameworks like Angular, handling large file and audio processing on both the MongoDB backend and the Angular frontend, and tuning our Google Gemini prompts to accurately enhance user notes while also sourcing relevant images.
Accomplishments that we're proud of
We're proud to have built a tool that truly supports students in their learning journey. We successfully implemented live audio transcription, enhanced user-generated notes with AI, and created a platform that keeps all lecture materials organized in one place.
What we learned
Throughout this project, we gained valuable experience in prompt engineering and working with generative AI using Google Gemini. We also learned how to build with new frameworks like Angular for the frontend and FastAPI for the backend.
What's next for NoteLab
We plan to add auto-generated quizzes based on users' notes and on the AI-enhanced notes produced from the live audio transcription. We also aim to introduce citation features that link to specific segments of the lecture audio, and to develop a chatbot that helps users explore lecture topics more deeply by referencing and citing the professor's voice from the recorded audio.