🔥 Inspiration
This project was inspired by a personal need: a smart, interactive learning assistant to help with studies, revision, and concept clarity. I wanted to build something that could help me (and others) study more effectively, anytime and anywhere.
What I Learned
While building Study Buddy, I learned a lot about:
- Large Language Models (LLMs) and how to use them through Hugging Face Transformers
- Hugging Face Spaces and deploying apps with Gradio
- The basics of prompt engineering
- Using lightweight models like Phi-3-mini-4k-instruct to optimize performance on limited hardware
- Troubleshooting gated models, hardware compatibility, and runtime issues

It wasn't just a technical journey: I also learned how to design a clean, user-friendly interface and handle deployment errors effectively.
How I Built It
I used Gradio's ChatInterface to create a minimal, clean chatbot UI. The backend loads the microsoft/Phi-3-mini-4k-instruct model through Hugging Face's pipeline API. The app is deployed on Hugging Face Spaces on CPU hardware, keeping it light and accessible. I followed official documentation, GitHub examples, and community forums to learn step by step.
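As a rough sketch of how these pieces fit together (this is an assumed structure, since the project's actual app.py isn't shown here): Gradio's ChatInterface hands your function a message plus the conversation history, and the job is to convert that into the chat-message list the Transformers pipeline expects. The model call itself is left as comments so the sketch runs without downloading the model; the system prompt and the pair-style history format are my assumptions.

```python
# Sketch of the chat glue (assumed, not the project's actual code):
# Gradio's ChatInterface calls respond(message, history), where history
# is classically a list of (user, assistant) pairs; the Transformers
# chat pipeline expects a flat list of {"role", "content"} dicts.

def build_messages(message, history):
    # Seed the conversation with a system prompt (wording is illustrative)
    messages = [{"role": "system",
                 "content": "You are Study Buddy, a friendly study assistant."}]
    # Replay prior turns in order
    for user_turn, assistant_turn in history:
        messages.append({"role": "user", "content": user_turn})
        messages.append({"role": "assistant", "content": assistant_turn})
    # Append the new user message last
    messages.append({"role": "user", "content": message})
    return messages

# Wiring it up (left as comments so no model download happens here):
# from transformers import pipeline
# import gradio as gr
#
# pipe = pipeline("text-generation", model="microsoft/Phi-3-mini-4k-instruct")
#
# def respond(message, history):
#     out = pipe(build_messages(message, history), max_new_tokens=256)
#     return out[0]["generated_text"][-1]["content"]
#
# gr.ChatInterface(respond).launch()
```

Keeping the history-to-messages conversion in its own function makes it easy to test without a GPU or a model download.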
Challenges Faced
- Initially tried using Meta-LLaMA-3, but faced access (gated model) and hardware limitations.
- Solved configuration errors related to README.md metadata and gated models.
- Fixed dependency issues, such as a missing bitsandbytes package, and learned how to troubleshoot them.
- Iteratively debugged the app to fix runtime and loading errors.
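For the gated-model access errors mentioned above, the usual fix is to accept the model's license on the Hugging Face Hub and then authenticate the Space with an access token. A minimal sketch, assuming a Space secret named HF_TOKEN (the secret name is my choice here, not from the project):

```python
import os

# huggingface_hub ships alongside transformers on Spaces; guard the
# import so this snippet also runs where the library isn't installed.
try:
    from huggingface_hub import login
except ImportError:
    login = None

token = os.environ.get("HF_TOKEN")  # hypothetical Space secret name
if login is not None and token:
    # Authenticates the runtime so gated checkpoints (e.g. Meta-LLaMA-3)
    # can be downloaded once access has been granted on the Hub.
    login(token=token)
```

Without this step, downloads of gated checkpoints fail with a 401/403 error even after the license has been accepted in the browser.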
Built With
- bolt
- gradio
- huggingface
- phi-3-mini-4k-instruct (llm)
- python