Inspiration
We were inspired by the gap between complex lab reports and patients’ ability to understand them. People often leave hospitals confused by numbers without clear meaning or actionable next steps. We wanted to bridge that gap with AI.
What it does
NutriScope AI extracts values from lab PDFs, interprets out-of-range results, and generates clear explanations. It then suggests a personalized meal plan—powered by retrieval-augmented generation (RAG)—to help users take actionable steps toward better health.
How we built it
We used Flask for the backend, React + Material UI for the frontend, and integrated OCR to parse uploaded PDFs. Out-of-range values are matched against a knowledge base (FAISS + Sentence Transformers) for supporting context, and Groq LLMs generate the AI-driven summaries and the meal plan.
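To illustrate the knowledge-base lookup, here is a minimal sketch of the retrieval step. The real pipeline uses Sentence Transformers for embeddings and a FAISS index; this stand-in uses a toy hashing embedder and brute-force cosine similarity (both hypothetical simplifications) so the flow is runnable without those libraries:

```python
# Sketch of the retrieval step behind the knowledge-base lookup.
# Toy hashing embedder + brute-force cosine similarity stand in for
# Sentence Transformers + FAISS; the flow is the same.
import math

def embed(text: str, dim: int = 64) -> list[float]:
    """Toy embedding: hash each token into a fixed-size vector."""
    vec = [0.0] * dim
    for tok in text.lower().split():
        vec[hash(tok.strip(".,")) % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k docs most similar to the query (cosine similarity)."""
    q = embed(query)
    scored = [(sum(a * b for a, b in zip(q, embed(d))), d) for d in docs]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [d for _, d in scored[:k]]

kb = [
    "High LDL cholesterol is associated with cardiovascular risk.",
    "Low hemoglobin can indicate anemia.",
    "Elevated fasting glucose may suggest prediabetes.",
]
context = retrieve("low hemoglobin", kb, k=1)
```

The retrieved passages are then fed to the LLM as grounding context, which is what keeps the generated explanations tied to medically relevant sources rather than free-form generation.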
Challenges we ran into
- Setting up reliable OCR parsing across different lab formats.
- Ensuring accurate flagging of high/low/borderline values.
- Integrating Groq’s API for context-aware responses while keeping performance fast.
- Managing async tasks and data flow between frontend and backend.
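The flagging challenge above comes down to classifying each value against its reference range. A minimal sketch of that rule follows; the hemoglobin range and the 10% borderline margin are illustrative placeholders, not the app's actual clinical thresholds:

```python
# Sketch of reference-range flagging. The ranges and the 10% borderline
# margin are illustrative placeholders, not the app's clinical thresholds.

def flag_value(value: float, low: float, high: float,
               margin: float = 0.10) -> str:
    """Classify a lab value against its reference range.

    Values outside [low, high] are 'low'/'high'; in-range values within
    `margin` of either bound are 'borderline'; the rest are 'normal'.
    """
    if value < low:
        return "low"
    if value > high:
        return "high"
    band = (high - low) * margin
    if value <= low + band or value >= high - band:
        return "borderline"
    return "normal"

# Example: hemoglobin, illustrative reference range 13.5-17.5 g/dL
flag_value(12.9, 13.5, 17.5)   # "low"
flag_value(13.6, 13.5, 17.5)   # "borderline"
flag_value(15.0, 13.5, 17.5)   # "normal"
```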
Accomplishments that we're proud of
- A smooth, modern UI with easy uploads and instant reports.
- End-to-end AI pipeline: OCR → Analysis → RAG → Summaries + Meal Plan.
- Achieving personalized, medically relevant outputs instead of generic text.
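The end-to-end pipeline above can be sketched as a chain of stages. All stage bodies here are stubs for illustration; in the real app these steps run as Celery tasks and call into PyMuPDF/Tesseract, the flagging logic, FAISS, and Groq:

```python
# Sketch of the OCR -> Analysis -> RAG -> Summary pipeline as a chain
# of stages. Every stage body is a stub; the real stages are Celery
# tasks backed by PyMuPDF/Tesseract, value flagging, FAISS, and Groq.
from functools import reduce

def ocr(pdf_path):            # extract raw text from the report
    return {"source": pdf_path, "text": "Hemoglobin 12.9 g/dL"}

def analyze(doc):             # parse values and flag out-of-range ones
    doc["flags"] = [("Hemoglobin", 12.9, "low")]
    return doc

def retrieve_context(doc):    # look up knowledge-base passages (RAG)
    doc["context"] = ["Low hemoglobin can indicate anemia."]
    return doc

def summarize(doc):           # LLM summary + meal plan would go here
    doc["summary"] = f"{len(doc['flags'])} value(s) need attention."
    return doc

PIPELINE = [ocr, analyze, retrieve_context, summarize]

def run(pdf_path):
    return reduce(lambda doc, stage: stage(doc), PIPELINE, pdf_path)

report = run("labs.pdf")
```

Keeping each stage a pure function over a shared document dict made it straightforward to run them asynchronously and to test each step in isolation.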
What we learned
- The importance of explainability in AI—users want clarity, not jargon.
- How to design human-centered healthtech with trust and usability in mind.
- Effective collaboration across backend, frontend, and AI systems in limited time.
Built With
- celery
- docker
- flask
- github
- groq-llms
- javascript
- jwt-authentication
- material-ui
- postgresql
- pymupdf
- python
- react
- redis
- retrieval-augmented-generation-(rag)
- sentence-transformers
- sqlalchemy
- sqlite
- tesseract-ocr
- typescript
- vite
- zustand
