🧠 Inspiration
Many people experience discomfort or early symptoms but hesitate to seek help due to uncertainty, anxiety, or lack of clear guidance. We wanted to bridge that gap by creating a tool that not only predicts potential conditions based on symptoms but also empowers users with actionable advice and communication support — especially for those preparing for a doctor visit.
💡 What it does
SymptoWise is an AI-powered symptom checker that allows users to:
- Select symptoms via a searchable, multi-select interface
- Receive top 3 condition predictions with confidence levels
- View detailed condition descriptions and immediate precautions
- Chat with an integrated LLM assistant that helps users phrase their symptoms and recommends next steps
The platform combines predictive modeling with natural language guidance to improve user confidence and pre-consultation clarity.
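The "top 3 predictions with confidence levels" flow can be sketched roughly as follows; the `CONDITIONS` list and the probability vector are illustrative stand-ins for the real model's label set and its `predict_proba`-style output, not the actual values SymptoWise uses:

```python
# Hypothetical label set; the real model's classes would come from training data.
CONDITIONS = ["Common Cold", "Migraine", "Allergy", "Gastritis"]

def top_predictions(probabilities, labels=CONDITIONS, k=3):
    """Pair each condition with its probability and return the k most likely."""
    ranked = sorted(zip(labels, probabilities), key=lambda pair: pair[1], reverse=True)
    return [{"condition": c, "confidence": round(p, 2)} for c, p in ranked[:k]]

print(top_predictions([0.12, 0.55, 0.08, 0.25]))
# -> [{'condition': 'Migraine', 'confidence': 0.55},
#     {'condition': 'Gastritis', 'confidence': 0.25},
#     {'condition': 'Common Cold', 'confidence': 0.12}]
```

The frontend can then render each entry as a card with a confidence bar.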
🛠️ How we built it
- Frontend: React + TypeScript + TailwindCSS for a modular and responsive card-based UI
- Charts & Animations: Recharts for visualization, Framer Motion for UI transitions
- Symptom Input: react-select for searchable, tag-style symptom selection
- Backend: Python API (FastAPI) serving a trained classification model (joblib)
- LLM Assistant: Ollama running a local instance of LLaMA 3 for generating care guidance
- Markdown Rendering: react-markdown for formatting chatbot responses
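On the backend side, talking to a local Ollama instance amounts to POSTing to its default chat endpoint. A minimal sketch, assuming Ollama is running on its default port with a `llama3` model pulled; the prompt wording and helper names here are our own illustration, not the exact production code:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

def build_chat_payload(symptoms, model="llama3"):
    """Assemble a non-streaming chat request for the local Ollama server."""
    prompt = (
        "A user reports these symptoms: " + ", ".join(symptoms) + ". "
        "Help them phrase this for a doctor and suggest sensible next steps."
    )
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # get one complete JSON response instead of a stream
    }

def ask_assistant(symptoms):
    """Send the request to Ollama and return the assistant's reply text."""
    data = json.dumps(build_chat_payload(symptoms)).encode()
    req = request.Request(OLLAMA_URL, data=data,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

payload = build_chat_payload(["headache", "light sensitivity"])
print(payload["model"], payload["stream"])
```

The reply text is markdown, which the frontend hands to react-markdown for rendering.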
🧩 Challenges we ran into
- Ensuring the ML model output was both accurate and user-friendly
- Handling local LLM responses with markdown formatting and scroll-safe UI
- Designing an interface that was intuitive, especially for non-technical or anxious users
- Efficiently passing state between components while keeping the UI reactive and modular
🏆 Accomplishments that we're proud of
- Building a fully working AI health assistant in just a few days
- Integrating a local LLM (Ollama) into the frontend for real-time care guidance
- Designing a smooth, mobile-friendly UI that feels polished and supportive
- Keeping both predictive accuracy and user trust central to our experience
📚 What we learned
- How to integrate Ollama locally into a React-based frontend
- Best practices for combining ML predictions with conversational UX
- Importance of thoughtful UX design in health tools — especially around clarity, tone, and accessibility
- Managing and visualizing confidence scores in an interpretable way
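One simple pattern for making raw probabilities interpretable is to bucket them into user-facing bands; the thresholds below are illustrative, not the calibrated values we shipped:

```python
def confidence_band(probability):
    """Map a raw class probability to a user-facing confidence label.
    Thresholds are illustrative; tune them against the model's calibration."""
    if probability >= 0.7:
        return "High"
    if probability >= 0.4:
        return "Moderate"
    return "Low"

print([confidence_band(p) for p in (0.82, 0.55, 0.2)])  # -> ['High', 'Moderate', 'Low']
```

Showing a band alongside the numeric score keeps the chart readable for non-technical users while preserving the underlying detail.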
🚀 What's next for SymptoWise – AI Symptom Checker
- 🗣️ Add voice-to-text support for symptom input
- 🤖 Fine-tune the LLM to provide more medically grounded advice
- 📱 Launch a PWA version for mobile access
- 📄 Export personalized care summaries as PDFs
- 🌐 Multilingual support for greater accessibility
- 🧑‍⚕️ Build an API for clinics to integrate SymptoWise into virtual triage flows
Built With
- axios
- csv
- fastapi
- framer-motion
- joblib
- llama-3
- node.js
- ollama
- pandas
- python
- react
- react-markdown
- react-select
- recharts
- tailwind-css
- typescript
- vite