Inspiration
Every year, thousands of people delay seeking medical help simply because they don't realize how serious their symptoms are. In Ontario alone, patients can wait 20+ hours in ERs, and nearly 30% of appointments are missed, many due to forgetfulness. On top of that, millions of patients struggle to accurately describe their symptoms during medical visits. Many forget important details or face language barriers, leading to miscommunication, delayed diagnoses, and lower-quality care. Our inspiration came from the need to reduce these problems by creating a tool that helps people track their health, understand warning signs, and connect with care earlier, before things become emergencies.
What it does
Medical Mole is a multilingual, AI-powered health companion that helps patients track symptoms, analyze health trends, and clearly communicate their health status to healthcare providers. Key features include:
- Symptom Calendar: An intuitive, easy-to-use interface for logging daily symptoms, notes, and observations.
- AI Chatbot: Patients log daily medical updates in plain language; the AI extracts key symptoms and structures the data.
- Diagnosis Suggestion: A custom-trained machine learning model suggests potential diagnoses based on symptom patterns, duration, and severity.
- Doctor Summary Report: Generates a structured, doctor-ready summary of your symptoms to support diagnosis.
- Multilingual Support: Supports input and output in multiple languages to break down communication barriers.
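As a rough illustration of the calendar and summary features above, a day's symptom log and the doctor-ready rollup could be modeled like this. All names and fields here are illustrative sketches, not the actual Medical Mole schema:

```python
# Hypothetical sketch: model logged symptom entries and roll them up
# into a per-symptom summary (duration and peak severity) for a doctor.
from collections import defaultdict
from dataclasses import dataclass
from datetime import date

@dataclass
class SymptomEntry:
    day: date
    symptom: str
    severity: int  # e.g. 1 (mild) to 5 (severe)
    notes: str = ""

def doctor_summary(entries: list[SymptomEntry]) -> dict[str, dict]:
    """Group logged entries by symptom, reporting duration and peak severity."""
    grouped = defaultdict(list)
    for e in entries:
        grouped[e.symptom].append(e)
    summary = {}
    for symptom, logs in grouped.items():
        days = sorted(l.day for l in logs)
        summary[symptom] = {
            "first_seen": days[0].isoformat(),
            "last_seen": days[-1].isoformat(),
            "days_logged": len(days),
            "peak_severity": max(l.severity for l in logs),
        }
    return summary

entries = [
    SymptomEntry(date(2025, 8, 1), "headache", 2),
    SymptomEntry(date(2025, 8, 2), "headache", 4, "worse in the evening"),
    SymptomEntry(date(2025, 8, 2), "nausea", 3),
]
print(doctor_summary(entries)["headache"]["peak_severity"])  # 4
```

A summary in this shape is straightforward to translate or render into the multilingual report the app generates.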
The app improves clinical efficiency, enhances diagnostic accuracy, bridges language gaps, and empowers patients to take control of their health.
How we built it
We built Medical Mole with a modern tech stack. The front end is developed with React, Vite, and TypeScript, providing a fast, responsive, and type-safe user interface. The backend runs Python on a compact computer, which hosts our local machine learning models. To enable multilingual natural language processing, the app integrates Gemini, allowing it to understand and respond in multiple languages.
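The core of the backend pipeline is turning a free-text patient update into structured symptom records. A minimal sketch of that step is below; in the real app the extraction is handled by Gemini, so a naive keyword matcher stands in here purely to keep the example self-contained, and the symptom list and severity scale are made up for illustration:

```python
# Hypothetical sketch of the extraction step: free-text updates become
# structured records. A keyword matcher stands in for the Gemini call.
import re

KNOWN_SYMPTOMS = {"headache", "fever", "cough", "nausea", "fatigue"}
SEVERITY_WORDS = {"mild": 1, "moderate": 3, "severe": 5}

def extract_symptoms(update: str) -> list[dict]:
    """Return a structured record for each known symptom found in the text."""
    text = update.lower()
    records = []
    for symptom in sorted(KNOWN_SYMPTOMS):
        if re.search(rf"\b{symptom}\b", text):
            # Naive heuristic: take the first severity word found anywhere
            # in the update; a language model would attach it per symptom.
            severity = next(
                (v for w, v in SEVERITY_WORDS.items() if w in text), None
            )
            records.append({"symptom": symptom, "severity": severity})
    return records

print(extract_symptoms("I woke up with a severe headache today."))
```

Records in this shape feed both the symptom calendar and the diagnosis-suggestion model, regardless of which language the patient wrote in.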
Challenges we ran into
- Training the AI chatbot to accurately understand and switch between multiple languages in real-time was a significant technical hurdle.
- Striking a balance between an easy-to-use interface and powerful functionality—especially for symptom tracking and summary generation—required several design iterations.
- Time limitations meant we prioritized core features over advanced functionalities like full voice-to-text support.
Accomplishments that we're proud of
- Successfully integrated a multilingual AI assistant capable of understanding complex symptom descriptions.
- Created a user-friendly symptom calendar that patients can use to monitor their health trends over time.
- Implemented a report generation system that produces doctor-ready summaries with multilingual translation.
What we learned
One of the most important things we learned during this project is how often communication between patients and doctors breaks down—not due to a lack of concern, but because patients struggle to explain their symptoms clearly. People often forget key details, misinterpret their own symptoms, or face language barriers that make it difficult to express what they're experiencing. This can lead to delayed diagnoses, confusion, or even incorrect treatment plans. Through building Medical Mole, we saw how structured symptom tracking, natural conversation with an AI assistant, and translated reports can significantly improve the flow of information between patients and healthcare providers. By helping patients tell their stories more clearly—regardless of language or memory limitations—we can help close the gap in understanding and improve the overall quality of care.
What's next for Medical Mole - TerraHacks Submission
Medical Mole began as a simple vision: ensuring every patient is heard clearly, even when words fail. Over the hackathon weekend, we turned that vision into a functional solution with real potential to improve healthcare communication and outcomes. Moving forward, we aim to enhance multilingual capabilities, integrate with clinical systems, and develop advanced alert features—working towards a future where language and memory are no longer barriers to quality care.