Inspiration

We kept thinking about how many people in underserved communities don't know where to turn when health issues pop up. Picture a parent at midnight worried about their kid's rash, or someone feeling chest pain but unsure if it's serious enough for the ER. These moments of uncertainty can lead to delayed care or expensive, unnecessary visits. We wanted to build something that could help anyone with a smartphone get quick, reliable guidance about their health concerns and point them toward the right kind of care.

What it does

HealthFindr AI is basically a smart medical triage assistant in your pocket. You can either upload a photo of a skin condition or describe your symptoms in plain text. The system analyzes what you share, figures out how serious it might be, and tells you whether you should head to a primary care clinic, urgent care, a specialist, or the emergency room. But here's the cool part: it doesn't just give advice and leave you hanging. It automatically creates a Google Maps link to find the recommended type of facility near you, so you can actually take action right away.

How we built it

We went with a serverless setup using Deno, which keeps things fast and scalable. The backend has two main functions: one for analyzing images and another for processing symptom descriptions. Both tap into Google's Gemini 2.5 Flash model, which is great at understanding both images and natural language. On the image side, users upload photos along with their location, and the AI examines them like a doctor would, assessing severity and making recommendations. For symptoms, we taught the AI to sort concerns into mild, moderate, and severe categories, each linked to a different facility type. The frontend is built with TypeScript, HTML, and CSS to keep the interface clean and conversational. We wanted it to feel supportive, not cold and clinical.

Challenges we ran into

Getting the AI to consistently nail severity levels was harder than we expected.
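As a rough illustration of the severity-to-facility mapping and automatic map link described above, a sketch might look like this. The names, facility strings, and function are our own assumptions, not the actual implementation; the URL follows the documented Google Maps URL API "search" action.

```typescript
// Illustrative sketch: severity categories and facility types are assumptions.
type Severity = "mild" | "moderate" | "severe";

const FACILITY: Record<Severity, string> = {
  mild: "primary care clinic",   // routine, non-urgent concerns
  moderate: "urgent care",       // same-day care, not life-threatening
  severe: "emergency room",      // potentially life-threatening
};

// Turn "you should see a doctor" into "here's where to go":
// build a Google Maps search link for the recommended facility type.
function mapsLink(severity: Severity, location: string): string {
  const query = encodeURIComponent(`${FACILITY[severity]} near ${location}`);
  return `https://www.google.com/maps/search/?api=1&query=${query}`;
}
```

For example, `mapsLink("moderate", "Austin, TX")` produces a link searching for urgent care near that location.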
Sometimes it would send everyone to the ER just to be safe; other times it was too relaxed about genuinely concerning symptoms. We went through many rounds of tweaking our prompts to find the sweet spot between being helpful and appropriately cautious. Parsing the AI's conversational responses into structured data was another headache. Since the model talks naturally rather than spitting out categories, we had to build keyword detection that could spot urgency without overreacting to every mention of "pain." Getting the Google Maps integration smooth also took time: making sure addresses were formatted correctly and facility types matched real search terms required several tries.

Accomplishments that we're proud of

We're really happy that HealthFindr AI handles both images and text, giving people options for how they want to communicate. The automatic map links feel like a real breakthrough: turning "you should see a doctor" into "here's where to go" makes a huge difference. Our severity detection has gotten fairly sophisticated too, catching urgent situations while staying calm and compassionate. But honestly, what we're most proud of is creating something that could genuinely help people who don't have easy access to healthcare. It works on any device, doesn't require medical knowledge, and provides guidance exactly when someone needs it most.

What we learned

Building healthcare AI taught us that accuracy is only half the battle; trust and empathy matter just as much. Every word in our prompts influences how the AI responds to someone who might be scared or hurting, so we had to be incredibly thoughtful about our language. We also gained massive respect for medical professionals. Triaging symptoms looks simple from the outside, but it involves complex decision-making that takes years of training. Our system can help, but we learned to be humble about what it can and can't do, and to always push people toward professional care.
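The keyword detection described in the challenges above, which has to spot urgency without firing on every mention of "pain," could be sketched roughly like this. The phrase list and function name are illustrative assumptions, not the project's real code.

```typescript
// Illustrative sketch: match specific urgent phrases in the model's
// conversational reply instead of reacting to the bare word "pain".
const URGENT_PHRASES = [
  "chest pain",
  "difficulty breathing",
  "severe bleeding",
  "loss of consciousness",
  "go to the er",
];

function detectUrgency(reply: string): "urgent" | "non-urgent" {
  const text = reply.toLowerCase();
  return URGENT_PHRASES.some((p) => text.includes(p)) ? "urgent" : "non-urgent";
}
```

Matching multi-word phrases rather than single words is one simple way to keep "mild muscle pain" from triggering an ER recommendation while still catching "chest pain."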
On the tech side, the serverless architecture proved a great fit for this kind of application. It can scale to thousands of simultaneous users without us worrying about servers crashing, which is crucial for a tool meant to help vulnerable populations.

What's next for HealthFindr AI

We've got big plans. First up is multilingual support, so language doesn't block anyone from getting help. We're also looking into partnerships with community health clinics and free medical services so we can recommend genuinely affordable options, not just any facility. A follow-up feature that checks in after visits would help us improve over time. We'd also love to add telemedicine integration, connecting users with volunteer doctors or video consultations for people who can't easily travel to appointments. Longer term, we're thinking about tracking anonymized health patterns to spot disease outbreaks in underserved areas, and about expanding beyond skin conditions to specialized care for kids, elderly patients, and chronic conditions. HealthFindr AI started as a triage tool, but we see it growing into a full healthcare access platform for the communities that need it most.