Inspiration
One of us has a family member who went to a cardiology appointment last year and came home completely confused. They had nodded along to everything the doctor said, but when they got home they didn't really understand what was wrong with them or what they were supposed to do about it. They were too embarrassed to call back and ask, so they just guessed. That really bothered us. When we found out that up to 80% of medical information is forgotten the moment patients leave the doctor's office, it all made sense. People aren't forgetting because they're careless. They're forgetting because medical language is confusing, appointments are stressful, and nobody is there to help them understand afterward. We wanted to fix that, so we built MedBridge.
What it does
MedBridge records your doctor visit and turns it into something you can actually understand. After your appointment, it gives you a plain-language summary of everything that was said, a step-by-step action plan with your medications and next steps, and an interactive body diagram where you can tap affected areas to learn more. We have a feature called "Dumb It Down Mode" that takes complicated medical terms and rewrites them in simple, everyday language. There's also an AI assistant you can ask follow-up questions to anytime, like "can I still play sports?" or "what happens if I miss a dose?" We also built a Symptom Journal where you can log how you're feeling each day between appointments, and a Read It To Me button that reads your whole summary out loud. That last feature is really helpful for elderly patients or anyone who just wants to listen on the drive home. Every summary also comes with a clear disclaimer that MedBridge is meant to support your understanding, not replace your doctor.
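The Symptom Journal boils down to a small daily-entry data model. Here is a minimal sketch of the idea (field names and the 1–5 severity scale are our illustration here, not the exact production schema):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class JournalEntry:
    """One day's self-reported entry in the Symptom Journal (illustrative schema)."""
    day: date
    symptoms: list[str]
    severity: int  # 1 (mild) to 5 (severe), self-reported
    note: str = ""

def worst_days(entries: list[JournalEntry], threshold: int = 4) -> list[date]:
    """Pick out the days worth mentioning at the next appointment."""
    return [e.day for e in entries if e.severity >= threshold]
```

Keeping entries this simple is what lets the AI assistant reference them later when answering follow-up questions between appointments.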
How we built it
We used a lot of different tools to bring MedBridge to life, and figuring out how they all fit together was honestly one of the coolest parts. For the AI brain of the app we used Claude and Gemini to power our plain-language summaries and Dumb It Down Mode, with OpenRouter managing and switching between models depending on what each task needed, which made our AI layer a lot more flexible. AssemblyAI handled audio transcription and did a really impressive job picking up medical vocabulary and filtering out background noise from doctor visit recordings.

For the actual coding we worked in VS Code and leaned heavily on 21 dev tools to speed up our workflow. n8n connected all our different services and automated the flow of data between them, like taking a finished transcript and automatically sending it through our summarization pipeline. We also used Antigravity during the build, which helped us move faster on some of the more complex parts of the app, and AutoML let us add smarter features on the machine learning side without building models completely from scratch.

On the design side we used Figma to plan out our UI before building it, and Framer to bring some of those designs to life quickly. We wanted the interface to feel clean and approachable, especially for elderly users, so having proper design tools made a big difference. Sketchfab is where we sourced the 3D body model for our body visualization feature, which lets patients tap on affected areas to understand what is going on with their health, and Stitch helped us pull the different parts of our frontend together cleanly. Looking back, it is kind of amazing how much we were able to build in one hackathon by connecting all these tools in the right way.
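To make the pipeline concrete: once AssemblyAI returns a transcript, the summarization step is essentially one chat-completions request routed through OpenRouter (which speaks the standard OpenAI-style format). This is a simplified sketch; the prompt wording, model slug, and function name are illustrative, not our exact production code:

```python
# Illustrative sketch of building the OpenRouter request for a
# plain-language summary. In production this payload would be POSTed
# to OpenRouter's chat-completions endpoint with an API key.

PLAIN_LANGUAGE_PROMPT = (
    "You are a medical explainer. Rewrite the visit transcript below in "
    "plain, everyday language for the patient. Do not change any medication "
    "names, doses, or instructions. End with a one-line reminder that this "
    "summary supports, but does not replace, the doctor's advice."
)

def build_summary_request(transcript: str,
                          model: str = "anthropic/claude-3.5-sonnet") -> dict:
    """Assemble the chat-completions payload for the summarization step."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": PLAIN_LANGUAGE_PROMPT},
            {"role": "user", "content": transcript},
        ],
        "temperature": 0.2,  # low temperature: we want faithful, not creative
    }
```

Because OpenRouter only needs the `model` string changed to switch providers, falling back from Claude to Gemini is a one-line change, which is what made the AI layer feel so flexible.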
Challenges we ran into
The hardest part was making sure the AI simplified things accurately. Medical jargon is complicated, and we had to be really careful that when we translated something into plain language we weren't accidentally changing the meaning. That took a lot of prompt engineering and testing across both Claude and Gemini to get right. Getting transcription to work well was also tricky, because real doctor visits have a lot of background noise and people sometimes talk over each other. AssemblyAI handled most of it, but we still had to tune things carefully for medical vocabulary specifically. We also really struggled with the design: we wanted it to be simple enough for an elderly person to use without any instructions, but we kept making it too complicated on the first few tries. Connecting all our different tools through n8n was another challenge, because with so many moving parts, getting data to flow correctly between them took a lot of troubleshooting. Overall, every challenge we ran into reminded us how much responsibility comes with building something in the healthcare space.
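One lightweight example of the kind of accuracy check we mean: before showing a simplified summary, verify that no dose or frequency quietly disappeared during rewriting. This is an illustrative sketch of the idea, not our exact pipeline code:

```python
import re

# Matches number + unit tokens like "10 mg", "2.5 ml", "3 times".
DOSE_PATTERN = re.compile(
    r"\d+(?:\.\d+)?\s*(?:mg|mcg|ml|g|units?|times?)", re.IGNORECASE
)

def doses_preserved(original: str, simplified: str) -> bool:
    """Cheap guardrail: every dose token found in the original transcript
    must still appear verbatim in the simplified summary. If one is
    missing, the summary gets flagged for regeneration."""
    original_doses = DOSE_PATTERN.findall(original.lower())
    simplified_lower = simplified.lower()
    return all(dose in simplified_lower for dose in original_doses)
```

A check like this is deliberately conservative: a false alarm just costs one regeneration, while a silently dropped dose could cost a patient much more.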
Accomplishments that we're proud of
Honestly, we are just proud that it actually works. You can upload audio from a doctor visit and, in under 30 seconds, get a plain-language summary, an action plan, and an interactive 3D body diagram, all ready to go. We are really proud of the Symptom Journal because it turned MedBridge from something you use once into something you keep coming back to between appointments. The Read It To Me feature is something we are proud of too: it was pretty simple to build, but it makes the app feel genuinely caring, especially for users who struggle with reading. We are also proud of how we connected so many different tools into one smooth experience. Using Claude, Gemini, AssemblyAI, n8n, Sketchfab, and all the rest together felt like building something way bigger than a typical hackathon project. More than anything, though, we are proud that we built something we would actually give to our own family members and trust them to use.
What we learned
We learned that building something in healthcare is a lot more serious than we expected going in. Every decision we made had a real person on the other end of it, and that made us think a lot harder about every feature and every word. We learned how to work together under pressure, make decisions quickly, and let go of ideas that weren't working even if we liked them. We also learned a ton technically. Before this we had never used half of these tools and figuring out how to connect them all together taught us more in one weekend than we expected. The biggest thing we learned is that the best products start with a real person in a real moment of confusion, and you work backwards from there. We came in wanting to build something cool and we left wanting to build something that actually helps people. That shift felt important.
What's next for MedBridge
Right now our focus is making sure the core product is accurate and trustworthy. We want to keep refining Dumb It Down Mode and make sure every AI response is safe and reliable. After that, we want to pilot with real hospitals and clinics, add support for multiple languages so we can help patients who aren't native English speakers, and eventually integrate with Electronic Health Record (EHR) systems so MedBridge fits naturally into how doctors and patients already interact. Further down the road, we want to add predictive features that can spot concerning health trends over time, before they become serious problems. The long-term goal is simple: we want every patient, no matter their age, language, or background, to leave every doctor's appointment actually understanding what is going on with their health. We think that should be a basic right, and MedBridge is how we start making it one.