🩺 Inspiration

Medical reports are filled with complex terminology, often leaving patients confused and anxious. Not everyone has the health literacy or access to doctors to fully understand these documents. We created MediSpeak to bridge that gap with AI, making medical reports simple, spoken, and interactive.

πŸ› οΈ What it does

MediSpeak allows users to:

  • Upload medical reports
  • Receive an instant plain-language summary
  • Get the summary read out loud using voice AI
  • Ask follow-up questions via a chatbot
  • View and manage past summaries via a saved reports dashboard
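Under the hood, the feature list above forms a simple pipeline. Here is a minimal sketch of that flow; the real app is assembled visually in Bolt.new, so every function name below is a hypothetical stand-in rather than actual project code:

```python
# Hypothetical sketch of MediSpeak's upload-to-dashboard flow.
# The real integrations (GPT-4, ElevenLabs, Supabase) are wired up
# inside Bolt.new; these placeholders only show the order of steps.

def summarize(report_text: str) -> str:
    """Stand-in for the GPT-4 call that produces a plain-language summary."""
    return f"Plain-language summary of: {report_text}"

def speak(summary: str) -> bytes:
    """Stand-in for the ElevenLabs text-to-speech call (returns audio bytes)."""
    return summary.encode("utf-8")

def save_report(user_id: str, report_text: str, summary: str) -> dict:
    """Stand-in for the Supabase insert backing the saved-reports dashboard."""
    return {"user_id": user_id, "report": report_text, "summary": summary}

def process_upload(user_id: str, report_text: str) -> dict:
    """Run one uploaded report through summary, voice, and storage."""
    summary = summarize(report_text)
    audio = speak(summary)
    record = save_report(user_id, report_text, summary)
    return {"summary": summary, "audio": audio, "record": record}
```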

🧠 How we built it

We used Bolt.new as the no-code foundation. Key integrations include:

  • OpenAI GPT-4 – for understanding and simplifying medical reports
  • ElevenLabs Voice AI – for reading summaries aloud
  • Supabase – for authentication and saving reports
  • Netlify – for deployment

The UI was designed for accessibility: clean layout, responsive design, and an easy-to-use flow.
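As a rough illustration of the GPT-4 integration, the summarization step boils down to a chat-completion request like the one sketched here. This is a hedged example: the exact prompt and request wiring live inside Bolt.new, and the function name and prompt wording are assumptions.

```python
def build_summary_messages(report_text: str) -> list[dict]:
    """Build a chat-completion message list asking GPT-4 for a
    patient-friendly summary of a medical report (illustrative only)."""
    system = (
        "You are a medical explainer. Rewrite the report below in plain, "
        "patient-friendly language. Do not diagnose; preserve all stated findings."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": report_text},
    ]
```

The resulting summary text is then passed to the ElevenLabs voice step and stored via Supabase, as described above.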

βš”οΈ Challenges we ran into

  • Balancing clarity with medical accuracy in summaries
  • Making voice generation fast and reliable
  • Managing multi-step flows in a no-code environment
  • Ensuring private, secure handling of health-related documents

🌱 What we learned

  • AI becomes truly impactful when paired with user-centric design
  • Bolt.new can build production-ready, AI-integrated apps without writing code
  • Accessibility features like voice and chatbot make a huge difference in real-world adoption

πŸš€ What’s next

  • Add multilingual support for Indian and other regional languages
  • Integrate with telemedicine/clinic software
  • Add a smart suggestion engine for report-specific follow-up questions
  • HIPAA-compliant privacy layer (for production use)
