🩺 Inspiration
Medical reports are filled with complex terminology, often leaving patients confused and anxious. Not everyone has the health literacy or access to doctors to fully understand these documents. We created MediSpeak to bridge that gap using AI, making medical reports simple, spoken, and interactive.
🛠️ What it does
MediSpeak allows users to:
- Upload medical reports
- Receive an instant plain-language summary
- Get the summary read out loud using voice AI
- Ask follow-up questions via a chatbot
- View and manage past summaries via a saved reports dashboard
🧠 How we built it
We used Bolt.new as the no-code foundation. Key integrations include:
- OpenAI GPT-4 — for understanding and simplifying medical reports
- ElevenLabs Voice AI — for reading summaries aloud
- Supabase — for authentication and saving reports
- Netlify — for deployment

The UI was designed for accessibility: clean layout, responsive design, and an easy-to-use flow.
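The core pipeline above (report text → GPT-4 summary → ElevenLabs audio) can be sketched in plain Node.js. This is a minimal illustration, not the actual Bolt.new implementation: function names and the system prompt are our own, while the endpoints and payload shapes follow the public OpenAI and ElevenLabs REST APIs.

```javascript
// Build the chat messages sent to GPT-4 (pure function, hypothetical prompt).
function buildSummaryPrompt(reportText) {
  return [
    {
      role: "system",
      content:
        "Rewrite the medical report in plain language a patient can understand. " +
        "Explain terminology simply; do not add new diagnoses.",
    },
    { role: "user", content: reportText },
  ];
}

// Ask GPT-4 for a plain-language summary via the Chat Completions API.
async function summarizeReport(reportText, openaiKey) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${openaiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-4",
      messages: buildSummaryPrompt(reportText),
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

// Convert the summary to speech with ElevenLabs; returns MP3 audio bytes.
async function speakSummary(summaryText, elevenLabsKey, voiceId) {
  const res = await fetch(
    `https://api.elevenlabs.io/v1/text-to-speech/${voiceId}`,
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "xi-api-key": elevenLabsKey,
      },
      body: JSON.stringify({ text: summaryText }),
    }
  );
  return res.arrayBuffer();
}
```

In the deployed app these calls run behind the no-code layer; keeping the prompt builder as a pure function makes the summarization behavior easy to test without hitting either API.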
⚙️ Challenges we ran into
- Balancing clarity with medical accuracy in summaries
- Making voice generation fast and reliable
- Managing multi-step flows in a no-code environment
- Ensuring private, secure handling of health-related documents
🌱 What we learned
- AI becomes truly impactful when paired with user-centric design
- Bolt.new can build production-ready, AI-integrated apps without writing code
- Accessibility features like voice and chatbot make a huge difference in real-world adoption
🚀 What's next
- Add multilingual support for Indian and other regional languages
- Integrate with telemedicine/clinic software
- Add a smart suggestion engine for report-specific follow-up questions
- HIPAA-compliant privacy layer (for production use)
Built With
- ai
- bolt.new
- gpt-4
- javascript
- netlify
- node.js
- openai
- react
- supabase
- typescript