Inspiration
Healthcare can be stressful when something feels off, but not urgent enough for the ER. At that moment, most people are left with scattered online searches, uncertainty, and no clear sense of what to do next. We built Baymax to make that experience feel calmer, more supportive, and more actionable.
Baymax was inspired by the idea of making healthcare feel more human, accessible, and less stressful. We wanted to build something that could guide people through simple at-home health check-ins in a calm, friendly way, like having a supportive assistant always available. Instead of making users interpret symptoms on their own, we wanted Baymax to combine screening, history, and guidance into one experience. Our end goal (which we could not fully realize in this app) is to use Baymax to automate check-ins for doctors' appointments by analyzing data collected either at home or at stations in a pharmacy.
What it does
Baymax is a mobile health assistant that helps users complete guided self-checks, store their health information locally, and receive AI-supported feedback. Users can create a secure profile, track symptoms and screening history, and interact with an AI doctor chat for follow-up guidance. The app supports multiple screening flows such as throat checks, heart sound analysis, eyesight and eye tracking tests, hearing checks, reaction testing, and other guided wellness evaluations.
How we built it
We built Baymax as an Expo and React Native mobile app using TypeScript. We used the Gemini API for AI doctor-style responses and certain image-based analysis tasks, and added a local Ollama-powered chat backend for Baymax's care assistant experience. On the backend, we built Python-based ML services using Flask and FastAPI for specialized health analysis, including heart sound classification (a fine-tuned OpenAI Whisper model) and throat image prediction (a specialized CNN architecture). Together, these pieces form a full-stack mobile health demo that combines app development, AI integration, and custom model serving.
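As a rough sketch of how the care chat piece fits together, here is a minimal Python example of building a request for Ollama's `/api/chat` endpoint, which the app's chat backend talks to on the local machine. The model name and system prompt are illustrative placeholders, not the exact ones used in our app:

```python
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # default local Ollama endpoint


def build_chat_payload(user_message, history=None):
    """Build a non-streaming chat request body for Ollama's /api/chat endpoint.

    `history` is a list of prior {"role": ..., "content": ...} turns, so the
    assistant keeps context across the conversation. The model name and
    system prompt here are placeholders for illustration.
    """
    messages = [
        {
            "role": "system",
            "content": "You are Baymax, a calm and supportive personal care assistant.",
        }
    ]
    messages.extend(history or [])  # carry earlier turns forward
    messages.append({"role": "user", "content": user_message})
    return {
        "model": "llama3",   # placeholder; any locally pulled Ollama model works
        "messages": messages,
        "stream": False,     # request one complete JSON response instead of a stream
    }


# The payload would then be POSTed as JSON, e.g.:
#   requests.post(OLLAMA_CHAT_URL, json=build_chat_payload("My throat hurts"))
payload = build_chat_payload("My throat hurts a little, what should I check?")
```

In the app, the React Native client sends the user's message to our chat backend, which forwards a payload like this to the Ollama server running on the development machine.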
Challenges we ran into
One of the biggest challenges was integrating many different technologies into a single cohesive product. We had to connect a mobile frontend with local and remote AI services, manage device permissions for camera/audio access, and make sure the app could communicate with backend services running on local machines during development. We also had to balance ambitious health-related features with hackathon time constraints, especially when working with multiple AI and ML pipelines for chat, audio, and image analysis.
Accomplishments that we're proud of
We’re proud that Baymax feels like a real end-to-end product instead of a single-feature hackathon demo. We built local profiles, multiple health screening flows, saved history and doctor handoff features, a local Ollama care chat, and custom heart and throat classification models, then connected them so each part adds to the same patient experience rather than existing as a separate feature. What makes us most proud is that the app does not just show isolated AI or ML outputs; it turns them into something usable by carrying context across screenings, chat, and follow-up views in one cohesive system.
What we learned
We learned a lot about building end-to-end AI applications that span mobile, backend, and machine learning systems. This project gave us practical experience working with React Native and Expo, integrating Gemini and Ollama, serving ML models with Python APIs, and designing around real-world issues like latency, permissions, and local networking. We also learned how difficult and important it is to design health-related tools responsibly, especially when trying to keep the experience both useful and safe.
What's next for Baymax
Our next step is to make Baymax more accurate, polished, and production-ready. We want to improve model performance, strengthen the reliability of the screening flows, and refine the user experience across devices. We’d also like to expand the clinician handoff features by letting clinicians feed in data from more accurate tests, such as modern echocardiograms and EEGs, whenever patients do end up visiting hospitals. Eventually, we hope to use this platform to make healthcare better, cheaper, and more accessible by saving doctors’ time and letting people run reliable health check-ups on their own whenever they want. We also hope to improve personalization over time and explore how Baymax could evolve from a hackathon prototype into a more scalable health support platform.
Built With
- android
- expo.io
- fastapi
- flask
- google-gemini-api
- ios
- node.js
- ollama
- python
- pytorch
- react-native
- tensorflow
- typescript
- web