Inspiration
As a frequent runner, I often injure my knee from overuse. But going to physical therapy is a hassle in itself when I'm already balancing school, searching for co-ops, and keeping on top of my training schedule. One day, as I was scrolling through Strava, a thought came to me: "Why can't rehab feel as slick as my fitness apps?" That frustration sparked RehabPal, an app that blends computer vision, fast feedback, and friendly AI so anyone can rehab smarter, anywhere.
What it does
RehabPal makes physical therapy easy to access. We know you're busy, so there's no need to make that trip into the clinic! Simply open the app and have your doctor's tailored exercises sent directly to your phone. We even have a built-in form checker to make sure you're well on your way to recovery.
How we built it
The mobile shell is React Native with Expo Router, Vision-Camera for 60 fps capture, and local caching via FileSystem. A FastAPI backend talks to Postgres for metadata and MinIO for raw video. YOLOv11-Pose tracks 17 joints per frame; a Python worker scores form in real time, then prompts Gemini 2.0 to return JSON scores and plain-English tips. Firebase Auth custom claims keep doctors and patients in their own lanes, and a push-notification hook lets users know when feedback is ready.
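The real-time scoring logic isn't shown above, but the core idea is simple: compute joint angles from the 17 pose key-points and compare them against the reference exercise. Here's a minimal pure-Python sketch of that idea, assuming COCO-style key-point ordering (the `joint_angle` helper and index choices are illustrative, not RehabPal's actual code):

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by points a-b-c, each an (x, y) pair."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos))

# COCO key-point indices for the left leg: hip=11, knee=13, ankle=15
def knee_angle(keypoints):
    """Left-knee flexion angle from a 17-point key-point list."""
    return joint_angle(keypoints[11], keypoints[13], keypoints[15])
```

The worker would compare angles like this against the doctor's reference clip, frame by frame, to decide whether form is on track.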
Challenges we ran into
Getting the doctor's reference clip and patient camera perfectly in sync (and invisible after the 5-second countdown) took more coffee than code. We also hit plenty of Expo build and compilation errors. At one point we spent almost an hour trying to fix a line that was correct but still failed at runtime.
Accomplishments that we're proud of
We spun up a FastAPI service, Postgres + MinIO storage, and a Vision-Camera React Native app that all talk to each other in real time, which was no easy task. Every video a doctor uploads is auto-processed with YOLOv11-Pose; we store key-points for every 5th frame and stream feedback to the patient while they exercise. We're also proud of our Gemini-powered coaching: we wired Google Gemini 2.0 into the feedback loop so patients get an instant 0-100 "form score" plus three actionable tips, all generated from raw joint data.
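The every-5th-frame key-point storage boils down to a simple stride filter over the pose results. A minimal sketch (the `sample_keypoints` helper is hypothetical, not our exact code):

```python
def sample_keypoints(frames_keypoints, stride=5):
    """Keep the key-points for every `stride`-th frame, tagged with its frame index.

    frames_keypoints: one key-point list per video frame, in order.
    Returns a list of {"frame": index, "keypoints": ...} records for storage.
    """
    return [
        {"frame": i, "keypoints": kps}
        for i, kps in enumerate(frames_keypoints)
        if i % stride == 0
    ]
```

Storing only every 5th frame keeps the database small while still capturing enough motion for scoring.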
What we learned
We learned that passing Gemini clean JSON key-points slashed token usage and produced razor-sharp advice. YOLO taught us how to reason about coordinate space, confidence scores, and why 5 fps is the sweet spot for rehab. We also picked up new tools such as react-native-vision-camera, which let us push frame-by-frame feedback with under 100 ms latency; patients see a green indicator the instant they fix their form.
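"Clean JSON key-points" in practice meant rounding coordinates and dropping low-confidence joints before the prompt ever reached Gemini. A minimal sketch of that trimming step (the `compact_payload` name and the 0.5 threshold are illustrative assumptions):

```python
import json

def compact_payload(keypoints, conf_threshold=0.5):
    """Build a token-lean JSON string from per-joint (x, y, confidence) tuples.

    Rounds coordinates to one decimal and drops joints below the
    confidence threshold, both of which cut the prompt's token count.
    """
    return json.dumps(
        {
            str(i): [round(x, 1), round(y, 1)]
            for i, (x, y, c) in enumerate(keypoints)
            if c >= conf_threshold
        },
        separators=(",", ":"),  # no spaces: fewer characters, fewer tokens
    )
```

Sending this compact string instead of raw float arrays is what kept our per-request token usage low.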
What's next for RehabPal
We hope to integrate with wearable devices such as heart rate monitors and movement sensors so RehabPal can provide richer feedback to doctors. We also want to add a quick form that patients can fill out after each session rating how they found the exercise (e.g., too hard, too easy). This will help doctors gather even more information to refine their exercise plans.
Built With
- expo.io
- fastapi
- firebase
- gemini
- minio
- postgresql
- python
- react-native
- typescript
- yolo