Inspiration
We wanted to create something inclusive — a fitness app that works for people who are visually impaired. Most workout apps rely on screens, which just isn’t accessible. So we built MoveMate to give people the ability to work out safely, confidently, and independently using only their voice and movement.
What it does
MoveMate is a voice-guided workout web app for the visually impaired. It uses audio instructions, voice commands, and real-time body tracking to help users complete basic exercises like jumping jacks, squats, and knee raises — all hands-free and screen-free.
How we built it
We used HTML, CSS, and JavaScript for the frontend. For voice input, we used the Web Speech API. For pose detection and body tracking, we integrated TensorFlow.js with MoveNet. The UI was designed with accessibility in mind: high contrast, large fonts, and audio-first flow.
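The body-tracking side can be sketched as a small pure function over MoveNet-style keypoints (each keypoint is `{ name, x, y, score }`, with `y` growing downward in image space, as returned by TensorFlow.js pose detection). The threshold and the single-keypoint choice here are illustrative assumptions, not the app's tuned values:

```javascript
// Minimal rep-counting sketch over MoveNet-style keypoints.
// Assumes keypoints shaped like { name, x, y, score }, y increasing downward.

function getY(keypoints, name) {
  const kp = keypoints.find((k) => k.name === name);
  return kp ? kp.y : null;
}

// A squat counts as "down" once the hip has dropped close to the knee.
// `closeness` (in pixels) is an illustrative threshold.
function isSquatDown(keypoints, closeness = 40) {
  const hipY = getY(keypoints, "left_hip");
  const kneeY = getY(keypoints, "left_knee");
  if (hipY === null || kneeY === null) return false;
  return kneeY - hipY < closeness;
}

// Counts one rep on each down -> up transition across a stream of frames.
function countSquats(frames) {
  let reps = 0;
  let down = false;
  for (const keypoints of frames) {
    const d = isSquatDown(keypoints);
    if (down && !d) reps += 1; // stood back up: one full rep
    down = d;
  }
  return reps;
}
```

In the real app the frames would come from the detector's `estimatePoses(video)` loop; keeping the rep logic pure like this makes it easy to test and to read aloud ("five... six...") without touching the DOM.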
Challenges we ran into
- Working around browser autoplay policies, which block audio until the user interacts with the page
- Delaying speech recognition until audio instructions were finished
- Designing for people who can’t rely on visuals
- Handling real-time body tracking without making things too complex or distracting
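The first two challenges come down to a speak-then-listen handoff using the standard Web Speech API objects (`speechSynthesis`, `SpeechSynthesisUtterance`, `SpeechRecognition`). A minimal sketch, with the browser wiring shown as comments since it is hypothetical and not the app's exact code:

```javascript
// Sketch of the speak-then-listen handoff.
// Browsers block audio until a user gesture, so playback is kicked off
// from a key press or tap; recognition starts only after speech ends so
// the microphone never picks up our own instructions.

function speakThenListen(synth, utterance, recognition) {
  // Start the microphone only once the spoken instruction has finished.
  utterance.onend = () => recognition.start();
  synth.speak(utterance);
}

// Hypothetical browser wiring:
// document.addEventListener("keydown", () => {
//   const u = new SpeechSynthesisUtterance("Start with ten jumping jacks.");
//   const Rec = window.SpeechRecognition || window.webkitSpeechRecognition;
//   speakThenListen(window.speechSynthesis, u, new Rec());
// }, { once: true });
```

Taking `synth` and `recognition` as parameters (rather than reading the globals) keeps the sequencing logic testable outside a browser.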
Accomplishments that we're proud of
- Fully working voice navigation
- Real-time form tracking
- A clean, accessible, and audio-based design
- A hands-free experience built for a community that’s usually left out
What we learned
We learned how to design without relying on visuals, how to work with browser limitations for voice and audio, and how much thought goes into truly accessible experiences.
What's next for MoveMate
- Add more exercises and difficulty levels
- Let users track their progress
- Test with real users in the visually impaired community and iterate from there
Built With
- blazepose
- css
- html
- javascript
- node.js
- scattergl
- tensorflow
- web
