Inspiration
Inspired by the need for accessible, hands-free control, we aimed to create a tool for users with limited mobility to navigate screens easily using intuitive hand gestures.
What it does
Manusify enables touchless screen navigation. Through hand gestures, users can scroll and interact with content on-screen, promoting accessibility for those with physical limitations.
How we built it
We used OpenCV to capture webcam video and MediaPipe to detect hand landmarks in each frame, then mapped landmark movements to real-time scrolling and navigation commands, tuned for smooth, responsive control.
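The pipeline above can be sketched roughly as follows. This is our illustrative reconstruction, not Manusify's published code: the function names, landmark choice (index fingertip vs. its PIP knuckle), thresholds, and the use of pyautogui for the actual scroll are all assumptions.

```python
def scroll_direction(tip_y: float, knuckle_y: float, threshold: float = 0.04) -> int:
    """Map normalized MediaPipe y-coordinates (0 = top of frame) to a
    scroll command: +1 scroll up, -1 scroll down, 0 no gesture.
    Threshold is an illustrative value, not Manusify's actual tuning."""
    delta = knuckle_y - tip_y  # positive when the fingertip points up
    if delta > threshold:
        return 1
    if delta < -threshold:
        return -1
    return 0

def run_loop():
    """Capture loop; vision libraries are imported locally so the
    gesture logic above stays testable without a webcam."""
    import cv2
    import mediapipe as mp
    import pyautogui

    hands = mp.solutions.hands.Hands(max_num_hands=1,
                                     min_detection_confidence=0.7)
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            # MediaPipe landmark 8 = index fingertip, 6 = index PIP joint.
            direction = scroll_direction(lm[8].y, lm[6].y)
            if direction:
                pyautogui.scroll(direction * 60)  # scroll amount is a guess
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
```

Keeping the gesture decision in a pure function makes the sensitivity threshold easy to tune and test independently of the camera loop.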
Challenges we ran into
Achieving accurate gesture recognition without false triggers was challenging. Fine-tuning sensitivity and adapting the algorithm for various hand motions took significant iteration.
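One common way to suppress false triggers of the kind described here is to require a gesture to persist across several consecutive frames before acting on it. A minimal sketch (the class name, frame count, and API are our hypothetical choices, not Manusify's actual approach):

```python
class GestureDebouncer:
    """Fire a gesture only after it is detected in `required`
    consecutive frames, filtering out one-frame misdetections."""

    def __init__(self, required: int = 5):
        self.required = required
        self.last = None   # most recently seen per-frame detection
        self.count = 0     # how many consecutive frames it has held

    def update(self, gesture):
        """Feed one per-frame detection (or None for no hand).
        Returns the gesture exactly once, when it becomes stable."""
        if gesture == self.last:
            self.count += 1
        else:
            self.last = gesture
            self.count = 1
        if gesture is not None and self.count == self.required:
            return gesture
        return None
```

Raising `required` trades responsiveness for fewer accidental scrolls, which mirrors the sensitivity tuning described above.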
Accomplishments that we're proud of
We’re proud of creating a responsive, hands-free solution that makes technology more accessible. Manusify provides users with a reliable, empowering experience.
What we learned
We deepened our understanding of computer vision, gesture recognition, and accessibility needs, realizing the importance of precise gesture sensitivity and user-friendly design.
What's next for Manusify
Future steps include refining gesture accuracy, adding customizable controls, and exploring broader accessibility features to make Manusify adaptable to various user needs.
Built With
- machine-learning
- mediapipe
- numpy
- opencv
- python