Inspiration
The idea for Sparsh Mukthi was born from observing the constant touch-dependency in hospitals, VR classrooms, and shared public environments. We noticed that in sensitive areas, such as ICUs or clean labs, touch-based interfaces posed hygiene risks, disrupted immersion, and hindered accessibility—especially for elderly users or professionals with gloves. We envisioned a world where interaction with technology could be seamless, touch-free, and universally accessible.
What it does
Sparsh Mukthi is a touchless system control solution that allows users to navigate and operate digital interfaces using only voice commands and hand gestures. It removes the need for physical contact, promoting hygiene, enhancing accessibility, and enabling immersive control in VR and professional setups. The system can work across devices—PCs, laptops, and phones—and is being extended to VR platforms for gamers and immersive workers.
How we built it
We used Python for core functionality, integrating libraries such as:
- `speech_recognition` and `pyttsx3` for voice commands
- `cv2` (OpenCV) and `mediapipe` for gesture detection
- `tkinter` for interface prototyping
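MediaPipe's hand tracker emits 21 normalized (x, y) landmarks per frame, so turning a frame into a gesture is mostly geometry on those points. As a minimal sketch (the landmark indices follow MediaPipe's hand model, but the gesture names and the distance threshold here are illustrative, not the app's actual values):

```python
import math

# MediaPipe hand-landmark indices (thumb tip = 4, index fingertip = 8)
THUMB_TIP, INDEX_TIP = 4, 8

def classify_gesture(landmarks, pinch_threshold=0.05):
    """Classify a hand pose from normalized (x, y) landmarks.

    `landmarks` is a list of 21 (x, y) tuples in [0, 1] image
    coordinates, as produced per frame by MediaPipe's hand tracker.
    The threshold is illustrative and untuned.
    """
    tx, ty = landmarks[THUMB_TIP]
    ix, iy = landmarks[INDEX_TIP]
    distance = math.hypot(tx - ix, ty - iy)
    return "pinch" if distance < pinch_threshold else "open"

# Example: thumb tip and index fingertip nearly touching -> "pinch"
points = [(0.5, 0.5)] * 21
points[THUMB_TIP] = (0.40, 0.40)
points[INDEX_TIP] = (0.41, 0.41)
print(classify_gesture(points))  # pinch
```

In the full pipeline, a recognized gesture would then be mapped to an OS action (e.g. a click or scroll via `pyautogui`/`pynput`).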
We built and tested on macOS and are currently optimizing it for mobile platforms and VR environments. Firebase is used for login, usage tracking, and referral management.
Challenges we ran into
- Making gesture recognition consistent across different lighting conditions and camera qualities.
- Ensuring voice commands remained accurate even in noisy environments.
- Cross-platform compatibility (especially adapting for iOS and Android environments).
- Designing a user-friendly yet professional website with download and tutorial options.
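One way to keep voice commands usable in noisy rooms is to fuzzy-match whatever the recognizer heard against a small command vocabulary, rather than requiring an exact transcription. A sketch using the standard library (the command phrases here are hypothetical examples, not the app's real command set):

```python
import difflib

# Illustrative command vocabulary; the real app's commands may differ.
COMMANDS = ["open browser", "scroll down", "scroll up", "close window"]

def match_command(heard, vocabulary=COMMANDS, cutoff=0.6):
    """Map possibly-misrecognized speech to the closest known command.

    Returns the best match, or None if nothing is similar enough
    (cutoff is a similarity ratio in [0, 1]).
    """
    matches = difflib.get_close_matches(heard.lower(), vocabulary,
                                        n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(match_command("scrol don"))  # scroll down
print(match_command("banana"))     # None
```

This makes the system tolerant of small recognition errors while still rejecting unrelated speech outright.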
Accomplishments that we're proud of
- Successfully built a working prototype that supports voice and gesture controls on macOS.
- Designed a website to showcase and distribute the app with a download option and demo video.
- Developed a Firebase-integrated referral and usage-based access system.
- Initiated VR model development for a touchless gaming experience.
What we learned
- Deepened our understanding of real-time computer vision and speech processing.
- Learned to integrate frontend with Firebase backend.
- Understood the user experience challenges for accessibility and hygiene in public systems.
- Gained insights into product design, branding, and business strategy.
What's next for Sparsh Mukthi
- Complete development of mobile (iOS & Android) versions.
- Launch VR-compatible models for gamers and immersive tech users.
- Add real-time contextual AI to understand user intent better.
- Partner with hospitals and educational institutions to pilot the solution in sensitive environments.
- Roll out tiered pricing with offers based on referral and usage to build an early user base.
Built With
- ai/ml
- flask
- iot
- javascript
- mediapipe
- opencv
- pyautogui
- pygame
- pynput
- python
- scikit-learn
- vr