Inspiration
That one scene in Naruto where faster handsigns win the fight
What it does
- Tracks gestures from each meeting participant
- Participants must perform the prompted gesture
- Whoever performs it faster wins
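The round logic above can be sketched as a small pure function. The names and event shape here are illustrative assumptions, not the project's actual implementation:

```python
from typing import Optional

def decide_winner(prompted: str, events: list[tuple[str, str, float]]) -> Optional[str]:
    """Each event is (player_id, gesture, timestamp_seconds).
    The first player to perform the prompted gesture wins the round."""
    matches = [(t, p) for p, g, t in events if g == prompted]
    if not matches:
        return None  # nobody matched the prompt this round
    return min(matches)[1]  # earliest timestamp wins
```

With more than two players the same function still works, since it just picks the earliest matching event.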
How we built it
Using OpenCV paired with Google's MediaPipe, we streamed each user's webcam feed along with their gesture data to an Express.js backend. The server then forwards the processed data to all clients on the frontend by intercepting Pexip's video frame API.
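A minimal sketch of the gesture-classification step in the pipeline above. In practice the 21 hand landmarks per frame come from MediaPipe Hands over an OpenCV webcam feed; here the landmark list is supplied directly so the finger-counting logic is self-contained, and the thresholds and gesture labels are assumptions rather than the project's actual code:

```python
# MediaPipe Hands produces 21 (x, y) landmarks per hand; the indices below
# follow its standard layout: fingertip / PIP-joint pairs for the
# index, middle, ring, and pinky fingers.
FINGER_TIPS = [8, 12, 16, 20]
FINGER_PIPS = [6, 10, 14, 18]

def count_extended_fingers(landmarks):
    """landmarks: list of 21 (x, y) tuples in image coordinates,
    where y increases downward. A finger counts as extended when its
    tip is above its PIP joint -- this assumes an upright hand."""
    return sum(
        1 for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

def classify_gesture(landmarks):
    """Map the finger count to a coarse gesture label (illustrative only)."""
    n = count_extended_fingers(landmarks)
    return {0: "fist", 2: "scissors", 4: "open_palm"}.get(n, "unknown")
```

The resulting label, rather than raw video, is what gets sent to the backend and broadcast to the other clients.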
Challenges we ran into
- Installing the right dependencies across all 4 of our devices
- Merging the processed gesture feeds with Pexip meetings
- Implementing game logic into the Pexip meetings
Accomplishments that we're proud of
- Streaming processed gesture feeds between separate devices over LAN without deploying the app
- Using the Pexip API to create proper meeting rooms
What we learned
- Choose the tech stack more wisely: in hindsight, using Python across the whole project would have been more practical (Flask for the backend, ReactPy for the frontend, and Python for the gesture-tracking scripts)
What's next for Gesture Battle
- Incorporating our own custom gestures
- Longer sequences of gestures
- Logic for more than 2 players
Built With
- express.js
- mediapipe
- opencv
- pexip
- python
- react
- socket.io