Demo video (YouTube was evil, so it's on Drive): https://drive.google.com/file/d/159L58M-eU8iuFcd_nyqSlTRKNxbiQfp1/view?usp=sharing

Inspiration

Wii Sports is a popular game that lets players compete against each other in many sports, but it comes at a cost and is exclusive to the Wii platform.

This exclusivity inspired us to develop CyberPong, with the goal of democratizing at-home sports enjoyment and ensuring fun for everyone. CyberPong emphasizes enhancing health and wellness through fun, interactive activities while building connections between players.

What it does

CyberPong is a new kind of 2-player ping pong game. It uses the webcam feeds of both players, who swing their hands to hit a virtual ball. CyberPong then calculates the ball's path (trajectory, acceleration, gravity) to simulate a game of ping pong.

How we built it

CyberPong is built using a variety of technologies and frameworks: HTML, CSS, JavaScript, Oimo.js, Svelte, MediaPipe, WebRTC, and WebSockets.

  • HTML and CSS were used to create and organize the website.
  • JavaScript was the main programming language and was used to link everything together.
  • Oimo.js was used for physics. It models the ping pong ball's trajectory, acceleration, gravity, and its sensitivity to the force applied on it.
  • Svelte was the JavaScript framework used to build the chat rooms, the lobbies, and the rest of the frontend.
  • MediaPipe was used to detect both the face position and the hand position, which were then translated into gestures.
  • WebRTC was used to transmit the peer-to-peer video feeds.
  • WebSockets were used to transmit the ball position and the scores.
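To give a feel for what the physics engine handles each frame, here is a hand-rolled sketch of the kind of per-frame update involved. Oimo.js performs this integration (plus collision handling) internally; all names, constants, and the impulse model below are our own illustration, not Oimo.js APIs.

```javascript
// Illustrative per-frame update for the ball (not Oimo.js code).
const GRAVITY = -9.8; // m/s^2, pulls the ball down
const DT = 1 / 60;    // fixed timestep: 60 physics updates per second

function stepBall(ball) {
  // Integrate acceleration into velocity, then velocity into position.
  ball.vy += GRAVITY * DT;
  ball.x += ball.vx * DT;
  ball.y += ball.vy * DT;
  return ball;
}

function hit(ball, force) {
  // A paddle hit applies an impulse scaled by the detected hand motion.
  ball.vx += force.x;
  ball.vy += force.y;
  return ball;
}

// Example: serve the ball horizontally and advance one frame.
const ball = { x: 0, y: 1, vx: 2, vy: 0 };
stepBall(ball);
```

In the real game the engine also resolves bounces off the table and paddles, which is exactly the part a physics library is worth pulling in for.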

Challenges we ran into

There were numerous challenges.

  1. Oimo.js is an outdated library, last updated about six years ago, with no proper documentation, so we had to figure out how the physics engine worked by reading its source code.
  2. MediaPipe did not output bounding boxes, so we had to crop the image ourselves to show the face, and just the face.
  3. MediaPipe did not provide any information about the velocity of hand movements, so we had to keep track of the positions over time and compute a sliding average of velocity to determine when a gesture was performed.
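The sliding-average approach from challenge 3 can be sketched as follows. The window size and swing threshold are illustrative assumptions, not the values used in the actual implementation:

```javascript
// Gesture detection via a sliding average of hand velocity.
// MediaPipe only reports positions per frame, so we keep a short
// history of samples and average displacement over the window.
const WINDOW = 5;            // number of recent samples to keep (assumed)
const SWING_THRESHOLD = 0.8; // average speed that counts as a swing (assumed)

function makeTracker() {
  return { samples: [] }; // each sample: { x, y, t }
}

function update(tracker, x, y, t) {
  tracker.samples.push({ x, y, t });
  if (tracker.samples.length > WINDOW) tracker.samples.shift();
}

function averageVelocity(tracker) {
  const s = tracker.samples;
  if (s.length < 2) return { vx: 0, vy: 0 };
  // Average velocity over the window: total displacement / elapsed time.
  const dt = s[s.length - 1].t - s[0].t;
  return {
    vx: (s[s.length - 1].x - s[0].x) / dt,
    vy: (s[s.length - 1].y - s[0].y) / dt,
  };
}

function isSwing(tracker) {
  const { vx, vy } = averageVelocity(tracker);
  return Math.hypot(vx, vy) > SWING_THRESHOLD;
}
```

Feeding in one sample per frame, a slow drift of the hand stays below the threshold while a quick swipe crosses it, which filters out jitter without adding much latency.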

Accomplishments that we're proud of

We're proud to have connected the gestures detected by MediaPipe to the Oimo.js physics engine, and to have built a fully functioning frontend with WebRTC calls and chat features.

What we learned

We learned about Oimo.js, a JavaScript physics engine; MediaPipe, a face and hand detection library; and Svelte.

What's next for CyberPong

Fixing bugs, adding TURN servers so WebRTC works across different networks, and improving the speed of the detection algorithms.
