Inspiration

As the pandemic began, living alone and trying to quarantine was not easy. Like many of us, I turned to video games to pass the time. It had been at least a decade since I last spent a serious amount of time playing video games, but I had nothing but time.

Having tried a few games, I stumbled upon Ryse: Son of Rome, a game where you run around as the Roman general Marius, fighting barbarians with sword and shield. I dove sword-first into this make-believe world of blood and sand, rendered on the silicon of my laptop's GPU.

Slowly strolling through the immaculately designed Roman landscape, spending more time panning around than fighting, I grew a bit concerned about how much of this pandemic I was spending on the couch, neglecting any exercise. The character I was controlling was swinging for the fences, pirouetting in armor, and destroying barbarians, but all I had to do to make him swing his sword was press the left mouse button. I figured I could fulfill my daily exercise needs if I just swung my arms as much as he swings his shield. The next few thoughts have given me countless hours of work and joy.

If only I could control Marius's actions with my movement instead of clicking buttons. I wasn't sure how to get these games on anything but a laptop, and who knows whether any of this is Wii or VR compatible, so I couldn't just buy my way out of the problem. Wiring some sort of motion sensors from my body to my laptop seemed like a lot of work, but my laptop has a camera, and a camera can recognize my movement and press the appropriate buttons for me.

Generally speaking, the goal is to improve how we interact with computers. We started with punch cards, then got the keyboard, the mouse, and the touchscreen. Video games are a natural place to practice this kind of innovation: they have tried to change gameplay with joysticks, the Wii, and now VR headsets. But most of those innovations require specialized equipment. Using image recognition to turn the body into a controller can become a simple, scalable, cheap method of interacting with our electronics, and, in the present case, of decimating barbarians.

What it does

In this video, I demo GesturePlay Ryse — a body gesture control system that lets me play the game Ryse: Son of Rome using just my movements. The camera sees my gestures, and with OpenCV + MediaPipe, they’re translated into keystrokes that control the game.
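The translation step can be sketched without the camera loop. This is a minimal illustration, not the project's actual rules: the landmark indices follow MediaPipe Pose (11/12 = shoulders, 15/16 = wrists, 33 landmarks total, normalized coordinates with y increasing downward), but the gesture rules ("wrist above shoulder") and the action names are assumptions made up for the example.

```python
# Minimal sketch: map MediaPipe Pose landmarks to an action name.
# MediaPipe Pose indices: 11 = left shoulder, 12 = right shoulder,
# 15 = left wrist, 16 = right wrist. Coordinates are normalized to
# [0, 1] with y increasing downward, so "above" means a smaller y.

LEFT_SHOULDER, RIGHT_SHOULDER = 11, 12
LEFT_WRIST, RIGHT_WRIST = 15, 16

def classify_gesture(landmarks):
    """landmarks: list of (x, y) tuples indexed like MediaPipe Pose.

    Returns an action name, or None. The rules are illustrative:
    raising the right wrist above the right shoulder = sword swing,
    raising the left wrist above the left shoulder = shield block.
    """
    if landmarks[RIGHT_WRIST][1] < landmarks[RIGHT_SHOULDER][1]:
        return "attack"
    if landmarks[LEFT_WRIST][1] < landmarks[LEFT_SHOULDER][1]:
        return "block"
    return None

# Example frame: right wrist (index 16) raised above the right shoulder.
frame = [(0.5, 0.5)] * 33          # MediaPipe Pose reports 33 landmarks
frame[RIGHT_SHOULDER] = (0.6, 0.40)
frame[RIGHT_WRIST] = (0.65, 0.25)  # smaller y = higher on screen
print(classify_gesture(frame))     # -> attack
```

In the real loop, the returned action name is looked up in the key bindings and sent to the game as a key event with an input-simulation library.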

What's really exciting is that I’ve now evolved this project into a universal gesture controller! All key bindings are stored in a CSV file, which means the Python code can be turned into an executable — and all you need to do is update the CSV to play different games. No code changes necessary.
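A binding file of that shape might be loaded like this. The column names and the example bindings are assumptions for illustration, not the project's actual schema; the point is only that the mapping lives in data, not code:

```python
import csv
from io import StringIO

# Hypothetical bindings file for Ryse; the columns "gesture" and "key"
# are assumed for this sketch, not taken from the real project.
BINDINGS_CSV = """gesture,key
attack,left_mouse
block,right_mouse
advance,w
dodge,space
"""

def load_bindings(fileobj):
    """Read gesture -> key bindings from an open CSV file object."""
    return {row["gesture"]: row["key"] for row in csv.DictReader(fileobj)}

bindings = load_bindings(StringIO(BINDINGS_CSV))
print(bindings["attack"])   # -> left_mouse
```

Because the mapping is loaded at runtime, a packaged executable never needs rebuilding: shipping a different CSV retargets the controller to a different game.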

How I built it

I started by training my own models, but later discovered that Google's MediaPipe already recognizes key points on the body. Using those key points, I draw buttons on the screen; when my hand touches one, the code presses the corresponding key.
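The on-screen button idea reduces to a hit test. A minimal sketch, assuming the buttons are rectangles in the same normalized [0, 1] coordinate space that MediaPipe reports its landmarks in; the button names and layout here are invented for illustration:

```python
# Each on-screen button is a rectangle in the same normalized [0, 1]
# coordinate space that MediaPipe uses for its body landmarks.
BUTTONS = {
    # name: (x_min, y_min, x_max, y_max) -- illustrative layout
    "attack": (0.70, 0.30, 0.90, 0.50),
    "block":  (0.10, 0.30, 0.30, 0.50),
}

def touched_button(x, y):
    """Return the name of the button that (x, y) falls inside, or None."""
    for name, (x0, y0, x1, y1) in BUTTONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

# A hand landmark at (0.8, 0.4) lands inside the "attack" rectangle.
print(touched_button(0.8, 0.4))   # -> attack
print(touched_button(0.5, 0.5))   # -> None
```

Each frame, the tracked hand landmark is passed through this test, and a hit triggers the key press bound to that button.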

Challenges I ran into

Accomplishments that I'm proud of

What I learned

What's next for GesturePlay Ryse

I'm working on testing this code with other games and on improving the mouse rotations used for camera control.

Built With

Python, OpenCV, MediaPipe
