Inspiration

In most current VR setups you cannot see your own body, and you cannot interact with the virtual environment without controllers. Letting users see their bodies in VR reduces disorientation and permits more natural interaction with the surrounding environment.

What it does

Using the Kinect sensor, you can see your own body in a VR environment. We demo the capabilities and possible applications of the technology in a two-mode soccer game: in one mode you kick the ball and try to score a goal, and in the other you play goalie.

How we built it

We built this with Unity and the Microsoft SDK for the Kinect v2. The SDK let us pull in the wire-frames (tracked skeletal joints) of people walking into the scene. We then calibrated the tracked body's location to match the location of the headset, so the user can see their own body: when they move their arms, they see their arms move in the VR environment. From there we attached objects to the key joints of the user's hands and feet and programmed controls onto them. These objects let the user interact with the world around them, such as picking things up, catching objects flying through the air, and kicking a soccer ball.
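
For illustration, here is a minimal sketch of how this kind of joint tracking can be wired up with the Kinect v2 Unity plugin (the Windows.Kinect namespace). The class and field names are hypothetical, and our actual headset calibration is omitted; this only shows the SDK calls for reading tracked bodies and mapping hand and foot joints onto scene objects.

```csharp
using UnityEngine;
using Windows.Kinect; // Kinect v2 Unity plugin

// Hypothetical sketch: mirrors tracked hand/foot joints onto scene objects.
public class BodyJointTracker : MonoBehaviour
{
    // Scene objects attached to the key joints (assumed to be assigned in the Inspector).
    public Transform rightHand, leftHand, rightFoot, leftFoot;

    private KinectSensor sensor;
    private BodyFrameReader reader;
    private Body[] bodies;

    void Start()
    {
        sensor = KinectSensor.GetDefault();
        reader = sensor.BodyFrameSource.OpenReader();
        bodies = new Body[sensor.BodyFrameSource.BodyCount];
        if (!sensor.IsOpen) sensor.Open();
    }

    void Update()
    {
        // Pull the latest skeletal frame from the sensor, if one is ready.
        using (var frame = reader.AcquireLatestFrame())
        {
            if (frame == null) return;
            frame.GetAndRefreshBodyData(bodies);
        }

        foreach (var body in bodies)
        {
            if (body == null || !body.IsTracked) continue;
            Move(rightHand, body.Joints[JointType.HandRight]);
            Move(leftHand,  body.Joints[JointType.HandLeft]);
            Move(rightFoot, body.Joints[JointType.FootRight]);
            Move(leftFoot,  body.Joints[JointType.FootLeft]);
            break; // follow the first tracked body only
        }
    }

    // Kinect camera space is in metres; a real project would also apply the
    // headset calibration offset (and possibly mirror the X axis) here.
    void Move(Transform target, Windows.Kinect.Joint joint)
    {
        var p = joint.Position;
        target.position = new Vector3(p.X, p.Y, p.Z);
    }

    void OnApplicationQuit()
    {
        if (reader != null) reader.Dispose();
        if (sensor != null && sensor.IsOpen) sensor.Close();
    }
}
```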

Challenges we ran into

Attaching objects to the Kinect wire-frame turned out to be complex: we had to write our own physics code so that the objects attached to the wire-frame's joints could interact with the environment around them. Because the joints are synced to the player's real movements rather than driven by the physics engine, we could not set or read their speeds directly. Instead we measured each joint's velocity ourselves and used those measured speeds to set the velocities of the objects they collide with (see the sketch below).
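
Here is a minimal sketch of that kind of workaround, assuming the joint object carries a kinematic Rigidbody plus a collider; JointVelocityKicker and kickMultiplier are hypothetical names for illustration, not our exact code.

```csharp
using UnityEngine;

// Hypothetical sketch: the joint collider is kinematic (it follows the Kinect
// skeleton, so the physics engine never computes a velocity for it). We
// estimate its velocity by hand each physics step and transfer that velocity
// to whatever dynamic object it hits.
public class JointVelocityKicker : MonoBehaviour
{
    public float kickMultiplier = 1.0f; // assumed tuning parameter

    private Vector3 lastPosition;
    private Vector3 measuredVelocity;

    void Start()
    {
        lastPosition = transform.position;
    }

    void FixedUpdate()
    {
        // Finite-difference velocity estimate from the tracked joint's motion.
        measuredVelocity = (transform.position - lastPosition) / Time.fixedDeltaTime;
        lastPosition = transform.position;
    }

    void OnCollisionEnter(Collision collision)
    {
        var body = collision.rigidbody;
        if (body != null)
        {
            // Hand the measured joint speed to the ball (or other object).
            body.velocity = measuredVelocity * kickMultiplier;
        }
    }
}
```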

Accomplishments that we're proud of

This project is one of the coolest things any of our group members has ever made. Working with complex libraries and systems like Unity, and getting so many moving pieces to work together, was a tremendous feat. We are very happy with how this project turned out: not only is it super cool, it is also super fun.

What we learned

Brett and Isaac learned a lot about VR programming in Unity with C#, and the whole team learned a great deal about working with the Kinect sensor.

What's next for VR Body Motion Soccer

This project opens a lot of doors. It is a proof of concept, and a path forward, for using Kinects and other sensors that produce point cloud data not only to map a room, but to control a VR environment with a single sensor and no controllers. Full-body control of VR environments is now possible. It also becomes possible for people who are not wearing the headset to interact with the environment, and that is something we would be interested in developing in the future.

Built With

Unity, C#, and the Microsoft Kinect v2 SDK.