Inspiration

Inspired by sci-fi movies, and with the rapid development of AR/VR around the world in mind, we wanted to make a game in which gestures and body movements affect game objects in different ways.

What it does

We built a game in which the user controls a 3D-modeled drone with their hand. Currently, the drone's altitude is controlled by moving the hand up and down above a proximity sensor.

How we built it

Using an Arduino, a proximity sensor, Unity, and C#, we had the Arduino output an approximate distance to the player's hand, fed that value into a C# script, and used the data to move the 3D model inside the game.
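
A minimal sketch of how such a script could look, assuming the Arduino writes one distance reading (in centimeters) per line over a serial port. The port name, baud rate, and distance/altitude ranges below are illustrative placeholders, not the exact values we used:

```csharp
using System.IO.Ports;   // needs .NET 4.x API compatibility in Unity
using UnityEngine;

// Hypothetical sketch: reads distance values sent by the Arduino over serial
// (one reading per line, in cm) and maps them to the drone's altitude.
public class DroneAltitudeController : MonoBehaviour
{
    [SerializeField] private string portName = "COM3";   // illustrative port
    [SerializeField] private int baudRate = 9600;
    [SerializeField] private float minDistanceCm = 5f;   // hand close to sensor
    [SerializeField] private float maxDistanceCm = 40f;  // hand far from sensor
    [SerializeField] private float minAltitude = 0.5f;
    [SerializeField] private float maxAltitude = 5f;
    [SerializeField] private float followSpeed = 4f;     // higher = snappier

    private SerialPort port;
    private float targetAltitude;

    private void Start()
    {
        port = new SerialPort(portName, baudRate) { ReadTimeout = 10 };
        port.Open();
        targetAltitude = transform.position.y;
    }

    private void Update()
    {
        try
        {
            // Parse the latest distance reading and map it to an altitude.
            if (float.TryParse(port.ReadLine(), out float distanceCm))
            {
                float t = Mathf.InverseLerp(minDistanceCm, maxDistanceCm, distanceCm);
                targetAltitude = Mathf.Lerp(minAltitude, maxAltitude, t);
            }
        }
        catch (System.TimeoutException)
        {
            // No new reading this frame; keep the previous target.
        }

        // Ease the drone toward the target height instead of jumping.
        Vector3 pos = transform.position;
        pos.y = Mathf.Lerp(pos.y, targetAltitude, followSpeed * Time.deltaTime);
        transform.position = pos;
    }

    private void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```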

Challenges we ran into

The biggest challenge we faced was linearizing the data sent from the sensor. Because the sensor readings are noisy and imprecise, it was hard to smooth them enough for the drone to move without jitter (see the sketch below).
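
One common way to tame noisy proximity readings is an exponential moving average. The following is a minimal sketch of that idea; the smoothing factor here is illustrative, not the value we settled on:

```csharp
using UnityEngine;

// Hypothetical helper: exponentially smooths raw distance readings so small
// sensor fluctuations don't translate into drone jitter.
public class DistanceSmoother
{
    private readonly float alpha;   // 0..1, lower = smoother but laggier
    private float smoothed;
    private bool initialized;

    public DistanceSmoother(float alpha = 0.2f)
    {
        this.alpha = Mathf.Clamp01(alpha);
    }

    // Feed a raw reading (cm); get back the smoothed value.
    public float Add(float rawDistanceCm)
    {
        if (!initialized)
        {
            smoothed = rawDistanceCm;
            initialized = true;
        }
        else
        {
            smoothed = alpha * rawDistanceCm + (1f - alpha) * smoothed;
        }
        return smoothed;
    }
}
```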
We also tried to create a multiplayer version of the game, so that two players can fly drones from two separate Unity instances, but due to some Unity Networking problems we are still working on this.

Accomplishments that we're proud of

We managed to make a hand-controlled game, learned more about Unity, and got the game working in a pre-beta version.

What we learned

We learned how to work with an Arduino and with proximity sensors.

What's next for Proximity based controller for Unity

We want to connect gyroscopes and gesture sensors to the game so it gets richer input that can be used to control the drone.

Built With

Arduino, C#, Unity
