I had wanted to make something that could let human users control a drone over long distances in a different way. Currently, FPV (First Person View) drones have only one stationary camera, which the viewer cannot do much with. I decided to build a better first-person experience for the viewer/user, so I built this. I had received a Google Cardboard headset and had never used it, so I got a bunch of junk together and made a car like this.

What it does

There is an Intel Edison board connected to Azure services along with an Amazon Fire Phone. The sensor data (orientation, accelerometer, gyroscope) from the phone is fed into the Edison via Azure's IoT Suite (we had issues with the Fire Phone), which in turn steers the car and moves the camera's servos to mirror the user's actions. In doing so, the user can control a device many miles away while maintaining good communication. I think devices like these can really help in situations where the human body cannot withstand the conditions, and where robot problem solving alone just can't handle a certain job. They can also help with communication over long distances, such as with the ISS or even another planet! Such devices are currently being worked on by NASA to improve communication with astronauts in outer space.
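As a rough illustration of the head-tracking step, the sketch below maps a phone orientation angle onto the 0–180° range a hobby servo typically expects. The function name and the assumed ±90° input range are illustrative choices, not code from the project:

```python
def orientation_to_servo(angle_deg, min_angle=-90.0, max_angle=90.0):
    """Map a phone orientation angle (degrees) to a 0-180 servo angle.

    Assumed convention: min_angle/max_angle bound the user's head motion;
    anything beyond that is clamped so the servo never over-travels.
    """
    clamped = max(min_angle, min(max_angle, angle_deg))
    # Linear map [min_angle, max_angle] -> [0, 180]
    return (clamped - min_angle) / (max_angle - min_angle) * 180.0
```

Looking straight ahead (0°) then centers the servo: `orientation_to_servo(0)` returns `90.0`, and out-of-range readings pin to 0 or 180 instead of commanding the servo past its limits.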

How we built it

We glued two servos together and mounted a webcam on them, then wired the servos and the car's two drive motors to the Edison. Once we were finished with that, we decided to have some fun and added some spoilers to the car. One of the motors died, so we spent 4 hours trying to fix it. Thanks to Capital One's Business Card API :), we improved the overall security and structural integrity of our robot.
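With two independently driven motors, steering is usually done by differential mixing of a throttle and a steering input. The sketch below shows one common way the Edison-side code might compute the two motor speeds; the function and its [-1, 1] input ranges are assumptions for illustration, not the project's actual code:

```python
def mix_drive(throttle, steering):
    """Mix throttle [-1, 1] and steering [-1, 1] into (left, right) motor speeds.

    Positive steering turns right by speeding up the left motor
    relative to the right one.
    """
    left = throttle + steering
    right = throttle - steering
    # Scale both channels down together so neither exceeds full power,
    # preserving the left/right ratio (and thus the turn radius).
    scale = max(1.0, abs(left), abs(right))
    return left / scale, right / scale
```

For example, full throttle with no steering drives both motors at full speed, while full throttle plus full right steering pivots around the stopped right wheel.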

Challenges we ran into

Getting the Fire Phone to run Android apps (to make the device more accessible) was a huge hassle, along with getting the Edison to communicate with the phone itself and to stream the camera feed directly.

Accomplishments that I'm proud of

Fixing the device many times. The old motors died, so I had to scavenge everywhere for parts. Eventually, I found a motor too big for the car, so I had to modify the gearbox to make it fit and to keep the speed of both sides balanced. We didn't have anything to mount components on, so I had to improvise; now, thanks to me, a good portion of the robot is held together by business cards given away at the hackathon.

What I learned

Do not connect motors directly to AC voltage; my friend got a nasty shock this weekend because of it. When in doubt, business cards and hot glue can solve a good number of problems.

What's next for VRCar

We are planning to add autonomous driving, facial recognition, smoothing of image and motion, a higher-quality camera, faster refresh rates, and an overall better interface for the user.
