Inspiration

We talked to a company sponsor here at sunhacks that is working on a remote-controlled, autonomous tractor controller for automated farming. Their system covers a lot of functionality, but there are still drawbacks: the tractors still have to be driven by humans for limited periods and for various reasons, which can waste weeks or even months. We thought we could circumvent this problem with VR telepresence.

What it does

Our prototype simulates an environment in which a driver can sit at home in a virtual tractor, turn the steering wheel, and manipulate other tractor controls. So far, the functionality we were able to build in this 36-hour hackathon is a camera with rotation and elevation control, giving the driver a wide field of view while operating the machine.

How we built it

We mainly developed the prototype in Unity, and used an Arduino to build the camera rig. A Logitech webcam was attached to two servos, arranged to offer 180 degrees of vertical and horizontal freedom. An HTC Vive was used to visualize the VR environment created in Unity. Live data from the machine can also be exported directly to a database or CSV file for whatever processing you like.
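On the Arduino side, the rig can be driven by a short firmware sketch like the one below. This is a minimal illustration, not our exact code: the pin numbers (9 and 10), the 9600 baud rate, and the newline-terminated "pan,tilt" message format are all assumptions made for the example.

```cpp
// Hypothetical Arduino sketch: drive the pan/tilt camera rig from serial input.
// Assumes Unity sends lines like "90,45\n" (pan,tilt in degrees, 0-180).
#include <Servo.h>

Servo panServo;   // horizontal rotation
Servo tiltServo;  // vertical elevation

void setup() {
  Serial.begin(9600);     // baud rate is an assumption
  panServo.attach(9);     // pin choices are assumptions
  tiltServo.attach(10);
}

void loop() {
  if (Serial.available()) {
    int pan = Serial.parseInt();   // reads digits up to the comma
    int tilt = Serial.parseInt();  // reads digits up to the newline
    if (Serial.read() == '\n') {   // consume the terminator
      panServo.write(constrain(pan, 0, 180));
      tiltServo.write(constrain(tilt, 0, 180));
    }
  }
}
```

With something in this shape, the Unity side only has to write one comma-separated line per frame (or per change) to the serial port.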

Challenges we ran into

We had plenty of challenges to overcome during development. One of the most difficult: how do we port the camera feed through to the virtual environment in Unity? We ended up rendering the webcam's live video feed onto a texture in Unity, which was possible because Unity could detect the USB device. Another challenge was translating headset movements in VR into movements of the real-life servos. Our solution was to send serial output from Unity to the Arduino, which the Arduino program converts into rotation angles for the servos. We also had trouble automatically adding data to our server without inputting it manually.

Accomplishments that we're proud of

We are proud that it worked. The camera rig moved according to the headset's rotational movement in space, and the camera feed was successfully rendered onto the texture.

What we learned

We learned how to port a video feed into Unity, how to program a Leap Motion to distinguish a fist from an open hand, and how to make real-life servos move according to movements in VR.

What's next for Smo0Th Operation

We are open to licensing our software to automated-agriculture companies, or to doing R&D for companies wishing to develop their own VR environments. For more information, contact brendanbsaliba@gmail.com.
