Inspiration

Our team comes from a heavy robotics background. Each of us spent all four years of high school deeply involved in our school's FIRST Robotics Competition (FRC) team. In FRC, high school teams are given a challenge and six weeks to build and program a (usually 120-pound) robot from scratch to compete against other teams. Each of us spent upwards of 300 hours per year in those six weeks alone leading the design and manufacturing of the robot, another 300 hours during the rest of the year leading outreach efforts, and we now mentor a robotics team of like-minded students. We eat, drink, breathe, and live robots.

Naturally, we have been heavily inspired by the venerated BattleBots series and its counterparts. In the past year, BattleBots was given a fantastic reboot, bringing memories of our childhood dreams of robot fighting flooding back.

Currently, remotely controlled combat robots are driven from the sidelines. We believe everyone should experience robot battles in the most immersive way possible: through the Oculus Rift.

What it does

Paper Cutz puts the cathartic joy of robot violence inches from your face. Two USB webcams sit on a custom pan-tilt module, which is mounted to an inexpensive chassis and controlled by an Arduino.

The images from the webcams are rendered onto each eye of the Oculus Rift, giving the user fully stereoscopic vision from the robot's point of view. In addition, the head-tracking data is sent to the Arduino over serial, and the Arduino follows head movements with minimal latency. This quick response results in a system that feels very natural to use.

Lastly, joystick axes from an Xbox 360 controller are used to drive the robot over the same serial interface.

The first thing users do when they don the system is look up at themselves. A common result is a slight giggle.

How I built it

The chassis and structural components are built from the most inexpensive materials we could find. The motors, wheels, and baseplate cost a total of $10 per bot. The servo motors for the pan-tilt were mounted with simple VHB tape, and the cameras were attached to the pan-tilt module through the inventive use of LEGO NXT parts and zip ties.

The drive system is written as an "arcade drive," in which the throttle and steering inputs are mixed into left and right outputs for the "tank drive" (differential drive); a sketch of the mixing is below. This is a simple feature added to enhance the user experience, since driving a tank-drive robot directly takes extra finesse and is rather unintuitive without control modifications. The VR version of the robot is driven with an Xbox controller, while the pure-control version is driven with a Wiimote.
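
For clarity, here is a minimal C# sketch of that arcade-to-tank mixing, assuming throttle and steering axes in the range -1 to 1; the class and method names are illustrative, not our exact code.

```csharp
using System;

// Minimal sketch of arcade-to-tank mixing (illustrative, not the project's
// exact code). Throttle and steering are in [-1, 1]; the outputs are
// left/right motor powers, also in [-1, 1].
public static class ArcadeDrive
{
    public static void Mix(float throttle, float steering,
                           out float left, out float right)
    {
        left = throttle + steering;
        right = throttle - steering;

        // If either side exceeds full power, scale both down together
        // so the turn ratio is preserved.
        float max = Math.Max(Math.Abs(left), Math.Abs(right));
        if (max > 1f)
        {
            left /= max;
            right /= max;
        }
    }
}
```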

On the user side, things get more complex. The codebase here is all in C#. The interface that turns head-tracking data into sendable data takes the orientation from the CenterEye VRNode and remaps the values returned by GetLocalRotation to values between 0 and 180 for the servo motors. A separate class then grabs this data, encodes it into a byte buffer along with the drive controls, and sends it.
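
A rough sketch of that remapping step, assuming the legacy UnityEngine.VR API (InputTracking.GetLocalRotation with VRNode.CenterEye) from that era of Unity; the centering offsets and axis signs are illustrative assumptions.

```csharp
using UnityEngine;
using UnityEngine.VR;  // legacy VR namespace from this era of Unity

// Sketch: read the CenterEye orientation and map yaw/pitch onto the 0-180
// range the pan/tilt servos expect. Offsets and signs are assumptions.
public class HeadToServo : MonoBehaviour
{
    public byte PanServo { get; private set; }   // 0-180
    public byte TiltServo { get; private set; }  // 0-180

    void Update()
    {
        Quaternion head = InputTracking.GetLocalRotation(VRNode.CenterEye);
        Vector3 euler = head.eulerAngles;  // each component is 0-360

        // Convert to signed angles around zero, then center at 90 degrees
        // and clamp to the servo's travel.
        float yaw = Mathf.DeltaAngle(0f, euler.y);    // -180..180
        float pitch = Mathf.DeltaAngle(0f, euler.x);  // -180..180

        PanServo = (byte)Mathf.Clamp(90f + yaw, 0f, 180f);
        TiltServo = (byte)Mathf.Clamp(90f - pitch, 0f, 180f);
    }
}
```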

The serial communication is kept to minimal bandwidth: the entire control system is sent in four bytes, one for each servo and two for the drive system.
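
In code, the packet looks roughly like the sketch below, assuming .NET's System.IO.Ports.SerialPort; the port name, baud rate, and byte order are illustrative assumptions.

```csharp
using System.IO.Ports;

// Sketch of the four-byte control packet: pan servo, tilt servo, left drive,
// right drive (each 0-255). Port name and baud rate are assumptions.
public class RobotLink
{
    private readonly SerialPort port = new SerialPort("COM3", 115200);
    private readonly byte[] packet = new byte[4];

    public void Open() { port.Open(); }

    public void Send(byte pan, byte tilt, byte leftDrive, byte rightDrive)
    {
        packet[0] = pan;
        packet[1] = tilt;
        packet[2] = leftDrive;
        packet[3] = rightDrive;

        // Write the raw bytes; no text encoding is involved (see the
        // UTF-8 pitfall described under Challenges).
        port.Write(packet, 0, packet.Length);
    }
}
```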

Lastly, the webcam images are projected onto planes as WebCamTextures and placed in front of each eye to produce the stereoscopic image. Because users wear the Rift differently, we added a calibration function: the image can be shifted slightly with the arrow keys until the two images line up.
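
A hedged sketch of one eye's feed, assuming a plane with a Renderer for each eye and Unity's WebCamTexture; the device index and nudge step are illustrative, not our exact values.

```csharp
using UnityEngine;

// Sketch: project a webcam feed onto this eye's plane and let the arrow
// keys nudge it for calibration. Attach one instance per eye plane.
public class EyeFeed : MonoBehaviour
{
    public int deviceIndex = 0;        // which USB webcam feeds this eye
    public float nudgeStep = 0.001f;   // how far one key press shifts the plane

    private WebCamTexture feed;

    void Start()
    {
        feed = new WebCamTexture(WebCamTexture.devices[deviceIndex].name);
        GetComponent<Renderer>().material.mainTexture = feed;
        feed.Play();
    }

    void Update()
    {
        // Calibration: shift this eye's plane until the two images align.
        if (Input.GetKey(KeyCode.LeftArrow))  transform.localPosition += Vector3.left  * nudgeStep;
        if (Input.GetKey(KeyCode.RightArrow)) transform.localPosition += Vector3.right * nudgeStep;
        if (Input.GetKey(KeyCode.UpArrow))    transform.localPosition += Vector3.up    * nudgeStep;
        if (Input.GetKey(KeyCode.DownArrow))  transform.localPosition += Vector3.down  * nudgeStep;
    }
}
```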

Challenges I ran into

The first involves the Oculus Rift itself. The Oculus Rift runtime technically does not support Optimus-enabled graphics cards, because the external HDMI port is hardwired to the Intel integrated graphics. As our team did not have a desktop with us, we had to download special NVIDIA drivers from our own OEMs that allowed the Rift to work.

Later came the issue of rendering the images from the robot in the same place in front of the user's eyes while still acquiring rotation data. Our original plan was to take the prefab code and hack away at it to disable the actual rotation of the OVRCameraRig. This would have let us place the planes in static locations and move the individual eyes apart. Unfortunately, that turned out not to be possible, as the code that actually rotates the prefab is too integrated into Unity to change. Our workaround was to attach the planes to the cameras themselves, scale them down, and place them extremely close to the eyes. For them to remain visible, we had to disable clipping for the camera.
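
One way to express that workaround in Unity is sketched below, approximating "disable clipping" by pulling the camera's near clipping plane in; the distances and scale factors here are assumptions for illustration.

```csharp
using UnityEngine;

// Sketch of the workaround: parent a small feed plane to each eye camera and
// pull the near clipping plane in so the close-up plane stays visible.
public class AttachFeedPlane : MonoBehaviour
{
    public Camera eyeCamera;      // one of the OVRCameraRig eye cameras
    public Transform feedPlane;   // the plane carrying that eye's WebCamTexture

    void Start()
    {
        feedPlane.SetParent(eyeCamera.transform, false);
        feedPlane.localPosition = new Vector3(0f, 0f, 0.05f);  // a few cm ahead
        feedPlane.localScale = Vector3.one * 0.01f;            // scaled well down
        eyeCamera.nearClipPlane = 0.01f;  // keep the close plane from being clipped
    }
}
```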

Lastly, the original code that sent data to the Arduino took the byte array that held all of the control data, encoded it to UTF-8, and sent it via serial. For a reason we plan to ascertain after the hackathon (most likely that UTF-8 encodes values above 127 as multi-byte sequences), this changes the byte values in the string, causing the data to be incorrect for values above 127. After a difficult diagnosis, we switched to a function that simply sends the array raw.
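
The pitfall is easy to reproduce; the snippet below, with illustrative values, shows how a round trip through a string and UTF-8 changes the byte count.

```csharp
using System;
using System.Text;

// Illustration of the pitfall described above: converting raw bytes to a
// string and then UTF-8 encoding that string is not a round trip for
// values above 127. Values and names are illustrative.
class Utf8Pitfall
{
    static void Main()
    {
        byte[] controls = { 90, 170, 200, 55 };  // example 4-byte packet
        string asText = new string(Array.ConvertAll(controls, b => (char)b));
        byte[] reencoded = Encoding.UTF8.GetBytes(asText);

        // Prints 6, not 4: bytes 170 and 200 became two-byte UTF-8 sequences,
        // so the Arduino received corrupted data. The fix is to send the raw
        // array directly, e.g. serialPort.Write(controls, 0, controls.Length).
        Console.WriteLine(reencoded.Length);
    }
}
```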

Another big challenge was actually controlling the robots. At first, we planned on using Wiimotes over Bluetooth to communicate with and drive the robots. We first tried to write our own protocol in Python using a Bluetooth library called PyBluez. After much struggle, we realized that PyBluez only supported RFCOMM Bluetooth communication, while the Wiimotes use L2CAP. We moved on to another library called Wiiuse, which wraps a C library of the same name. We were able to reliably get gyro and accelerometer readings, but not Nunchuk or button output. After struggling with this, we looked at other solutions in Processing, Java, and C++. Most Wiimote libraries seemed to be outdated and unusable with current Bluetooth libraries in those languages. We ended up using a program called FreePIE (Free Programmable Input Emulator). It has native support for Wiimotes, but its scripting environment is very restrictive. We found a .NET serial communication library, and we can now drive our robot using a Wiimote.

Accomplishments that I'm proud of

When we set out on this project, we set a base goal: simply make the robots work with the Rift. As the day progressed, with issues ranging from "this byte changed for no visible reason" to "Unity crashes when I attempt to run my code," we started losing hope that the project would reach completion. With the gracious and generous help of the representatives from Unity, the functionality issues were fixed; by the beginning of Sunday we had completed all of our functionality goals, and we were even able to demo the system to some representatives and other teams.

What I learned

Technically, we gained more literacy with Unity3D, Arduinos, and FreePIE than we would have over a month of free-time development at home.

There’s more to learn than technical knowledge, however.

We learned over this weekend the true value of experience. Without the presence of the extremely knowledgeable Unity representatives, we would hardly have gotten the Rift to render images, let alone finalized the control system.

We also learned the value of proper documentation. There was little documentation for the controllers we worked with, because they are a few years old. Documentation seems to be a luxury with retired code, and we need to document everything better for the longevity of a product.

What's next for Paper Cutz

Next we hope to make our systems wireless. This would allow a wider range of possible attachments (e.g., spinners and giant arms) and a larger area in which a robot can be controlled. After that, building APIs to connect with each robot over the internet would increase how many people can use our platform and could hopefully usher in a new age of competitive robot combat. Because we hope to increase the number of users who are able to experience VR technology, we also want to add Google Cardboard integration, as it is a very inexpensive VR solution.
