Gallery captions: The App (screenshot) · Alpha design of the control system · Queued movements in VR · All the hardware in one photo
Inspiration
We love SumoBots. The concept of autonomous robots battling it out in an arena is already a great hack - but we wanted to supercharge it! We wanted to transform a SumoBot into the best thing it could be - in both looks and purpose. VR lets us change how the bot looks without physical limitations. By building a 3D model around an existing object, we can track how it interacts with the surrounding environment, adding a level of depth to VR that hasn't been done before.
What it does
The HTC Vive base stations track the position of the SumoBot in real time. Its position is mapped to a coordinate-based system and relayed via MQTT to other applications. This open-endedness allows communication with any platform: in this project, information is passed through Node-RED to a React Native app as well as Unity. By putting the Vive headset on, we can view the bot in a virtual environment. This lets us add a skin to the bot and make it look as awesome as your imagination allows! It currently looks like BB-8 from Star Wars.
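The coordinate mapping step above can be sketched as follows. This is a minimal illustration, not our actual code: the playspace size (3 m square) and grid resolution (10×10) are assumptions, as is the function name.

```python
# Hypothetical sketch: map a Vive tracker position (metres, playspace
# origin at the centre) to a discrete arena grid cell before publishing
# it over MQTT. Playspace size and grid resolution are assumed values.

PLAYSPACE_M = 3.0   # assumed square playspace, 3 m per side
GRID_CELLS = 10     # assumed 10x10 coordinate grid

def vive_to_grid(x_m: float, z_m: float) -> tuple[int, int]:
    """Convert a tracker (x, z) position in metres to a grid cell."""
    half = PLAYSPACE_M / 2

    def to_cell(v: float) -> int:
        # Clamp to the playspace, then scale into [0, GRID_CELLS - 1].
        v = max(-half, min(half, v))
        return min(GRID_CELLS - 1, int((v + half) / PLAYSPACE_M * GRID_CELLS))

    return to_cell(x_m), to_cell(z_m)
```

The grid payload is what downstream consumers (Node-RED, the app, Unity) see, so they never need to know about raw Vive coordinates.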
How we built it
We started by programming the SumoBot's Arduino to accept movement commands over Serial. These commands are sent to it by a Raspberry Pi, which can receive commands from any device through MQTT. We currently send commands from an Android app built in React Native; commands can also be sent through the Node-RED web UI or any other device.
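The Pi-side relay boils down to validating an MQTT payload and re-encoding it for the Arduino's Serial port. A minimal sketch of that step, assuming a hypothetical `move:duration_ms` payload format and single-letter wire commands (neither is our documented protocol):

```python
# Hypothetical sketch of the Pi-side relay: validate an MQTT movement
# payload and encode it as the single-line command the Arduino parses
# over Serial. Command names and wire format are assumptions.

VALID_MOVES = {"forward": "F", "backward": "B",
               "left": "L", "right": "R", "stop": "S"}

def encode_command(payload: str) -> bytes:
    """Turn e.g. 'forward:500' into b'F500\n', ready for Serial.write()."""
    move, _, duration_ms = payload.partition(":")
    if move not in VALID_MOVES:
        raise ValueError(f"unknown move: {move!r}")
    ms = int(duration_ms) if duration_ms else 0  # default: until next command
    return f"{VALID_MOVES[move]}{ms}\n".encode("ascii")
```

Keeping the encoding in one pure function means the same relay can sit behind any MQTT source: the app, the Node-RED UI, or a game controller.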
The bot's movement is tracked using an HTC Vive controller attached to it. The movements are sent to our Unity game, which renders a beautiful natural environment around the bot, as well as a completely customizable model over it. We picked BB-8 as it's our favorite robot, but any robot skin is up for grabs.
The Android app is built in React Native. It shows an arena modeling the robot's playspace, with a representation of the robot displaying its position on the grid. Movement commands can be queued up and sent via MQTT to the Orange Pi, which acts as a communication server.
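The queue-then-send flow can be sketched like this (in Python rather than the app's JavaScript, purely for illustration; the topic name and semicolon-joined payload format are assumptions):

```python
# Hypothetical sketch of the app-side movement queue: commands accumulate
# locally, then flush as one MQTT payload that the Pi executes in order.
# Topic name and payload format are assumed, not the project's actual ones.

class MovementQueue:
    def __init__(self) -> None:
        self._queue: list[str] = []

    def enqueue(self, move: str) -> None:
        """Add a movement command (e.g. 'forward:500') to the plan."""
        self._queue.append(move)

    def flush(self) -> tuple[str, str]:
        """Return (topic, payload) to hand to an MQTT publish call."""
        payload = ";".join(self._queue)
        self._queue.clear()
        return "sumobot/moves", payload
```

Batching the whole plan into one publish keeps the sequence atomic, so the bot never executes half a path if the connection drops mid-plan.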
The robot can be controlled in multiple ways. One is through the Android app together with the HTC Vive: the user plans a path, then watches it unfold through the Vive headset and in real life. Another is a game controller that sends directional commands to the MQTT broker, which relays them to the robot. Directional commands can also be injected directly through the Node-RED interface.
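The game-controller path reduces to a small mapping from d-pad input to the same directional payloads the other control paths publish. A sketch, with button names and the fail-safe default both assumed:

```python
# Hypothetical sketch: translate game-controller d-pad input into the
# directional payloads published to the MQTT broker. Button names and
# the stop-by-default behaviour are assumptions.

DPAD_TO_COMMAND = {
    "up": "forward",
    "down": "backward",
    "left": "left",
    "right": "right",
}

def controller_to_payload(button: str) -> str:
    """Map a d-pad press to a movement payload; anything else stops the bot."""
    return DPAD_TO_COMMAND.get(button, "stop")
```

Because every control path converges on the same MQTT payloads, adding a new input device only means writing another small translator like this one.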
Challenges we ran into
Intercommunication between the many platforms we used was extremely tricky to get right, especially given the complexity of the network: each platform had its own quirks when setting up MQTT, and ensuring reliable communication was a challenge in itself.
We began by building the Android app in Unity, but it didn't go as expected; due to numerous difficulties with the Unity interface, we moved development to React Native.
Accomplishments that we're proud of
As a team, we picked aspects that catered to each team member's area of expertise. The project was divided into modular portions which made the overall task less daunting to accomplish.
Threading it all together was another mammoth task due to the variety of frameworks used.