We were inspired by the search-and-rescue competition to design a robotic vehicle. Once the hackathon started we were introduced to the Myo armband, which we felt would make a great controller for our robot. First-person view (FPV) was also an important feature, since the robot might be out of the operator's sight. An Oculus was available, which we thought could be incorporated into the project to create a VR driving experience: the user would get a first-person VR view from the robot while controlling it with "driving gestures" via the Myo armband.

What it does

The Myo armband can classify several different hand gestures, which are fed into an Arduino Mega. The recognised gestures are then converted into directions for the robot to follow. These are passed to the motor controller, which drives the corresponding motors. Finally, a camera mounted on the front of the vehicle provides a live FPV feed, which is sent over Skype to a control computer.
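The gesture-to-direction step above can be sketched roughly as follows. This is a minimal illustration, not our actual code: the pose names mirror the Myo's standard gesture set, but the particular pose-to-direction assignments shown here are assumptions.

```cpp
// Illustrative sketch of converting recognised Myo poses into drive
// directions. Pose names follow the Myo's standard gestures; the
// specific mapping chosen here is hypothetical.
enum class Pose { Rest, Fist, FingersSpread, WaveIn, WaveOut };
enum class Drive { Stop, Forward, Backward, TurnLeft, TurnRight };

Drive poseToDrive(Pose p) {
    switch (p) {
        case Pose::Fist:          return Drive::Forward;   // clench to go
        case Pose::FingersSpread: return Drive::Backward;  // spread to reverse
        case Pose::WaveIn:        return Drive::TurnLeft;
        case Pose::WaveOut:       return Drive::TurnRight;
        default:                  return Drive::Stop;      // rest = stop
    }
}
```

Defaulting unknown or rest poses to `Stop` keeps the robot safe when classification is uncertain.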

How we built it

We started by building the chassis of the robot and attaching the wheels. Next we programmed the motor controller to power either the left- or right-side motors. After this we calibrated the Myo armband and developed Arduino code that extracted the different gestures using Myoduino, available on the Myo Market. The camera and live feed were set up via Skype. Finally, all the parts were integrated on the chassis.
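The "power either the left or right side motors" step amounts to a simple differential drive: turning is done by powering one side only. A hedged sketch of that logic, with illustrative names and PWM-style power values in the Arduino 0-255 range (these are assumptions, not our exact values):

```cpp
// Hypothetical differential-drive sketch: a direction command maps to
// per-side motor power. Values imitate Arduino PWM duty (0-255).
enum class Command { Stop, Forward, Left, Right };
struct MotorPower { int left; int right; };

MotorPower commandToPower(Command c) {
    switch (c) {
        case Command::Forward: return {200, 200};  // both sides driven
        case Command::Left:    return {0, 200};    // right side only -> veer left
        case Command::Right:   return {200, 0};    // left side only -> veer right
        default:               return {0, 0};      // stop
    }
}
```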

Challenges we ran into

Firstly, we had hoped to use an Oculus Rift to provide the FPV; however, none of our computers met the minimum system requirements to run it.

Additionally, we had hoped to use a Wi-Fi module to communicate with the Arduino wirelessly. There were several problems with this approach: updating the firmware on the module proved far too problematic, and we could not connect the module to the local Wi-Fi network because we needed special administrator permissions.

We were short on hardware and had to make do with leftover bolts and wires from shop floors.

Our lack of experience with the new hardware, and with most of the programming languages involved, meant that only a limited set of the Myo's features could be communicated to the Arduino, since we were unable to edit the Myoduino source code.

Accomplishments that we're proud of

Integrating the Myo armband to control the movement of the vehicle despite having no prior knowledge of the Myo. Getting the microcontrollers to communicate with each other, and making use of the limited hardware that was available.

What we learned

The basics of developing for the Myo. An understanding of Myoduino (the interaction between the Arduino and the Myo). How gesture extraction works. That OpenCV is problematic on Windows, and that the Oculus Rift has high system requirements. How to get communications running between two microcontrollers.

What's next for Myown robot

Make the robot fully wireless and controllable via an ODROID. Add semi-autonomy in the form of an ultrasonic sensor so the robot doesn't crash. Add VR FPV once we can control the robot wirelessly from a desktop computer. Increase the complexity of the gestures the robot recognizes, and remove the delay between input and output for more advanced control.
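The planned ultrasonic semi-autonomy could be as simple as an obstacle-stop check. A minimal sketch, assuming a common HC-SR04-style sensor (whose echo-pulse width in microseconds divided by roughly 58 gives distance in centimetres) and an illustrative 20 cm stop threshold:

```cpp
// Hedged sketch of obstacle-stop semi-autonomy using an ultrasonic
// sensor. Assumes an HC-SR04-style sensor: distance_cm ~= echo_us / 58.
// The 20 cm threshold is an illustrative choice, not a tuned value.
const long STOP_THRESHOLD_CM = 20;

long pulseToCm(long echoMicros) {
    return echoMicros / 58;  // standard HC-SR04 conversion
}

// True when the robot should override the driver and stop.
bool shouldStop(long echoMicros) {
    return pulseToCm(echoMicros) < STOP_THRESHOLD_CM;
}
```

On an Arduino, `echoMicros` would come from `pulseIn()` on the sensor's echo pin; the check would run before each motor update so a stop always wins over a gesture command.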
