Azure Submission:


Create a gesture-controlled robot that can change speed and direction using the Myo armband when in teleoperated mode. When in autonomous mode, the robot patrols the path mapped during teleoperated mode.


- Drive teleoperated with Myo
- Status display on Blinky Matrix
- Sensor input (range finder, accelerometer) & back to base
- Camera input & back to base
- Drive an autonomous path: drive the mapped path


Sentinel
The GoonGuard
Vanguard


Robot Platform:
- 4WD chassis w/ motors
- Range finder
- Motor drivers (2)
- Raspberry Pi 2 Model B
- Arduino Uno
- ESP8266 Wi-Fi chip

Base Control: Myo

Project Description:

The GoonGuard is the robot protector of all official goons. Created by the #GoonSquad, the GoonGuard can function as a teleoperated and/or autonomous patrol or reconnaissance robot.

Teleoperated Mode

While in teleoperated mode, the robot is controlled using the Myo interface. During teleop, various gestures drive the robot forward or backward, turn it left or right, or stop it. Additionally, users can adjust the speed of the robot between 20% and 100% using the Myo armband. The GoonGuard is also equipped with a range sensor for large-obstacle avoidance and a Blinky Matrix to indicate which state the robot is in. During teleoperated mode, the path dictated by the user is saved as a "patrol route" onboard the Raspberry Pi.
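The on-Pi route recording described above could be sketched like this. This is an illustrative assumption, not the team's actual code: the class name, the command strings, and the JSON file format are all made up for the sketch.

```python
import json
import time

# Hypothetical recorder: logs each teleop command with a time offset so
# the "patrol route" can be replayed later in autonomous mode.
class PatrolRecorder:
    def __init__(self):
        self.route = []                   # list of (seconds_since_start, command)
        self.start = time.monotonic()

    def log(self, command):
        # Record a drive command ("forward", "left", ...) with its offset.
        self.route.append((time.monotonic() - self.start, command))

    def save(self, path):
        # Persist the route as JSON on the Raspberry Pi's filesystem.
        with open(path, "w") as f:
            json.dump(self.route, f)

recorder = PatrolRecorder()
recorder.log("forward")
recorder.log("left")
```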

Autonomous Mode

In autonomous mode, the robot follows the "patrol route" recorded by the user with the Myo armband during teleoperated mode.
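Replaying the saved route could look like the following sketch. `send_command` is a placeholder for whatever actually forwards a drive command to the Arduino/motors, and the timed-list route format is an assumption carried over from the recording sketch.

```python
import json
import time

def replay_route(path, send_command):
    """Replay a recorded patrol route, preserving the original timing.

    `path` is a JSON file of (offset_seconds, command) pairs;
    `send_command` is a stand-in for the real motor-command sender.
    """
    with open(path) as f:
        route = json.load(f)
    start = time.monotonic()
    for offset, command in route:
        # Sleep until this command's original time offset is reached.
        delay = offset - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        send_command(command)
```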

Technical Difficulties

- Blinky Matrix file size much too large
- Spark Core unable to connect to Wi-Fi
- Raspberry Pi motor drivers: changing the wiring to work with the Arduino
- 3D printing failing repeatedly

Interfacing with Myo

Myo provides a C++ SDK for MS Windows and Mac OS X; sadly, no Linux ;(

The Myo SDK allows access to the Thalmic gestures:
- Fist
- Fingers Spread
- Wave In
- Wave Out
- Double finger tap

Additionally, we have access to the IMU data from the band, such as yaw, pitch, and roll.

These inputs were mapped to the following outputs:
- Fist: stop the robot
- Fingers Spread: go forward or backward
- Wave In: go right, forward or backward
- Wave Out: go left, forward or backward
- Double finger tap: switch mode from forward to backward and vice versa

The pitch value given by the Myo ranges from 0 to 18:
- 0 to 9 (arm down to arm parallel to the ground): robot is stopped
- 10 to 18 (arm parallel to the ground to arm up): pitch is mapped to the PWM value (0 to 255)

Four speeds were implemented, depending on the pitch: 20%, 46.67%, 73.33%, and 100% PWM.

Modifications were made to the Myo listener for a faster refresh rate. The C++ program launches a Python script as a child process, passing as input the command to post to the ESP. The Python script is a simple urllib2 script that makes an HTTP GET request to the server for that command. The ESP (Lua script) receives the request, parses it to get the command, and outputs it on its serial port.
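The pitch-to-speed mapping above can be sketched as a small function. The four duty-cycle steps (20%, 46.67%, 73.33%, 100%) come from the text; the exact bucket boundaries within the 10-18 range are an assumption for illustration.

```python
# Sketch of the pitch-to-PWM mapping. Myo reports pitch as an integer
# 0..18; 0..9 (arm lowered to horizontal) stops the robot, and 10..18
# selects one of four speed steps of the 0..255 PWM range.
def pitch_to_pwm(pitch):
    if pitch <= 9:
        return 0                              # arm down to horizontal: stopped
    # Map pitch 10..18 onto four equal buckets (boundary choice is assumed).
    step = min((pitch - 10) * 4 // 9, 3)      # 0..3
    duty = [0.20, 0.4667, 0.7333, 1.00][step] # the four speeds from the text
    return round(duty * 255)
```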

Implementation Stack:

(Insert Chart Here)

Lua script running on the ESP that parses the GET request and outputs the command on its serial port.

Since the Spark Core was not working, we ended up deploying a web server on an ESP8266.
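The parsing the Lua script does amounts to pulling the command out of the HTTP request line. Here is that logic sketched in Python rather than Lua, under the assumption that the command is carried in the URL path (e.g. "GET /forward HTTP/1.1"); the real on-device script may encode it differently.

```python
# Sketch of the ESP-side request parsing (the real version is a Lua
# script running on the ESP8266). Assumes the command is the URL path.
def extract_command(request_line):
    parts = request_line.split()
    if len(parts) < 2 or parts[0] != "GET":
        return None                    # not a GET request: nothing to do
    return parts[1].lstrip("/")        # "/forward" -> "forward"
```

The extracted string would then be written to the ESP's serial port for the Arduino to act on.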


Piping Myo data to the robot: To pipe data to the robot we initially selected the Spark Core, which would allow us to pull data from the Spark Cloud, where the sensor data would be pushed from the laptop running the SDK app.

Quite some time was spent on making the Spark Core work:
- The Spark Cores were already owned, so we couldn't register them to the Spark Cloud. Solution: flash the cores locally over USB.
- The Spark Core would not connect to Wi-Fi, because the TI Wi-Fi chip does not support WPA2-Enterprise. Solutions tried:
  - Windows Wi-Fi hotspot -- Spark Core did not detect it.
  - Additional Wi-Fi dongle hotspot -- Spark Core did not detect it.
  - Wi-Fi hotspot in Linux on a VM with the Wi-Fi dongle -- Spark Core did not detect it.
  - iPhone hotspot -- Spark Core did not detect it.

Interfacing the ESP to the C++ Myo app

HTTP libraries for C++ are overly complicated, especially those provided by Microsoft, and we are tied to Windows by the Myo SDK. Python, however, has a very robust standard networking library. The Python script takes the command to send as an argument and uses urllib2 to open the URL for that command. Whenever the Myo app needs to send a command, it launches the script as a child process with the command as the input argument:

    // Launch the Python helper as a child process, passing it the command.
    void sendCommandtoEsp(std::string command_input) {
        std::string arg = command_input;
        std::string command = "python ";
        std::string systemCommand = command + arg;
        system(systemCommand.c_str()); // launch python script, blocks until it exits
    }

Autonomous path

The autonomous path was left unimplemented due to integration issues between the ESP and the Raspberry Pi: line feeds were not being appended properly.
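The Python helper that the C++ code launches could look like the following. The original used Python 2's urllib2; this is a Python 3 equivalent using urllib.request, and the ESP's address (192.168.4.1) and the path-based command URL scheme are assumptions for the sketch.

```python
import sys
import urllib.request

ESP_BASE_URL = "http://192.168.4.1/"   # assumed ESP8266 address

def command_url(command):
    # Build the URL for a drive command, e.g. "forward" ->
    # "http://192.168.4.1/forward".
    return ESP_BASE_URL + command

def send(command):
    # HTTP GET to the ESP's web server; the Lua script there parses the
    # path and forwards the command over serial to the Arduino.
    with urllib.request.urlopen(command_url(command), timeout=2) as resp:
        return resp.status

if __name__ == "__main__" and len(sys.argv) > 1:
    send(sys.argv[1])   # command arrives as the child-process argument
```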

CAD & 3D Printing

In order to support the robot's hardware and give it a nice visual aspect, we created a CAD model for a base to place on top of the robot.

A first iteration was sent for 3D printing; unfortunately it failed, as the print head was hitting the sides of the machine because our piece was too close to the printer's size limits.

Therefore, the piece was split into two parts that we would glue together. The first half printed correctly; the second did not, as the 3D printer could not print it properly. We tried that last piece a second time; it still didn't work. We tried a third time with more support material; it still did not work.

Having run out of time, we chose a more viable solution: we built a base out of wood and styrofoam, which we screwed to the platform, plus a support tower for the Blinky Matrix made out of popsicle sticks. Each piece of hardware is enclosed in a "fence" of styrofoam to hold it in place and keep things organized.

Blinky Matrix

The Blinky Matrix is meant to indicate the state of the robot: going forward, right, left, stopped, etc. The Blinky Matrix came with its own Arduino, which had to be desoldered so the Blinky could be connected to the Arduino we use on the robot, in order to synchronize the states. Programming the Arduino to run patterns on the Blinky Matrix was more challenging than expected: according to the tutorials and examples, each animation is initialized as an object of class Animation, which takes around 40% of the space allocated to global variables. With around 10 different animations, that seriously impaired our original idea. We ended up using a pointer and dynamically reallocating the Animation with a new set of data, using only the space of one object.
