Project Motivation:

This project develops a fleet of coordinated balance bots that communicate with a central server to behave in a prescribed fashion. It was inspired by the Balance Bot lab of ESE 519, in which a single balance bot is programmed and controlled externally to behave in a desired manner. The learning outcomes of that lab can now be applied to coordinate multiple balance bots simultaneously. We used the Vicon Motion Tracking System at the GRASP Lab and Perch at Pennovation to obtain a coordinate system for the space in which the bots are coordinated. Coordinated multi-bot control of this kind has enormous potential in entertainment, logistics, and automation-based real-time environments.

Baseline Goal 1:

The team set and achieved the following goals in the first two weeks:

Week 1:

  • Studied and understood the operation of the Vicon Motion Tracking System.
  • Retrieved the key parameters from the system that govern the motion of a bot in the space.
  • Triangulated the position of a bot in the Vicon space.

Week 2:

  • Remote-controlled leader bot, driven via a ROS client-server interaction between our laptop and the leader.
  • Understood how the position parameters reported by the tracking system vary as the bot moves through the space.

Baseline Goal 2:

Week 3: Second bot introduction

  • Implemented leader-follower logic for the second bot, driving it from the continuously refreshed coordinates fetched from the motion tracking system and the desired destination coordinates.

Reach Goal:

Week 4:

  • Implemented multi-bot train formation (demonstrated with 3 bots).
  • Dynamically introduced multiple bots into the space: each new bot followed the previously added bot, forming a train in which only the first bot is directly controlled.
  • Planned but not executed: other patterns and robo

Hardware effort:

  • The balance bots were taken from Experiment 5 of ESE 519. Reflective markers were placed on each bot so that it could be detected by the motion tracking cameras. A set of 4 markers was placed asymmetrically on each bot so that the tracking system would detect it as a unique object. The tracking environment was restricted to areas where the cameras were functional, so the bot positions were processed seamlessly and without errors.
  • The bots originally shipped with a Debian Linux platform, for which the existing ROS packages are rudimentary; the bots were therefore booted from an external memory card (owing to storage constraints on the BeagleBone) flashed with Ubuntu 16.04 and the ROS Kinetic packages, to establish a ROS workspace on each bot.

Software effort:

The software effort essentially depended on three components:

  • Vicon tracking environment
  • Master System (User Laptop)
  • Slave System (Bots)

The Vicon tracking environment uses its predefined, already-implemented ROS packages, and the nodes running under these packages publish information such as object location, orientation, and velocities. The project used object location and orientation as its primary inputs when computing actions for the bots. The logic was implemented for 3 bots, and therefore 3 data sets were processed.
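Concretely, the per-bot data handling can be kept in a small record like the following sketch. The names and structure here are illustrative assumptions, not the project's actual code:

```cpp
#include <array>
#include <cassert>

// Latest planar pose for one tracked object, as derived from the Vicon
// location and orientation streams (sketch; field names are assumptions).
struct BotPose {
    double x = 0.0, y = 0.0;  // position in the Vicon coordinate system
    double yaw = 0.0;         // heading in radians, extracted from the quaternion
    bool valid = false;       // true once the first sample has arrived
};

// One record per bot: index 0 is the leader, 1 and 2 are the followers.
std::array<BotPose, 3> g_poses;

// Called from each subscriber callback with the freshest sample.
void updatePose(int id, double x, double y, double yaw) {
    g_poses[id] = BotPose{x, y, yaw, true};
}
```

Each guider node then reads the two records it needs (its own bot and the bot ahead of it) on every update cycle.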

Further, a master system (the user laptop), on which ROS_MASTER_URI was set and exported to the bots, contained the main logic-processing nodes for the project. Please refer to the code structure section for the implemented nodes.

The logic for a bot to follow its master had the following elements:

  • Linear: The botguider node continuously checks the distance between the follower and the leader bot using the location data obtained for the two bots. The follower keeps moving toward the leader until it enters the collision-avoidance radius, at which point it is commanded to stop.
  • Rotational: The yaw of each bot is derived from its orientation quaternion. The angle between the follower's heading vector and the vector joining the two bots is calculated from the dot product of the two vectors; this angle indicates when the follower is aligned with the leader. The direction of rotation is given by the sign of the cross product of the same vectors. Together these let the follower align quickly and point exactly at the leader bot.
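The rotational logic above can be sketched in plain C++; the function names are illustrative, and the actual node code is not reproduced in this writeup:

```cpp
#include <cassert>
#include <cmath>

// Yaw (rotation about z) from a quaternion (x, y, z, w) -- the standard
// Euler extraction, which a planar bot's heading reduces to.
double yawFromQuaternion(double x, double y, double z, double w) {
    return std::atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z));
}

// Signed angle (radians) from the follower's heading to the vector pointing
// at the leader: atan2(cross, dot). Positive means "rotate counter-clockwise".
double signedAlignmentAngle(double followerYaw,
                            double fx, double fy,   // follower position
                            double lx, double ly) { // leader position
    double hx = std::cos(followerYaw), hy = std::sin(followerYaw); // heading
    double tx = lx - fx, ty = ly - fy;                             // to leader
    double cross = hx * ty - hy * tx;
    double dot   = hx * tx + hy * ty;
    return std::atan2(cross, dot);
}
```

A result of zero means the follower already points at the leader; the sign of the result selects the turn direction, matching the cross-product check described above.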

Code Structure:

** Please refer to the system structure diagram in the shared images **

VICON PACKAGE (mocap_vicon):

The Vicon package exposes the location and orientation of the objects selected for tracking.

vicon.launch: The Vicon launch file, which uses the ROS environment to publish coordinates for the objects selected in the Vicon motion tracking space. The launch file specifies which objects to analyze. Link to the GitHub Vicon repo:


Relevant code on the master (ROS_MASTER_URI) computer to control the multi-bot operation:

  • CMakeLists.txt - Lists the executables to be built when the package is compiled and the dependencies required by those executables.
  • package.xml - Package information for the "balance bot" package and the dependencies required by the executables in the package.

  • command.cpp - Contains code for a ROS client node publishing key hits on the computer to the Leader/Master bot.

  • botguider.cpp - ROS location/orientation subscriber and client node. Guides the second bot in the line. Inputs from the Vicon environment: the leader bot's location and orientation, and the second bot's location and orientation. Output: instructions to move toward the leader bot, actively checking the collision radius and the direction pointing to the leader.

  • botguider2.cpp - ROS location/orientation subscriber and client node. Guides the third bot in the line. Inputs from the Vicon environment: the second bot's location and orientation, and the third bot's location and orientation. Output: instructions to move toward the second bot, actively checking the collision radius and the direction pointing to the second bot.

  • obstacle.cpp - ROS client node for obstacle detection (early prototype; still needs fixes).
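The per-update decision each guider node makes can be sketched as follows. The command strings and the helper function are illustrative assumptions, not the project's actual instruction.srv fields:

```cpp
#include <cassert>
#include <string>

// Illustrative decision logic for a guider node: given the distance to the
// bot ahead and the signed alignment angle, pick one command per cycle.
std::string decideCommand(double distToLeader, double collisionRadius,
                          double signedAngle, double angleTol) {
    if (distToLeader <= collisionRadius)
        return "stop";            // inside the collision-avoidance radius
    if (signedAngle > angleTol)
        return "rotate_left";     // cross product: leader is to the left
    if (signedAngle < -angleTol)
        return "rotate_right";    // cross product: leader is to the right
    return "forward";             // aligned within tolerance: close the gap
}
```

The chosen command would then be sent to the follower bot's service server over the corresponding instruction service.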

BOT PACKAGE ("edumip_balance_ros" Package on all three bots)

Relevant code in the bots' catkin workspaces to actuate control instructions:

  • edumip_balance_ros_keyboard.cpp - Leader bot ROS server code, communicating with the command node through the "keyhit_info" service and the keycomm.srv service type.
  • follow_1.cpp - Bot 1 ROS server code, communicating with botguider.cpp through the "bot1follower" service and the instruction.srv service type.
  • follow_2.cpp - Bot 2 ROS server code, communicating with botguider2.cpp through the "bot2follower" service and the instruction.srv service type.


Challenges Faced:

  • Understanding the limitations of Cluster SSH for controlling multiple bots: it is difficult to extract unique behavior from each bot.
  • Creating unique marker patterns with a minimum of reflective markers, given the small usable surface area on the bots.
  • Software compatibility issues between ROS and Debian Linux.
  • Debugging the code logic to avoid repeated processing of the same key-hit command.
  • Debugging the code logic for the angle through which a bot had to rotate to align itself with the leader bot.

Project Videos:


Github Reference:

Built With

  • balance-bot
  • beaglebone
  • c++
  • embedded-systems
  • linux
  • motion-capture-system
  • ros
  • ubuntu
  • vicon