Inspiration
The entire team has been pursuing computer engineering for a few years and wanted to develop a project that combined computer vision with robotic movement. HelperBot let us employ both techniques while allowing users to interact through a Flask web server.
What it does
The HelperBot rover was developed to assist individuals who have difficulty carrying their belongings: it carries the load and follows its user by recognizing a high-visibility vest. With its robust tracks, the rover is designed to operate indoors, outdoors, and in otherwise 'unfriendly' environments.
HelperBot supports both manual and automatic control. Manual driving uses keyboard WASD input, arrow keys, or button presses on the website, while automatic tracking leverages computer vision to follow the high-visibility vest.
How we built it
This rover was built using the following hardware:
a Rover 5 chassis, a Raspberry Pi 5, an HC-SR04 ultrasonic sensor, a 9.6 V 2000 mAh battery, an L298N H-bridge motor driver, and an Anker portable charger.
This rover was also built using the following software:
a Flask web server was deployed so users can drive the rover and view the camera output. It leverages Tailscale to link IP addresses, allowing up to ten devices to control the rover over a VPN/peer-to-peer mesh network. The project was coded in Python, HTML, and CSS; it uses the gpiozero library to control the ultrasonic sensor and motors, and the OpenCV library for vest tracking. Git was used for version control.
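The automatic-follow behaviour ties the vision output to motor commands, gated by the ultrasonic sensor so the rover stops before bumping into its user. A minimal sketch of that decision logic, with entirely hypothetical frame width, dead-zone, and stop-distance values (the project's real parameters are not given):

```python
# Hypothetical follow-control logic: given the vest's bounding box from
# the camera and the ultrasonic distance reading, pick a motor command.
FRAME_WIDTH = 640      # assumed camera frame width in pixels
CENTRE_BAND = 80       # dead-zone (px) around the frame centre
STOP_DISTANCE = 0.5    # metres; halt before reaching the user

def follow_command(vest_box, distance_m):
    """Return 'stop', 'left', 'right', or 'forward'.

    vest_box is (x, y, w, h) from the vision step, or None if no vest
    is visible; distance_m is the ultrasonic sensor reading in metres.
    """
    if vest_box is None or distance_m < STOP_DISTANCE:
        return "stop"
    x, _, w, _ = vest_box
    centre_x = x + w / 2
    if centre_x < FRAME_WIDTH / 2 - CENTRE_BAND:
        return "left"
    if centre_x > FRAME_WIDTH / 2 + CENTRE_BAND:
        return "right"
    return "forward"
```

In the real rover, each command would map to gpiozero motor calls through the L298N H-bridge; the thresholds above would be tuned on the hardware.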
