Inspiration

We wanted to use camera-based tracking to steer a vehicle.

What it does

As we move a handheld object in front of a camera, the vehicle drives itself in the direction of the object's movement.

How we built it

Our original idea was a moving trash can that adjusts its position to 'catch' trash thrown toward it. With this goal in mind, we worked on three parts:

  • The code that tracks a 'piece of trash' and moves the wheel motors accordingly
  • The electronics: connecting the Raspberry Pi, Arduino, and motors
  • The wooden base on which the trash can would rest and the electronics and motors would be mounted

Along the way, we changed the design so that the device is guided by the object's movement rather than trying to catch the object. This also gave it applications beyond just a 'trash can.'

We used the OpenCV library to write code that recognizes the movement of a red ball in front of a laptop camera and drives the motors accordingly. We connected the motors to a circuit comprising the Raspberry Pi, Arduino, and a step-up motor driver. Finally, we designed a wooden base and mounts and attached the electronics to them.
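A minimal sketch of the tracking loop described above: mask out red pixels in HSV space, find the largest blob, and turn its horizontal motion into a direction. The HSV bounds, jitter threshold, and printed commands are illustrative assumptions (and it assumes OpenCV 4's `findContours` signature), not our exact values:

```python
# Sketch: track a red ball with OpenCV and derive a left/right direction.
# HSV bounds and the movement threshold are illustrative guesses.
import cv2
import numpy as np

LOWER_RED = np.array([0, 120, 70])     # assumed HSV lower bound for red
UPPER_RED = np.array([10, 255, 255])   # assumed HSV upper bound for red

def ball_center(frame):
    """Return (x, y) of the largest red blob, or None if nothing is found."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_RED, UPPER_RED)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)
    m = cv2.moments(c)
    if m["m00"] == 0:
        return None
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

cap = cv2.VideoCapture(0)
prev = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    center = ball_center(frame)
    if center and prev:
        dx = center[0] - prev[0]
        if abs(dx) > 20:                        # ignore small jitter
            direction = "LEFT" if dx < 0 else "RIGHT"
            print(direction)                    # in our setup this becomes a motor command
    prev = center
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```

In the actual device, the detected direction is forwarded to the Arduino, which switches the motor driver outputs.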

Challenges we ran into

Figuring out how to track the object was a challenge, as was controlling high-speed motors from the Raspberry Pi. At times, we had to work around pieces of equipment that weren't available.

Accomplishments that we're proud of

Completing the project after trying multiple approaches, and successfully interfacing software with hardware.

What we learned

We learned about working with electronics, especially interfacing the Raspberry Pi with the Arduino.
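One way to picture that interface: the Pi sends short commands to the Arduino over USB serial, and the Arduino drives the motor outputs. This sketch uses pyserial; the port name, baud rate, and one-letter command protocol ('L', 'R', 'S') are assumptions for illustration, not our exact setup:

```python
# Sketch: send direction commands from the Raspberry Pi to the Arduino
# over USB serial. Port, baud rate, and command letters are assumed.
import serial
import time

arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1)
time.sleep(2)  # give the Arduino time to reset after the port opens

def send_direction(direction):
    """Send a one-letter command; the Arduino maps it to motor outputs."""
    commands = {"LEFT": b"L", "RIGHT": b"R", "STOP": b"S"}
    arduino.write(commands[direction])

send_direction("LEFT")
time.sleep(1)
send_direction("STOP")
arduino.close()
```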

What's next for TracMan

We are working to advance TracMan from responding to a tracked object's movement to responding to the user's hand movement.
