Inspiration

Since Rosie the Maid first appeared on television in the 1960s, robotics has made significant advancements, but more than 50 years on we still don't have Rosie. We only have Roomba.

This team wanted to build something that could be welcomed into households alongside the Roomba and provide a particular benefit to the user that could not be fulfilled by the monopolising smartphone.

With limited time and money, we knew that the only advantages our robot could possess would be in mobility and automation. This gave us our audience: people with impaired mobility (elderly, disabled, injured). But how could it help these people? What task could our humble robot perform for them?

What it does

Our robot plays with and exercises a pet dog.

Built with a hacked-together drinking cup, Nerf gun, screw, and servo solution, our robot can store and fire a tennis ball for a dog to retrieve. It fires about 1.5 metres!

The robot can also drive and turn, though it is a little overencumbered by its own weight.

{Not yet integrated with robot} On board the robot's Nvidia Jetson brain, it recognises tennis balls and tracks their movement.

{Not yet implemented} The robot was also supposed to interface with Google Maps in order to navigate itself to parks or around retirement/disability villages.

How I built it

The robot was built pretty much from scratch (no kit), and many of its elements were 'hacked together', since we were unable to get to a store when extra parts were needed. The driver chip was built on-site.

The computer vision tennis ball recognition was built by compiling (with difficulty, unfortunately) the CUDA Toolkit and the OpenCV library on the Nvidia Jetson. No machine-learning techniques were used for recognition. Instead, colour and contour detection identify tennis balls.

Challenges I ran into

Lack of as-needed access to hardware parts put pressure on the hardware guys to creatively cobble together alternate solutions or 'mock-parts'.

Ethernet, which was providing internet to four of our development devices, was cut off at 9pm, and much time was spent trying to regain internet access for the Nvidia Jetson. Eventually we had to leave the building and find Ethernet elsewhere.

Installing and compiling unfamiliar, legacy libraries on Linux is really hard when you suck at Linux, but I suck a little less now. We spent over 5 hours compiling and recompiling. Eugh.

Trying to write computer vision code when you have never seen it before is hard.

Accomplishments that I'm proud of

The group members worked for an average of 23.5 of the 24 hours, constantly troubleshooting in a problem space they were unfamiliar or only moderately familiar with.

No chips were fried in the 24 hours.

What I learned

Your development environment should be running, and familiar to you, before you rock up. 12pm to 12pm is not enough time for serious hardware builds. The Nvidia Jetson does not play nice with WiFi, and that sucks when Lab14 turns Ethernet off at 9pm.

3rd-Party

- OpenCV for Python (cv2)
- OpenCV for Tegra
- CUDA 6.5 Toolkit
- imutils
- NumPy
- Nvidia Jetson
- Raspberry Pi 3

Credit To: http://elinux.org/Jetson/Tutorials/OpenCV
Credit To: http://pleasingsoftware.blogspot.com.au/2014/06/identifying-balloons-using-computer.html
Credit To: http://www.pyimagesearch.com/2015/09/14/ball-tracking-with-opencv/
