Inspiration

Our team aimed to build a project mixing software and hardware, where we could have fun while also applying our skills to help people.

What it does

A turret that targets and shoots at people based on where the user points. The website we set up also lets users control the turret manually instead of relying on the tracking camera.

How we built it

Our project runs on a Raspberry Pi. For the machine learning we used OpenCV with C and Python. The turret parts were modelled, 3D printed, and assembled.
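As a rough sketch of how the camera output could drive the turret (our exact control code isn't reproduced here), mapping a detected target's pixel position to pan/tilt servo angles might look like the following. The frame size, field of view, neutral servo angles, and function name are all illustrative assumptions, not values from the project:

```python
# Hypothetical sketch: map a target's pixel position in the camera frame
# to pan/tilt servo angles. All constants below are assumed, not measured.

FRAME_W, FRAME_H = 640, 480           # assumed camera resolution
FOV_H, FOV_V = 60.0, 45.0             # assumed field of view (degrees)
CENTER_PAN, CENTER_TILT = 90.0, 90.0  # assumed servo neutral positions


def pixel_to_servo(x: int, y: int) -> tuple[float, float]:
    """Convert a pixel coordinate (x, y) to (pan, tilt) servo angles."""
    # Offset from the frame centre, as a fraction of half the frame
    dx = (x - FRAME_W / 2) / (FRAME_W / 2)
    dy = (y - FRAME_H / 2) / (FRAME_H / 2)
    pan = CENTER_PAN + dx * (FOV_H / 2)
    tilt = CENTER_TILT - dy * (FOV_V / 2)  # image y grows downward
    return pan, tilt


# A target dead-centre in the frame leaves both servos at neutral
print(pixel_to_servo(320, 240))  # (90.0, 90.0)
```

On the Pi itself, the resulting angles would then be fed to whatever servo PWM library the build uses.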

Challenges we ran into

The main challenges we encountered were the 3D printers taking too long to print the parts we needed, and getting the turret's motion working. The software we used to stream the webcam feed to the web interface was also very outdated, which caused noticeable camera lag.

Accomplishments that we're proud of

We managed to get our software to recognise different hand gestures and map each one to an action, such as shoot. We also designed a 3D CAD model showing all of our hardware's mechanisms.
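The gesture-to-action mapping above can be sketched as a simple dispatch table. The gesture labels and actions here are illustrative placeholders, not our actual model's output classes:

```python
# Hypothetical sketch: dispatch recognised hand gestures to turret actions.
# Gesture names and actions are illustrative assumptions.

GESTURE_ACTIONS = {
    "open_palm": "stop",
    "fist": "shoot",
    "point": "track",
}


def action_for(gesture: str) -> str:
    """Return the turret action for a recognised gesture, or 'idle'."""
    return GESTURE_ACTIONS.get(gesture, "idle")


print(action_for("fist"))     # shoot
print(action_for("unknown"))  # idle
```

Keeping recognition and control separated this way means new gestures only require a dictionary entry, not changes to the turret code.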

What we learned

In the future, we know to schedule 3D printing earlier in the hackathon so it has time to finish before the deadline.

What's next for Gesture Turret

Completing the 3D build so the turret can function as it was meant to.
