Inspiration

A sudden spark partway through the hackathon: idly drumming out music on the table made us stumble upon this idea.

What it does

It uses computer vision (OpenCV) to let a user play a virtual drum set, using two pens (one with a pink cap, one with a green cap) as drum sticks.

How we built it

We used Python and OpenCV, filtering each frame for the specific color ranges of the pen caps to follow their motion across the screen. We used pygame to play the appropriate .ogg drum samples whenever a drum stick collided with the image of a drum on screen.
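As a rough illustration of how this kind of color tracking and sound triggering can be wired together, here is a minimal sketch. The HSV ranges, sample file names, and drum hit-zone coordinates are our own illustrative placeholders, not the project's actual values.

```python
# Sketch: track colored pen caps with OpenCV and trigger pygame samples
# when a cap enters a drum's hit zone. All constants below are placeholders.
import cv2
import numpy as np
import pygame

pygame.mixer.init()
snare = pygame.mixer.Sound("snare.ogg")   # hypothetical sample file
hihat = pygame.mixer.Sound("hihat.ogg")   # hypothetical sample file

# Drum hit zones as (x, y, w, h) rectangles in the camera frame
DRUMS = [
    {"rect": (50, 300, 150, 120), "sound": snare},
    {"rect": (400, 300, 150, 120), "sound": hihat},
]

# Approximate HSV ranges for the pink and green pen caps (tune for lighting)
COLOR_RANGES = [
    (np.array([140, 80, 80]), np.array([170, 255, 255])),   # pink-ish
    (np.array([40, 80, 80]), np.array([80, 255, 255])),     # green-ish
]

def find_cap(hsv, lo, hi):
    """Return the centroid of the largest blob in the given HSV range, or None."""
    mask = cv2.inRange(hsv, lo, hi)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)
    if cv2.contourArea(c) < 200:          # ignore tiny noise blobs
        return None
    x, y, w, h = cv2.boundingRect(c)
    return (x + w // 2, y + h // 2)

cap = cv2.VideoCapture(0)
was_inside = [[False] * len(DRUMS) for _ in COLOR_RANGES]  # edge-trigger state

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)                        # mirror for a natural feel
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

    for i, (lo, hi) in enumerate(COLOR_RANGES):
        pos = find_cap(hsv, lo, hi)
        for j, drum in enumerate(DRUMS):
            x, y, w, h = drum["rect"]
            inside = pos is not None and x <= pos[0] <= x + w and y <= pos[1] <= y + h
            if inside and not was_inside[i][j]:       # play on entry, not every frame
                drum["sound"].play()
            was_inside[i][j] = inside
            cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 255, 255), 2)

    cv2.imshow("Air Drums", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The edge-trigger check (play only when a cap first enters a hit zone) is one simple way to avoid retriggering a sample on every frame the stick stays over a drum.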

Challenges we ran into

Too many to count. From formatting the sound files and getting them to play at all, to trimming them so the timing sounded precise, to getting our front-end and back-end working together, we faced problems at every step of the way.

Accomplishments that we're proud of

Completing this project fully, a first for us as a team at a hackathon. The idea seemed so wild and alien when we first thought of it that we never expected to actually finish it in time.

What we learned

Working with computer vision, and using Flask for the front-end. We also picked up several smaller lessons about the internal workings of commonly used libraries, along with a whole lot of experience building self-contained apps.
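For context, here is a minimal sketch of how a Flask front-end could hand off to the tracking loop. The route names and the run_air_drums() helper are hypothetical illustrations, not the project's actual interface.

```python
# Sketch: a tiny Flask front-end that starts the OpenCV/pygame loop on demand.
from flask import Flask
import threading

app = Flask(__name__)

def run_air_drums():
    """Placeholder for the OpenCV/pygame loop sketched above."""
    pass

@app.route("/")
def index():
    return "<h1>Air Drums</h1><a href='/start'>Start playing</a>"

@app.route("/start")
def start():
    # Run the camera loop in a background thread so the web server stays responsive
    threading.Thread(target=run_air_drums, daemon=True).start()
    return "Drums started"

if __name__ == "__main__":
    app.run(debug=True)
```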

What's next for Air Drums

Mac compatibility, which remains a challenge because the project currently relies on Windows-specific libraries. We also want to make the tracking more efficient, minimize background distractions, and build an improved GUI that lets the user track just about anything as their drum sticks instead of specific colors.

Built With

opencv, pygame, python, flask
