Like many university students, we know the turbulence that comes with relocating every four months for co-op terms while keeping personal spending to a minimum. Students need an affordable, low-effort way to relieve stress and keep up their hobbies during these pressing stretches of student life.
What it does
AirDrum uses computer vision to mirror a standard drum set without heavy equipment or high costs, and it can be played in any environment.
How we built it
We built the entire project in Python, using NumPy, OpenCV, Matplotlib, Pygame, and the standard-library winsound module.
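As a rough sketch of how these pieces fit together, the core idea can be reduced to frame differencing: a "hit" registers when pixel change inside a drum zone exceeds a threshold. The helper below is hypothetical (the project's actual detection code may differ) and uses NumPy only so it stays self-contained; in practice the frames would come from OpenCV's `cv2.VideoCapture` and the sound would be triggered through Pygame or winsound.

```python
import numpy as np

def detect_hit(prev_frame, frame, zone, threshold=25.0):
    """Return True if the average pixel change inside `zone` exceeds `threshold`.

    `zone` is (y0, y1, x0, x1) in pixel coordinates; frames are grayscale
    NumPy arrays, e.g. OpenCV frames converted with cv2.cvtColor.
    """
    y0, y1, x0, x1 = zone
    # Cast to a signed type before subtracting so uint8 values don't wrap.
    diff = np.abs(frame[y0:y1, x0:x1].astype(np.int16)
                  - prev_frame[y0:y1, x0:x1].astype(np.int16))
    return float(diff.mean()) > threshold

# Simulated 100x100 grayscale frames: a bright "drumstick" blob enters
# the top-left zone between two consecutive frames.
prev = np.zeros((100, 100), dtype=np.uint8)
curr = prev.copy()
curr[10:40, 10:40] = 255

print(detect_hit(prev, curr, (0, 50, 0, 50)))      # motion in zone -> True
print(detect_hit(prev, curr, (50, 100, 50, 100)))  # quiet zone -> False
```

In the real loop, each zone that fires would play its drum sample, and the current frame becomes `prev_frame` for the next iteration.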
Challenges we ran into
The documentation for OpenCV is less robust than we had hoped, which led to a lot of deep dives on Stack Overflow.
Accomplishments that we're proud of
We're really happy that we took the project from idea to a working build.
What we learned
It was our first time working with OpenCV, so we learned a lot about the library and how it works in conjunction with NumPy.
What's next for AirDrums
The next step for AirDrums is to add more functionality: letting users choose which drum parts they want and save the beats they create. We also envision a Guitar Hero-style mode where users could try to play the drum part of a song or two. We could also expand to other instruments.