I always wondered if I could control something with my moves! It seemed impossible until I discovered Machine Learning. AirMusic generates music from thin air - no instruments required..!

What it does

AirMusic lets you generate music with your moves. Just stand in front of your webcam and dance - the app turns your movements into music..!

How I built it

AirMusic is built on top of PoseNet, a TensorFlow deep learning model that estimates your pose in real time. The screen is divided into 8 parts, and each part is associated with a tone from one of two musical instruments - drums on the left half, piano on the right. When you place your hand in any of these regions, the corresponding tone plays..!
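The hand-to-tone lookup can be sketched as a pure function. This is a minimal sketch, assuming a 2x2 grid per half and illustrative tone names (the real app feeds in wrist keypoint coordinates from PoseNet's output):

```javascript
// Illustrative tone names - not necessarily the ones AirMusic uses.
const TONES = {
  drums: ['kick', 'snare', 'hihat', 'tom'], // left half of the screen
  piano: ['C4', 'E4', 'G4', 'C5'],          // right half of the screen
};

// Map a hand position (e.g. a PoseNet wrist keypoint) to one of the
// 8 regions: the x coordinate picks the instrument, and the position
// within that half picks one of its 4 tones.
function toneForHand(x, y, width, height) {
  const instrument = x < width / 2 ? 'drums' : 'piano';
  const col = Math.floor((x % (width / 2)) / (width / 4)); // 0 or 1
  const row = y < height / 2 ? 0 : 1;
  return { instrument, tone: TONES[instrument][row * 2 + col] };
}
```

On each video frame, the app would run pose estimation, call something like `toneForHand` on the wrist coordinates, and trigger the returned tone through the music library.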

Challenges I ran into

First, I tried the model in an Android application, but it didn't work as expected, so I shifted to TensorFlow.js. The most difficult part was understanding the model itself, since I am not a Machine Learning Engineer and time was limited - 5 days. Finding an efficient, lightweight music library was also a challenge, as was associating the hand coordinates with the corresponding music.

Accomplishments that I'm proud of

Built the project in 5 days (and nights :))

What I learned

Machine Learning - The best way to learn anything is not going through the docs, but building something. And when time is limited, you explore every possible way to finish the thing without getting into too much detail. As Parkinson's law says, "Work expands so as to fill the time available for its completion."

What's next for AirMusic

The next thing would be improving the efficiency and adding support for more musical instruments.

Built With

tensorflow.js, posenet
