Everyone loves music, and we all wish we could magically produce beautiful music with our hands at the level of Mozart or Beethoven. However, not all of us have had the luxury of years and years of classical training, and, as anyone who has tried can attest, learning to master an instrument is hard.

However, Jazz Hands completely shatters that assumption. Using computer vision, we have developed a web-based application that lets users generate music simply by moving their hands, empowering everyone to create the beautiful tones that were once limited to their imagination.

Built on React and Flask, Jazz Hands uses the OpenCV-python library to track hand movements and maps those movements to sound produced with JavaScript audio libraries.
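To make the architecture concrete, here is a minimal sketch of how the browser and Flask could hand off a webcam frame for OpenCV processing. The endpoint name, the "frame" form field, and the JSON response shape are illustrative assumptions, not the project's actual API.

```python
# Minimal sketch of a frame-processing endpoint (assumed names, not the real API).
import cv2
import numpy as np
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/process_frame", methods=["POST"])
def process_frame():
    # The browser posts one webcam frame as a JPEG blob.
    data = np.frombuffer(request.files["frame"].read(), dtype=np.uint8)
    frame = cv2.imdecode(data, cv2.IMREAD_COLOR)

    # Hand detection would run here (see the filtering sketch below);
    # for brevity this just returns the frame dimensions.
    h, w = frame.shape[:2]
    return jsonify({"width": w, "height": h})

if __name__ == "__main__":
    app.run(port=5000)
```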

We faced many challenges throughout the process, including developing a filtering schema to isolate hands and fingertips from webcam frames, efficiently streaming webcam footage to the server for processing, and designing an experience that felt as natural as listening to music.
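One common way to build such a filtering schema with OpenCV is to threshold skin tones in HSV, take the largest contour, and treat convex-hull vertices as candidate fingertips. The sketch below follows that approach; the HSV bounds and the helper name are assumptions for illustration and would need tuning to real lighting conditions.

```python
# Illustrative fingertip-isolation sketch (not the project's exact schema).
import cv2
import numpy as np

def find_fingertips(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)

    # Rough skin-tone mask; real lighting would require calibration.
    lower, upper = np.array([0, 30, 60]), np.array([20, 150, 255])
    mask = cv2.inRange(hsv, lower, upper)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return []

    # Assume the largest skin-colored blob is the hand.
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand)

    # Hull vertices serve as rough fingertip candidates.
    return [tuple(pt[0]) for pt in hull]
```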

We are really excited to share our hard work with all of you, and we hope many more people can now enjoy the wondrous gift of music.
