"Music is a higher revelation than all wisdom and philosophy. Music is the electrical soil in which the spirit lives, thinks and invents." —Ludwig van Beethoven

Music is the sonic embodiment of culture. Our project explores how our sensory experiences are influenced and augmented by music. We draw on a tradition from Indian classical music that stands in stark contrast to the Western classical tradition. As Chloë Alaghband-Zadeh describes in her paper "Listening to North Indian Classical Music: How Embodied Ways of Listening Perform Imagined Histories and Social Class," expert listeners known as rasikas use physical hand gestures to express enjoyment during live Indian classical performances. In the Western classical tradition, by contrast, only the conductor and the musicians themselves are expected to engage physically with the music. We seek to bring this physical tradition to our modes of listening to Western music.

What it does

Our project provides joy.

It allows users to "conduct" a piece of music in real time, with the music responding to the rate and intensity of their hand motions.

How we built it

Our project had three parts: a computer vision algorithm, a real-time audio manipulation algorithm, and a graphical user interface.

The first component was a vision algorithm to track the motion of a hand or a magic wand baton. We used SimpleCV to detect motion and apply various filters (eroding, thresholding, blob finding) to find the position of the hand or magic wand baton. From the tracked positions, our project then computed the tempo of the user's conducting in beats per minute (BPM).
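The write-up doesn't show the tempo calculation itself; a minimal sketch of turning detected beat timestamps (e.g. the moments the tracked hand reverses direction) into a BPM estimate might look like this (the function name and inputs are hypothetical):

```python
def bpm_from_beat_times(beat_times):
    """Estimate BPM from a list of beat timestamps, in seconds."""
    if len(beat_times) < 2:
        return None  # need at least two beats to measure an interval
    # average time between consecutive detected beats
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    avg_interval = sum(intervals) / len(intervals)
    return 60.0 / avg_interval

# Beats detected every 0.5 seconds -> 120 BPM
print(bpm_from_beat_times([0.0, 0.5, 1.0, 1.5]))  # 120.0
```

Averaging over the last few intervals (rather than just the most recent one) smooths out jitter in the vision detections.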

The second component was real-time audio manipulation. This allowed users to speed up or slow down a song in real time with their hand motions, without changing its pitch. We used PyDub and PyAudio to accomplish this.
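We don't have the team's exact PyDub/PyAudio code, but the core trick of changing speed without changing pitch can be illustrated with a bare-bones overlap-add (OLA) time-stretch in NumPy: overlapping windowed frames are read at one hop size and written back at another, so frame contents (and thus pitch) are preserved while overall duration changes. (PyDub's `speedup` effect works on a similar chunking principle.)

```python
import numpy as np

def time_stretch(x, rate, frame=1024):
    """Stretch signal x by `rate` (> 1 = faster/shorter) via overlap-add.

    Unlike plain resampling, this keeps the pitch roughly unchanged,
    because each output frame is an unmodified slice of the input.
    Assumes rate > 0; a real implementation (e.g. WSOLA) would also
    align frames to avoid phase artifacts.
    """
    hop_out = frame // 2              # synthesis hop (fixed)
    hop_in = int(hop_out * rate)      # analysis hop (scaled by rate)
    window = np.hanning(frame)
    n_frames = max(1, (len(x) - frame) // hop_in + 1)
    out = np.zeros(n_frames * hop_out + frame)
    for i in range(n_frames):
        seg = x[i * hop_in : i * hop_in + frame]
        out[i * hop_out : i * hop_out + len(seg)] += seg * window[: len(seg)]
    return out
```

With `rate=2.0` the output is roughly half as long as the input; with `rate=0.5` it is roughly twice as long, in both cases at the original pitch.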

The third component was a graphical user interface made in Tkinter. The interface allows a user to select a song, view its BPM (retrieved via the Spotify API), and press play. It also displays the user's hand motions and gives feedback via a visual metronome.
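As an illustration of the BPM lookup, Spotify's Web API exposes an "audio features" endpoint that returns a `tempo` field in BPM for a track. A sketch of building and decoding that request (the `track_id` and `access_token` values are placeholders; no network call is made here):

```python
def audio_features_request(track_id, access_token):
    """Build the URL and headers for Spotify's audio-features endpoint."""
    url = "https://api.spotify.com/v1/audio-features/%s" % track_id
    headers = {"Authorization": "Bearer %s" % access_token}
    return url, headers

def tempo_from_features(features):
    """Extract the BPM from a decoded audio-features JSON response."""
    return features.get("tempo")

# e.g. with the `requests` library:
#   url, headers = audio_features_request("TRACK_ID", "ACCESS_TOKEN")
#   features = requests.get(url, headers=headers).json()
#   bpm = tempo_from_features(features)
```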

Challenges we ran into

One of the challenges we ran into was conceptualizing and setting up multithreading across the audio manipulation/sound playback, camera input/vision algorithm, and the graphical user interface. We used Python's threading module to accomplish this. We also had a bit of trouble preventing the pitch of a sound from shifting as its playback speed changed.
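A minimal sketch (not the project's actual code) of that threading layout: a vision thread publishes BPM estimates onto a `queue.Queue`, an audio thread consumes them, and in the real app the main thread would run Tkinter's `mainloop()`. The queue handles the cross-thread handoff safely, and a shared `Event` signals shutdown.

```python
import queue
import threading
import time

bpm_updates = queue.Queue()

def vision_loop(stop):
    """Stand-in for the camera/vision thread: publish BPM estimates."""
    while not stop.is_set():
        bpm_updates.put(120.0)  # a real estimate would come from the tracker
        stop.wait(0.01)

def audio_loop(stop, received):
    """Stand-in for the audio thread: react to each new BPM value."""
    while not stop.is_set():
        try:
            received.append(bpm_updates.get(timeout=0.05))
        except queue.Empty:
            pass  # no update yet; check the stop flag and try again

stop = threading.Event()
received = []
workers = [threading.Thread(target=vision_loop, args=(stop,)),
           threading.Thread(target=audio_loop, args=(stop, received))]
for w in workers:
    w.start()
time.sleep(0.1)   # let the threads exchange a few updates
stop.set()
for w in workers:
    w.join()
```

Using `get(timeout=...)` rather than a blocking `get()` keeps the consumer thread responsive to the stop flag, which avoids hangs on shutdown.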

Accomplishments that we're proud of

Integrating with the Spotify API was quite fun! We are happy that the final project works! :)

What we learned

We learned to use several Python libraries (SimpleCV, Tkinter, PyDub, PyAudio), as well as multithreading in Python. This was the first group programming project for one of our members, so we also learned about version control and about splitting up and integrating project components.

What's next for MusicMagic

World domination (:

Built With

Python, SimpleCV, Tkinter, PyDub, PyAudio, Spotify API
