As young musicians, we were often encouraged to express our emotions by thinking of the various colors and textures of the piece we were learning. We were inspired by the perceptual phenomenon of synesthesia and wanted our users to explore this aspect of musical experience. Hence the name Con Moto, an Italian term often used in music meaning "with motion."

What it does

We pull song data from EchoNest and use its audio attributes to calibrate the sliders and the initial color scheme of the Canvas visualizer. Users can then move the sliders to play with the visualizer's colors as well as the audio's volume, saturation, and biquad filtering.
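As a rough sketch of how track attributes could seed the initial scheme, the function below maps Echo Nest-style summary values to an HSL color and starting slider positions. The attribute names (valence, energy, tempo) follow the Echo Nest audio summary, but the mapping itself is just an illustrative assumption, not our exact formula:

```javascript
// Sketch: map Echo Nest-style track attributes to an initial HSL color
// and slider positions. The specific mapping here is a hypothetical
// example, not the exact calibration we shipped.
function initialScheme(track) {
  // valence (0..1) picks the hue: low valence toward blue, high toward red
  const hue = Math.round((1 - track.valence) * 240); // 0 = red, 240 = blue
  // energy (0..1) drives color saturation
  const saturation = Math.round(track.energy * 100); // percent
  // tempo (BPM) sets a starting filter-slider position, clamped to 0..1
  const tempoSlider = Math.min(Math.max((track.tempo - 60) / 120, 0), 1);
  return {
    color: `hsl(${hue}, ${saturation}%, 50%)`,
    sliders: { volume: 0.8, saturation: track.energy, filter: tempoSlider },
  };
}

// Example: a calm, low-energy track lands in the cool, desaturated range
const scheme = initialScheme({ valence: 0.25, energy: 0.4, tempo: 90 });
console.log(scheme.color); // "hsl(180, 40%, 50%)"
```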

How we built it

We used the Web Audio API to manipulate the audio and EchoNest to pull the song data. Our framework relied on JavaScript and CSS.
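A minimal sketch of the slider-to-audio wiring: the two pure functions below turn a 0..1 slider position into Web Audio parameter values, and the comment shows how they would feed a GainNode and a BiquadFilterNode in the browser. The curve shapes and frequency range are assumptions for illustration, not our exact settings:

```javascript
// Sketch: translate a 0..1 slider position into Web Audio parameter values.
// The mapping functions are pure so they can be tested outside the audio
// graph; the specific curves and ranges are assumed, not our shipped values.
function sliderToGain(pos) {
  // Perceived loudness is roughly logarithmic, so square the slider value
  // rather than mapping it linearly.
  return pos * pos;
}

function sliderToFilterFreq(pos, min = 40, max = 18000) {
  // Sweep the cutoff exponentially from min to max Hz so equal slider
  // movements feel like equal pitch steps.
  return min * Math.pow(max / min, pos);
}

// In the page (browser-only, not runnable in Node), the values would be
// applied to the audio graph roughly like this:
//   const ctx = new AudioContext();
//   const gain = ctx.createGain();
//   const filter = ctx.createBiquadFilter();
//   filter.type = "lowpass";
//   source.connect(filter).connect(gain).connect(ctx.destination);
//   gain.gain.value = sliderToGain(volumeSliderPos);
//   filter.frequency.value = sliderToFilterFreq(filterSliderPos);

console.log(sliderToFilterFreq(0)); // 40 (slider fully down)
console.log(sliderToFilterFreq(1)); // 18000 (slider fully up)
```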

Challenges we ran into

Apparently, many music APIs do not give you direct access to the actual audio. Manipulating the audio was quite difficult given the complexity of its data representation. Coming up with unique interactions was also difficult.

Accomplishments that we're proud of

Our demo works! Ideally, we'd have more audio to work with and more features to incorporate, but we are really proud of how much we've been able to integrate in the past 36 hours.

What we learned

Sometimes you have to go down the wrong path about four times before you realize what you actually want.

What's next for Con Moto

It would be cool to add more interactive elements to our visualization. Manipulating geometric shapes to produce different musical effects is definitely something we'd like to develop in the future.
