Inspiration

We were inspired to make a visual audio synthesizer by the art installation Cloud Music by Robert Watts, David Behrman, and Bob Diamond (seen here and here). While their piece was intended to soundtrack the gentle, drifting pace of the clouds, ours lets the user point it at anything they would like, such as a video, a busy highway, or, yes, even the sky.

What it does

cmpp reads in visual data from the user's laptop, focuses on six points in the frame, and generates a tone based on the color value of each point.
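
The core loop can be sketched in a few lines of p5.js with the p5.sound addon. This is a minimal illustration rather than the actual cmpp source: it samples six evenly spaced pixels from the webcam frame on each draw() call and maps each pixel's brightness to the frequency of a sine oscillator. The point spacing, the brightness-to-frequency mapping, and names like NUM_POINTS are our own assumptions.

```javascript
const NUM_POINTS = 6;   // cmpp samples six points from the frame
let capture;            // webcam feed
let oscillators = [];   // one sine oscillator per sampled point

function setup() {
  createCanvas(640, 480);
  capture = createCapture(VIDEO);
  capture.hide();

  // One oscillator per point; keep them silent until audio is unlocked.
  for (let i = 0; i < NUM_POINTS; i++) {
    const osc = new p5.Oscillator();
    osc.setType('sine');
    osc.amp(0);
    osc.start();
    oscillators.push(osc);
  }
}

function mousePressed() {
  // Browsers require a user gesture before audio can play.
  userStartAudio();
}

function draw() {
  image(capture, 0, 0, width, height);

  for (let i = 0; i < NUM_POINTS; i++) {
    // Spread the sample points evenly across the middle of the frame.
    const x = Math.floor(((i + 0.5) / NUM_POINTS) * width);
    const y = Math.floor(height / 2);

    // get() returns [r, g, b, a] for the pixel drawn on the canvas.
    const c = get(x, y);
    const level = (c[0] + c[1] + c[2]) / 3;

    // Map brightness (0-255) onto an audible frequency range.
    oscillators[i].freq(map(level, 0, 255, 110, 880));
    oscillators[i].amp(0.1, 0.05);

    // Mark the sampled point.
    noFill();
    stroke(255);
    circle(x, y, 12);
  }
}
```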

How we built it

p5.js, React, Node.js, and a whole lot of pivoting

Challenges we ran into

Initially, we tried making this a mobile app, but the framework we were using proved too limited for our needs. It was even more unfortunate that we discovered this with only hours to go! Pivoting to a web app came with its own problems, mainly wrangling React and getting components to talk to each other.

Accomplishments that we're proud of

Managing to make something with p5 and music production, two things we'd never tried before!

What we learned

p5 is pretty easy but not as easy as it seems. Also, it's hard to make sine waves sound pretty.

What's next for cmpp (#17)

MIDI integration: reading MIDI files for playback and generating MIDI files for use in audio production. We'd also like to integrate traditional video analysis tools, such as playing back and processing a YouTube video. A rough sketch of the MIDI-generation idea follows.
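
As an illustration of what MIDI generation could look like in Node, here is a small sketch using the midi-writer-js package. The package choice, the sample brightness values, and the brightnessToPitch mapping are assumptions for illustration only; none of this is in cmpp today.

```javascript
// Sketch only: turn a handful of brightness samples into a MIDI file.
const fs = require('fs');
const MidiWriter = require('midi-writer-js');

// Hypothetical brightness samples (0-255), one per captured point/frame.
const samples = [40, 120, 200, 90, 255, 30];

// Map a brightness level onto a small pentatonic scale.
function brightnessToPitch(level) {
  const scale = ['C4', 'D4', 'E4', 'G4', 'A4', 'C5'];
  return scale[Math.floor((level / 256) * scale.length)];
}

const track = new MidiWriter.Track();
samples.forEach((level) => {
  track.addEvent(new MidiWriter.NoteEvent({
    pitch: [brightnessToPitch(level)],
    duration: '4',
  }));
});

// Write a standard MIDI file that a DAW can import.
const writer = new MidiWriter.Writer(track);
fs.writeFileSync('cmpp-output.mid', Buffer.from(writer.buildFile()));
```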

Built With

node.js
p5.js
react