Inspiration

We love going to music events and DJ performances because the atmosphere at a live event is far more exciting than listening to music anywhere else. While it is mostly about the music, the visual surroundings have a huge impact on the atmosphere. Most stages have a display that goes unused unless the artist brings someone to run it (a video jockey, or VJ). We want to step in and make it easy for everyone to create visuals for live music events.

What it does

Our application analyzes a track's tempo, bass intensity, and other metrics to generate synchronized animated visuals in real time. It is not meant to replace creative people who invest hours in complex visuals before a show; our goal is not to replace creativity with AI, but to give everyone a chance to have cool visuals at their events.
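One of the metrics mentioned above, bass intensity, can be sketched with a plain FFT over a short audio frame: measure what fraction of the spectral energy sits below a low-frequency cutoff. This is a minimal illustration, not the project's actual code; the function name, the 150 Hz cutoff, and the frame size are our own assumptions.

```python
import numpy as np

def bass_intensity(frame, sample_rate=44100, cutoff_hz=150.0):
    """Fraction of spectral energy below cutoff_hz in one audio frame (0.0..1.0)."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2              # power spectrum
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    total = spectrum.sum()
    if total == 0:
        return 0.0                                          # silent frame
    return float(spectrum[freqs < cutoff_hz].sum() / total)

# A 60 Hz sine (pure bass) scores high; a 2 kHz sine scores near zero.
t = np.arange(2048) / 44100
low = bass_intensity(np.sin(2 * np.pi * 60 * t))
high = bass_intensity(np.sin(2 * np.pi * 2000 * t))
```

Driving a visual parameter (e.g. pulse size or brightness) directly from this value per frame is enough to make animations feel synchronized to the beat of the bassline.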

How we built it

We built the application with Python and a web interface to make it easily accessible in the browser.

What we learned

We learned a lot about music and how to analyze it with the help of a computer, how to create animations and visuals, and where the boundaries of artificial creativity are.

What's next for KaleidoSync

The prototype already works well. The next steps are to add more variability to the visuals and to improve the automation so the experience gets even better and easier.
