Inspiration
Synesthesia is when hearing music evokes seeing shapes, often behind closed eyelids. I've been lucky enough to experience this at times throughout my life, and so have many others. I want to give a glimpse, however limited, of what this experience is like!
What it does
Synesthetic visualizes the sound patterns of any song playing in the background.
How we built it
I used a few tricks from Lens Studio templates to put together this neat filter.
I used the audio analyzer script from a Lens Studio guide, which decomposes audio input into distinct frequency bands. I applied these to a 3D sphere mesh made in Blender, driving a ripple effect across the sphere whose amplitude corresponded to the band intensities, with the help of a script found in a code node example. I also included some balls from the audio decomposition example and reworked them to bounce around my central sphere.
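As a rough illustration of the idea (this is a Python sketch of the concept, not actual Lens Studio code, and the function names are mine): an FFT splits an audio frame into frequency bands, and the normalized band energies can then drive the amplitude of a vertex-displacement ripple on a mesh.

```python
import numpy as np

def band_energies(frame, num_bands=8):
    """Split one audio frame into num_bands frequency bands and
    return the average spectral magnitude of each band."""
    spectrum = np.abs(np.fft.rfft(frame))
    # Divide the spectrum into roughly equal-width bands.
    bands = np.array_split(spectrum, num_bands)
    return np.array([b.mean() for b in bands])

def ripple_amplitudes(energies, max_amplitude=1.0):
    """Normalize band energies to [0, max_amplitude] so they can
    drive a ripple (displacement) effect on a mesh."""
    peak = energies.max()
    if peak == 0:
        return np.zeros_like(energies)
    return max_amplitude * energies / peak

# Example: one frame of a 440 Hz tone sampled at 44.1 kHz.
sr = 44100
t = np.arange(1024) / sr
frame = np.sin(2 * np.pi * 440 * t)
amps = ripple_amplitudes(band_energies(frame))
```

Each frame of microphone audio would produce a fresh set of amplitudes, which the filter maps onto the sphere's surface every update.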
Challenges we ran into
There are lots of holes in the Lens Studio documentation, especially around the code node features, and the code node syntax is a combination of C++ and something else, which made it tough to develop with. Moreover, a lack of proper hardware (my only MacBook is from 2015) made it difficult to run Lens Studio properly.
Accomplishments that we're proud of
A working synesthesia visualizer!
What we learned
I learned how to use Lens Studio's code nodes! They're fun to work with, especially when the integration with Snap filters is smooth and just a click away. I was also surprised at how well my phone can simulate these objects.
What's next for Synesthetic
More tricks! I want to make the objects animate more tightly in sync with the music. I might also make a filter that solely consists of vibrating, pulsating spheres scattered around the visual field.
Built With
- lensstudio
- snapar