Inspiration

Waveon was inspired by the idea that music shouldn't just be something you hear, it should be something you see and physically interact with. We were interested in how people naturally move to music, especially through dancing, and wanted to push that further by turning movement into control. Instead of just reacting to sound, users can actually shape it in real time. This led us to explore the intersection of gesture tracking, audio manipulation, and generative visuals.

What it does

Waveon is an interactive system that allows users to control music using hand gestures while visualizing it in real time. Users can isolate individual musical stems (vocals, drums, bass, and others) using pinch gestures and adjust properties such as volume, pitch, speed, and audio filters through motion.

Hand movements map directly to controls:

  • Vertical movement adjusts volume
  • Wrist rotation controls pitch/speed
  • Pinch gestures select individual stems
  • Swipe gestures switch tracks
  • A fist gesture pauses playback
  • Bringing both hands together resumes playback
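
As a rough sketch, a pinch like the one above can be detected from fingertip distance over MediaPipe-style hand landmarks (21 normalized points, with the thumb tip at index 4 and the index fingertip at index 8); the landmark indexing and the 0.05 threshold are illustrative assumptions, not necessarily Waveon's tuned values:

```javascript
// Detect a pinch from MediaPipe-style hand landmarks: 21 points with
// normalized 0..1 coordinates. Index 4 is the thumb tip and index 8
// the index fingertip in MediaPipe's hand model; the threshold below
// is an illustrative assumption.
function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

function isPinch(landmarks, threshold = 0.05) {
  const thumbTip = landmarks[4];
  const indexTip = landmarks[8];
  return distance(thumbTip, indexTip) < threshold;
}
```

The same distance test, applied per hand, lets a pinch double as a stem selector: whichever on-screen stem zone the pinching hand sits over is the one toggled.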

In addition, users can apply real-time audio filters and effects, shaping the sound dynamically as they move. The system also includes real-time crossfading between tracks, creating a smooth and immersive audio-visual experience. On the visual side, a TouchDesigner-powered system responds to movement, with sharp gestures creating glowing, pulsing pixel effects.

How we built it

We built Waveon by combining a web-based hand-tracking system, TouchDesigner for visuals, and a real-time audio engine. The web app processes hand landmarks and gestures from a webcam feed and translates them into control signals for audio playback, stem isolation, and filter manipulation.

These signals are mapped to audio parameters such as volume, pitch, and effect intensity using continuous input values. For example:

V = k_y · y   and   P = k_θ · θ

where y is the vertical hand position, θ is the wrist rotation angle, and k_y, k_θ are scaling gains.
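
A minimal sketch of those linear mappings, with clamping so noisy input can't push a parameter out of range; the gain defaults and clamp limits here are illustrative assumptions rather than our tuned parameters:

```javascript
// Linear gesture-to-parameter mappings with clamping. y is the
// normalized vertical hand position (0 at the top of the frame,
// 1 at the bottom); theta is the wrist rotation angle in radians.
// The default gains and ranges are illustrative assumptions.
const clamp = (v, lo, hi) => Math.min(hi, Math.max(lo, v));

function volumeFromY(y, kY = 1.0) {
  // Invert so raising the hand (smaller y) raises the volume.
  return clamp(kY * (1 - y), 0, 1);
}

function pitchFromTheta(theta, kTheta = 4) {
  // kTheta: semitones per radian of wrist rotation, clamped to one octave.
  return clamp(kTheta * theta, -12, 12);
}
```

Keeping the mappings linear makes the controls predictable: equal hand movements always produce equal parameter changes, which matters when users are mixing by feel.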

We also implemented gesture detection logic for pinches, swipes, and multi-hand interactions, along with a crossfade system to smoothly transition between songs. Filters and effects were integrated into the pipeline so they can be adjusted in real time through motion.
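
One common way to build such a crossfade is with equal-power gain curves, which keep perceived loudness roughly constant through the transition; this is a sketch of that general technique, and the exact curve Waveon uses may differ:

```javascript
// Equal-power crossfade: as t sweeps from 0 (track A only) to
// 1 (track B only), the two gains trace a quarter circle, so
// a^2 + b^2 stays 1 and the perceived loudness stays steady.
function crossfadeGains(t) {
  const x = Math.min(1, Math.max(0, t)); // clamp progress to [0, 1]
  return {
    a: Math.cos(x * Math.PI / 2), // outgoing track's gain
    b: Math.sin(x * Math.PI / 2), // incoming track's gain
  };
}
```

A plain linear fade (a = 1 - t, b = t) dips in loudness at the midpoint because power sums to 0.5 rather than 1; the trigonometric curves avoid that dip.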

Challenges we ran into

One of the biggest challenges was making gesture controls feel stable and intentional. Hand tracking can be noisy, and small unintended movements would sometimes cause sudden changes in sound or filters. We had to refine gesture thresholds and apply smoothing techniques to improve accuracy.
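
The smoothing-and-threshold idea can be sketched as an exponential moving average plus hysteresis (separate engage and release thresholds, so a gesture can't flicker on and off at the boundary); the alpha and threshold values below are illustrative assumptions, not our tuned settings:

```javascript
// Stabilize a noisy gesture signal (e.g. thumb-index distance).
// An exponential moving average tames per-frame jitter, and
// hysteresis keeps the gesture state from flickering: it engages
// below onThresh but only releases above the larger offThresh.
// alpha and both thresholds are illustrative assumptions.
class SmoothedGesture {
  constructor(alpha = 0.3, onThresh = 0.04, offThresh = 0.07) {
    this.alpha = alpha;
    this.onThresh = onThresh;   // engage when smoothed value drops below
    this.offThresh = offThresh; // release only once it rises above
    this.value = null;
    this.active = false;
  }

  update(raw) {
    this.value = this.value === null
      ? raw
      : this.alpha * raw + (1 - this.alpha) * this.value;
    if (!this.active && this.value < this.onThresh) this.active = true;
    else if (this.active && this.value > this.offThresh) this.active = false;
    return this.active;
  }
}
```

The gap between the two thresholds is what makes a held pinch survive a few frames of tracking noise instead of toggling the stem on every jitter.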

Another challenge was designing gestures that felt natural and didn't conflict with one another, especially given the added complexity of controlling filters alongside core audio parameters. Synchronizing real-time visuals, gesture input, and audio processing without noticeable delay was also a major technical hurdle.

Accomplishments that we're proud of

We're proud to have created a fully interactive system where users can control individual parts of a song using only their hands, including applying real-time filters and effects. The seamless integration between visuals and sound makes the experience feel immersive and responsive.

The gesture-based stem isolation, live filter control, and real-time crossfade system are standout features that enable intuitive creative mixing. Overall, we built a system that turns music into something you can physically manipulate and visually experience.

What we learned

We learned a lot about real-time interaction design, especially how important it is to balance responsiveness with stability. Building a gesture-controlled system taught us how to handle noisy input data and map it into smooth, meaningful outputs.

We also gained experience integrating audio effects and filters into an interactive pipeline, and learned how to connect visual systems with audio processing. Most importantly, we learned how to design interactions that feel natural and expressive rather than overly technical.

What's next for Waveon

Now that we've implemented core controls, stem isolation, and real-time filters, the next step is to refine and expand the experience. We want to improve gesture accuracy and responsiveness so controls feel smoother and more precise, even during fast movements.

We're also planning to introduce custom gesture mapping, allowing users to assign their own motions to different controls or effects. Another direction is multiplayer interaction, in which multiple users can control different stems or effects simultaneously, turning Waveon into a collaborative performance tool.

On the visual side, we want to move beyond simple reactive effects into more advanced audio-driven environments that respond to specific frequencies, stems, and filters. Ultimately, we aim to develop Waveon into a polished platform for live performance, creative exploration, and interactive music experiences.
