Inspiration
Traditional DJing relies heavily on expensive, bulky and complicated hardware, which can limit accessibility and creativity. We wanted to break that barrier by converting physical controls into a fully software-driven experience. Our goal was to make DJing more intuitive, immersive, and accessible, where anyone can control music and visuals using just their hands, without needing specialized equipment.
What it does
Noise is a motion-controlled DJ system that replaces traditional hardware with gesture-based controls. Users can change audio parameters like frequency and resonance in real time using hand movements, while also generating synchronized, audio-reactive visuals. This creates a fully immersive rave experience powered entirely by motion.
How we built it
We used TouchDesigner to handle gesture recognition, the user interface, and visuals, and Ableton Live for audio processing. Using TDAbleton, we connected the two platforms to enable real-time communication: hand gestures are tracked and mapped to audio parameters, letting users control and fully immerse themselves in the music, while an audio-detection system drives synchronized visuals.
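Conceptually, the gesture-to-parameter mapping works like the Python sketch below. The function names and value ranges here are illustrative assumptions, not the project's actual TouchDesigner/TDAbleton code:

```python
# Hypothetical sketch: mapping normalized hand positions (0.0-1.0, as a
# tracker might report them) to audio parameters. Not the project's code.

def hand_to_cutoff(hand_y, lo_hz=20.0, hi_hz=20000.0):
    """Map hand height to a filter cutoff frequency in Hz.

    Uses an exponential curve, since we perceive frequency logarithmically:
    equal hand movements then produce roughly equal perceived changes.
    """
    hand_y = min(max(hand_y, 0.0), 1.0)  # clamp to the tracked range
    return lo_hz * (hi_hz / lo_hz) ** hand_y

def hand_to_resonance(hand_x):
    """Map horizontal hand position linearly to resonance (0-100%)."""
    return min(max(hand_x, 0.0), 1.0) * 100.0

# Lowest hand position gives the lowest cutoff; highest gives the full range.
print(hand_to_cutoff(0.0))  # 20.0
print(hand_to_cutoff(1.0))  # 20000.0
```

In practice these values would be written to Ableton Live parameters through TDAbleton each frame, so the curve shape directly determines how the controls feel.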
Challenges we ran into
One of the biggest challenges we faced was minimizing latency in signal response, as even a slight lag between gesture and output can disrupt the live experience. Ensuring real-time responsiveness required careful optimization across systems. Initially, we ran TouchDesigner on one device and Ableton on another, and connecting the two programs to each other was a real challenge. We ultimately minimized latency and kept both in sync by consolidating the complete workflow onto a single laptop.
Accomplishments that we're proud of
We successfully created a fully functional gesture-controlled audio-visual experience that replaces traditional DJ hardware with intuitive motion-based interaction. We are especially proud of achieving synchronized visuals that react dynamically to the music, enhancing immersion and making the performance feel cohesive and engaging.
What we learned
Through this project, we learned how to handle real-time system challenges such as latency and responsiveness, and how critical they are for live interactive experiences. We also gained experience in integrating multiple platforms, specifically connecting TouchDesigner with Ableton Live, and developed a deeper understanding of how to map physical gestures into meaningful digital controls in a way that feels natural to users.
What's next for Noise
Next, we plan to expand Noise by introducing more advanced interaction methods, such as voice control and a wider range of gestures for actions like play/pause, transitions, and effect switching. We also aim to improve gesture accuracy and further reduce latency to make the system more reliable for live performance. In addition, we're exploring a mobile app that would let users customize controls and settings remotely, as well as enhanced visuals for an even more immersive and scalable experience on larger stages and installations.
Built With
- ableton
- python
- touchdesigner