AirTone

Over the past 24 hours, we built an interactive music generation tool that combines computer vision and gesture-based controls to create a seamless, hands-free musical experience. Our program tracks multiple landmarks on both hands in real time, allowing users to play notes by selecting key points with their thumbs. The movement of the selected key point dynamically adjusts the volume, making the interaction feel intuitive and expressive—similar to playing a virtual violin.
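The selection-and-volume logic described above could be sketched roughly as follows. This is a minimal, hypothetical version of that logic: it assumes normalized (x, y) landmark coordinates like those produced by hand-tracking libraries such as MediaPipe (where landmark 4 is the thumb tip), and the function names, threshold, and gain values are illustrative, not the project's actual code.

```python
import math

THUMB_TIP = 4  # conventional hand-landmark index for the thumb tip (MediaPipe)

def pick_landmark(landmarks, pinch_threshold=0.05):
    """Return the index of the landmark closest to the thumb tip,
    or None if nothing is within the pinch threshold.
    Each landmark is an (x, y) pair in normalized [0, 1] coordinates."""
    tx, ty = landmarks[THUMB_TIP]
    best_idx, best_dist = None, pinch_threshold
    for i, (x, y) in enumerate(landmarks):
        if i == THUMB_TIP:
            continue
        d = math.hypot(x - tx, y - ty)
        if d < best_dist:
            best_idx, best_dist = i, d
    return best_idx

def movement_to_volume(prev_pos, cur_pos, gain=4.0):
    """Map frame-to-frame movement of the selected landmark
    to a volume level clamped to [0, 1]."""
    speed = math.hypot(cur_pos[0] - prev_pos[0], cur_pos[1] - prev_pos[1])
    return max(0.0, min(1.0, speed * gain))
```

Under this sketch, a "pinch" is simply the thumb tip coming within a small normalized distance of another landmark, and faster movement of the selected point plays louder, which matches the violin-like feel described above.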
To enhance the user experience, we developed a web-based interface that connects to the webcam and provides real-time visual feedback. Users can see themselves interacting with the system and experiment with different notes. The interface also features responsive animations that highlight the selected landmark on each hand and the note being played.
With both hands, users can even create chords, opening up the potential for richer and more complex compositions. Our goal was to make music creation more accessible and immersive by leveraging computer vision in a way that feels natural and engaging.
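The two-hand chord idea can be illustrated with a small sketch. The note mapping and function below are hypothetical (the project's real pitch assignments aren't given here); it simply assumes each selectable fingertip landmark is bound to a pitch and that each hand contributes at most one selected note per frame.

```python
# Hypothetical mapping from fingertip landmark indices to pitches.
NOTE_FOR_LANDMARK = {8: "C4", 12: "E4", 16: "G4", 20: "B4"}

def chord_from_hands(selected_per_hand):
    """Combine the selections from each hand into one sounding chord.

    selected_per_hand: list with one entry per tracked hand, either the
    selected landmark index or None when that hand isn't pinching.
    Returns the sorted, de-duplicated list of notes to play together."""
    return sorted({NOTE_FOR_LANDMARK[i]
                   for i in selected_per_hand
                   if i is not None and i in NOTE_FOR_LANDMARK})
```

With one landmark selected per hand, two notes sound at once; releasing one hand collapses the chord back to a single note.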