Inspiration
Quantum computing is one of the most revolutionary technologies of our time — but it's locked behind complex mathematics and abstract principles. At the same time, music is universal and emotionally resonant. We were inspired to bridge this gap: What if we could make quantum computing feel intuitive, expressive, and fun through music? Thus, NeuroQuantum Composer was born: a tool that transforms everyday user input into quantum-inspired music, allowing anyone to experience quantum behaviors through sound.
What it does
NeuroQuantum Composer is a browser-based interactive music tool that turns mouse and keyboard movements into live musical compositions. Mouse movement affects melody and harmony. Typing patterns control rhythm and dynamics. Behind the scenes, it uses quantum-inspired simulation to generate musical patterns that evolve based on principles like superposition and entanglement. It gives users an intuitive, sensory experience of how quantum systems behave, all through music.
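The mapping described above (continuous input controlling a probabilistic musical outcome) can be sketched in plain JavaScript. This is a minimal illustration, not the project's actual code: the function names, the two-note scale, and the rotation-based amplitude mapping are all assumptions made for the example.

```javascript
// Illustrative sketch (not the project's real API): mouse X in [0, 1]
// controls a two-state "superposition" of notes; a keypress "measures"
// the state, collapsing it to one concrete note.

const SCALE = ["C4", "E4", "G4", "B4"];

// Amplitudes for a two-state superposition, rotated by x in [0, 1].
function amplitudes(x) {
  const theta = (Math.PI / 2) * x; // x = 0 gives pure |0>, x = 1 gives pure |1>
  return [Math.cos(theta), Math.sin(theta)];
}

// Outcome probabilities are the squared amplitudes (Born rule).
function probabilities(x) {
  const [a0, a1] = amplitudes(x);
  return [a0 * a0, a1 * a1];
}

// "Measurement": collapse the superposition to a single note.
// rand is injectable so the behavior can be made deterministic.
function measureNote(x, rand = Math.random) {
  const [p0] = probabilities(x);
  const pair = [SCALE[0], SCALE[2]]; // e.g. choose between C4 and G4
  return rand() < p0 ? pair[0] : pair[1];
}

console.log(measureNote(0));            // always "C4": p0 is exactly 1 at x = 0
console.log(measureNote(1, () => 0.5)); // "G4": p0 is ~0 at x = 1
```

In the real app the chosen note would then be handed to a synth (e.g. a Tone.js `Synth`) rather than logged; injecting the random source keeps the mapping testable.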
How we built it
- Tone.js for real-time audio synthesis and music generation.
- Three.js for interactive, dynamic 3D visual feedback.
- Magenta.js for AI-based music style transfer and enhancement.
- A custom-built JavaScript quantum simulation engine that simulates probabilistic behavior and entangled state transitions based on user inputs.

All components run entirely in-browser, with no installation or backend required.
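The "entangled state transitions" idea can be illustrated with a tiny sketch: two voices share a single random outcome, so their notes are perfectly correlated, loosely modeling a Bell pair. The function names and note mappings here are assumptions for illustration, not the engine's actual implementation.

```javascript
// Illustrative sketch of entanglement between two musical voices.
// Modeled loosely on the Bell state (|00> + |11>) / sqrt(2): the
// outcomes 00 and 11 each occur with probability 1/2, and measuring
// one voice fixes the other.

function measureBellPair(rand = Math.random) {
  const outcome = rand() < 0.5 ? 0 : 1;
  return [outcome, outcome]; // both voices collapse to the same bit
}

// Map the correlated bits onto two instrument voices: the melody and
// bass always land on matching chord roots, never a mismatched pair.
function entangledNotes(rand = Math.random) {
  const [a, b] = measureBellPair(rand);
  const melody = ["C4", "A4"][a];
  const bass   = ["C2", "A2"][b];
  return { melody, bass }; // always (C4, C2) or (A4, A2)
}
```

Correlating voices this way gives listeners an audible analogue of entanglement: the two parts are individually random but never disagree.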
Challenges we ran into
- Designing a quantum-inspired system that feels authentic yet understandable.
- Creating real-time synchronization between input, sound, and visuals.
- Ensuring low-latency performance for seamless interaction on different devices.
- Translating abstract physics concepts into creative, musical outputs.
- Balancing randomness and control so users feel empowered but still surprised.
Accomplishments that we're proud of
- Built an entirely in-browser system with real-time audio and visual feedback.
- Developed a working quantum-inspired simulation that maps physical actions to music.
- Created a fun and intuitive interface that requires no prior knowledge of quantum computing or music theory.
- Made quantum concepts accessible to anyone through sound and interaction.
What we learned
- How to simulate aspects of quantum computing in a non-technical, interactive context.
- The power of multisensory feedback (visual + audio + movement) in making abstract ideas tangible.
- How to integrate multiple libraries (Tone.js, Magenta.js, Three.js) into a cohesive, real-time experience.
- That bridging science and art can unlock new ways to teach, learn, and create.
What's next for NeuroQuantum-Composer
- Mobile optimization for touch-based input.
- Multiple modes to simulate different quantum principles (e.g., decoherence, measurement collapse).
- Exportable compositions so users can save their musical creations.
- Educational overlays that explain the underlying quantum logic as users interact.
- Partnering with schools and museums to use it as a learning tool for STEM + arts education.
Built With
- ai
- api
- css3
- framework
- html5
- javascript
- magentajs
- three.js
- tone.js