Inspiration
Fusion was created to transform how technology supports music learning and recognition. Existing tools often lack a balance between powerful AI and user-friendly design. I wanted to build something that truly understands a musician’s performance and provides real-time, precise feedback to help users improve faster and more confidently. The goal is to make music education more accessible, engaging, and effective for everyone.
What it does
Fusion offers an interactive music training experience where users can practice rhythm and piano skills with instant feedback. It listens to your input, compares it to expected notes and rhythms, and provides personalized coaching. Whether you upload your own songs or choose from the library, Fusion breaks down the music into detailed data and guides you through practice sessions that adapt to your progress.
How I built it
Fusion combines advanced music recognition algorithms with a clean, intuitive user interface. I process MusicXML files to extract note, timing, and tempo information, converting it into formats the app can use in real time. The core AI, powered by Google Gemini, analyzes user input live, calculating accuracy and timing to deliver personalized feedback. This required integrating multiple technologies, from music data parsing to AI-powered evaluation and coaching.
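The MusicXML step described above can be sketched in a few lines. This is a minimal illustration using Python's standard library, not Fusion's actual parser; the function name, tuple format, and sample score are all assumptions for the example.

```python
# Hedged sketch: extract note pitches and durations from a MusicXML string.
# Function and field names here are illustrative, not Fusion's real code.
import xml.etree.ElementTree as ET

def parse_notes(musicxml: str):
    """Return a list of (pitch, duration) tuples from a MusicXML document."""
    root = ET.fromstring(musicxml)
    notes = []
    for note in root.iter("note"):
        pitch_el = note.find("pitch")
        dur_el = note.find("duration")
        if pitch_el is None or dur_el is None:
            continue  # skip rests and malformed <note> elements
        step = pitch_el.findtext("step")
        octave = pitch_el.findtext("octave")
        notes.append((f"{step}{octave}", int(dur_el.text)))
    return notes

# Minimal one-measure example: C4 and E4, one duration unit each.
SAMPLE = """<score-partwise><part id="P1"><measure number="1">
  <note><pitch><step>C</step><octave>4</octave></pitch><duration>1</duration></note>
  <note><pitch><step>E</step><octave>4</octave></pitch><duration>1</duration></note>
</measure></part></score-partwise>"""

print(parse_notes(SAMPLE))  # → [('C4', 1), ('E4', 1)]
```

A real implementation would also read `<sound tempo>` and `<divisions>` to convert duration units into seconds for live comparison.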
Challenges I ran into
One major challenge was achieving accurate real-time analysis of user input, especially given the subtle variations in timing and dynamics in musical performance. Parsing diverse MusicXML files consistently and converting them into usable data also proved complex. Ensuring the feedback felt helpful and not just overwhelming took multiple iterations of fine-tuning both the AI metrics and the user experience design.
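One way to make timing feedback tolerant of those subtle variations is to score each note by how far its onset deviates from the expected time, within a forgiveness window. The sketch below is a simplified stand-in for this idea; the function name, linear scoring curve, and 100 ms tolerance are assumptions, not Fusion's actual metric.

```python
# Hedged sketch of a timing-accuracy metric: score each played note by its
# deviation from the expected onset. Thresholds are illustrative assumptions.

def timing_score(expected, played, tolerance=0.1):
    """Mean per-note score in [0, 1]; 1.0 means perfectly on time.

    expected, played: onset times in seconds, same length and order.
    tolerance: deviation (in seconds) at which a note scores 0.
    """
    if not expected:
        return 0.0
    scores = []
    for exp, act in zip(expected, played):
        deviation = abs(act - exp)
        # Linear falloff: on-time -> 1.0, off by >= tolerance -> 0.0
        scores.append(max(0.0, 1.0 - deviation / tolerance))
    return sum(scores) / len(expected)

# Three notes: first and third on time, second 50 ms late.
print(round(timing_score([0.0, 0.5, 1.0], [0.0, 0.55, 1.0]), 3))  # → 0.833
```

A score like this can then be passed to the AI layer as one input among several (pitch accuracy, dynamics) when generating coaching feedback.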
Accomplishments I'm proud of
I’m proud of creating an app that seamlessly blends complex AI-driven analysis with an approachable interface. Fusion’s ability to provide live, detailed feedback that users find actionable and motivating stands out. I successfully built a flexible system that supports custom songs alongside built-in content, opening the door for a wide range of practice scenarios.
What I learned
Building Fusion taught me the importance of balancing technical precision with user experience. Accurate music recognition is powerful only if the feedback is clear and encouraging. I also gained a lot of insight into how musicians interact with technology, highlighting the need for adaptable coaching that respects different skill levels and learning styles.
What's next for Fusion
Moving forward, I want to expand Fusion’s capabilities by adding support for more instruments and more advanced music theory exercises. I want to explore ways to enhance the AI feedback with even more personalized coaching and progress tracking.
On the gamification side, I already have a points system that syncs with the progress bar, and I want to build on that by letting users unlock new songs or helpful hints as they practice. This will make the learning experience more engaging and rewarding, encouraging users to keep improving while having fun. At the end of the day, I want Fusion to be a comprehensive platform that supports every musician's journey from beginner to expert, blending AI with motivating gameplay.
Built With
- css
- daisyui
- framer
- gemini
- motion
- opensheetmusicdisplay
- python
- react
- tailwind
- tone.js
- typescript
- vite