I came up with this idea while casually sipping a cup of coffee and relaxing in the HackNotts Discord, wishing that I could somehow make listening to music in a Discord voice channel feel more alive and fun. That's how AlgoRhythm was born: a music web app that vibes and reacts to your music. I also wanted to control the music player in a cool, futuristic way, so I wrote gesture detection algorithms on top of the TensorFlow.js machine learning model HandPose, letting you adjust the volume or pause the music with a few simple hand gestures.
What it does
Beyond the audio-reactive visuals, users can adjust the volume or pause the music with hand gestures, detected by algorithms I wrote that use fingertip positional data from the TensorFlow.js HandPose model. This feature consumes a lot of memory and GPU time (which noticeably reduces the web app's frame rate), so it is turned off by default; users can easily enable gesture detection if they choose to.
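For illustration, here is a minimal sketch of how fingertip landmarks from the TensorFlow.js HandPose model could drive the player. This is not the project's actual algorithm: the pinch threshold, the volume mapping, and the video/audio element handles are assumptions.

```typescript
// Hypothetical sketch: controlling playback from HandPose fingertip landmarks.
// Thresholds and the volume mapping are illustrative assumptions, not the
// project's real gesture algorithms.
import '@tensorflow/tfjs';
import * as handpose from '@tensorflow-models/handpose';

const PINCH_THRESHOLD = 40; // assumed pixel distance between thumb and index fingertips

function distance(a: number[], b: number[]): number {
  return Math.hypot(a[0] - b[0], a[1] - b[1]);
}

async function watchGestures(video: HTMLVideoElement, audio: HTMLAudioElement) {
  const model = await handpose.load();

  const tick = async () => {
    const predictions = await model.estimateHands(video);
    if (predictions.length > 0) {
      const landmarks = predictions[0].landmarks; // 21 [x, y, z] keypoints
      const thumbTip = landmarks[4];
      const indexTip = landmarks[8];
      const spread = distance(thumbTip, indexTip);

      if (spread < PINCH_THRESHOLD) {
        // A "pinch" (thumb and index fingertips touching) pauses the music.
        audio.pause();
      } else {
        // Otherwise, map how far apart the fingertips are to the volume.
        audio.volume = Math.min(1, spread / 300);
      }
    }
    requestAnimationFrame(tick);
  };
  tick();
}
```

Running the model on every animation frame is what makes the feature GPU-hungry, which is why it is opt-in.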
How I built it
Challenges I ran into
I ran into lots of challenges building an interface where users can drag and drop their music files and then processing that audio with FFT analysis. With some creative use of external libraries, I was able to solve this problem; an outline of the general approach is sketched below. It was also quite challenging to finish the entire project solo, but I pulled through in the end with dedication and persistence.
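As a rough illustration of that pipeline (the actual project leaned on external libraries), here is a hedged sketch using the browser's native drag-and-drop events and the Web Audio API's AnalyserNode; the element id and FFT size are assumptions.

```typescript
// Hypothetical sketch: drag-and-drop a music file, decode it, and expose its
// FFT spectrum for the visualizer. Element id and fftSize are assumptions.
const dropZone = document.getElementById('drop-zone') as HTMLElement;
const audioCtx = new AudioContext();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 2048; // 1024 frequency bins

dropZone.addEventListener('dragover', (e) => e.preventDefault());
dropZone.addEventListener('drop', async (e: DragEvent) => {
  e.preventDefault();
  const file = e.dataTransfer?.files[0];
  if (!file) return;

  // Decode the dropped music file and route it through the analyser to the speakers.
  const buffer = await audioCtx.decodeAudioData(await file.arrayBuffer());
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(analyser);
  analyser.connect(audioCtx.destination);
  source.start();

  // Pull the current frequency spectrum each animation frame to drive the visuals.
  const bins = new Uint8Array(analyser.frequencyBinCount);
  const render = () => {
    analyser.getByteFrequencyData(bins);
    // ...draw bars / shapes based on `bins` here...
    requestAnimationFrame(render);
  };
  render();
});
```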
Accomplishments that I'm proud of
I wanted to build something awesome and fun for HackNotts, and I think I succeeded on both counts. I also learned a lot about the physical
What I learned
What's next for AlgoRhythm
I'd love to optimize the program further so performance holds up when gesture classification is toggled on. I would also like to take the user interface design to the next level and build a system where other people can join a "music room" and vibe to their music together.