Inspiration
For Feel The Beat, we were inspired by the idea of helping deaf people experience music in a way that most of us take for granted. We wanted to create something that would allow them to feel the rhythm, emotions, and energy of music through other senses. Our goal was to make music accessible in a way that goes beyond just sound, bringing the feeling of music to life.
What it does
Feel The Beat is an app that transforms music into a visual, interactive experience for people with hearing disabilities. It analyzes an .mp3 file to extract the song’s rhythm and emotional qualities, then generates animations and visual effects that move in time with the music, letting users feel the song’s energy through sight. The app also predicts the emotional mood of the song and represents it with colors, so users can both feel the beat and understand the song’s mood.
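The emotion-to-color step can be sketched as a simple lookup scaled by intensity. The label set and RGB values below are illustrative assumptions, not the app’s actual model output:

```python
# Hypothetical mapping from a predicted emotion label to a display color.
EMOTION_COLORS = {
    "happy":     (255, 200, 40),   # warm yellow
    "sad":       (60, 90, 200),    # cool blue
    "energetic": (230, 50, 50),    # vivid red
    "calm":      (80, 200, 160),   # soft teal
}

def emotion_to_color(emotion: str, intensity: float) -> tuple:
    """Scale a base color by the predicted intensity (clamped to 0.0-1.0)."""
    r, g, b = EMOTION_COLORS.get(emotion, (128, 128, 128))  # grey fallback
    scale = max(0.0, min(1.0, intensity))
    return tuple(round(c * scale) for c in (r, g, b))
```

In a real pipeline the label and intensity would come from the emotion model; here they are plain arguments so the mapping itself is easy to test.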
How we built it
We built the app using React Native for the front-end to create a smooth user experience. The backend is powered by Django, which handles the music analysis, emotion detection, and color prediction. For the music analysis, we researched algorithms that break a song down into its rhythmic and emotional components. We also trained a TensorFlow model to predict the emotion of the song and match it with colors that fit the mood.
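The rhythm-extraction step can be sketched as tempo estimation from onset timestamps. This assumes the onsets have already been detected from the decoded .mp3 by an audio library; `estimate_bpm` is a hypothetical helper, not the app’s actual code:

```python
from statistics import median

def estimate_bpm(onset_times: list) -> float:
    """Estimate tempo (beats per minute) from onset timestamps in seconds.

    Uses the median inter-onset interval, which is robust to a few
    missed or spurious onsets.
    """
    if len(onset_times) < 2:
        raise ValueError("need at least two onsets to estimate tempo")
    intervals = [b - a for a, b in zip(onset_times, onset_times[1:])]
    return 60.0 / median(intervals)
```

For example, onsets spaced 0.5 seconds apart yield an estimate of 120 BPM.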
Challenges we ran into
The hackathon came with plenty of challenges, especially given the time constraints and technical complexity. One of the biggest obstacles was syncing the music analysis with the animation in React Native: the visuals had to stay perfectly in time with the music. Another challenge was getting the emotion predictions to map to colors accurately. We also had to connect the front-end to the back-end in a way that didn’t hurt the app’s performance.
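One way to keep visuals locked to the music, sketched below with assumed beat timestamps, is to drive the animation from the playback clock rather than from frame timing, so dropped frames never cause drift. The decay constant is an illustrative choice, not a tuned value:

```python
import math

def pulse_intensity(position: float, beat_times: list, decay: float = 6.0) -> float:
    """Return an animation intensity in [0, 1] for the current playback position.

    Each beat triggers a pulse of 1.0 that decays exponentially until
    the next beat. Because intensity is a pure function of the audio
    clock, visuals resynchronize automatically after any frame hiccup.
    """
    past = [t for t in beat_times if t <= position]
    if not past:
        return 0.0  # playback has not reached the first beat yet
    return math.exp(-decay * (position - past[-1]))
```

Each render frame would read the current playback position and call this function, instead of scheduling timers that accumulate error.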
Accomplishments that we’re proud of
We’re really proud of how well we worked together as a team. Despite facing tons of challenges, we kept pushing forward and managed to build something that was both functional and meaningful. The fact that we created an app that can help deaf people feel music in a captivating way is a huge accomplishment for us. We’re also proud of how we managed to make the most of our time during the hackathon and put together a working product despite the obstacles.
What we learned
We learned so much while building Feel The Beat. On the technical side, we learned how to analyze music and break down audio files in a way that makes sense for our app. We also gained a lot of experience with React Native animations, Django, and TensorFlow. But beyond the code, we learned that good teamwork is key to overcoming tough challenges, and that keeping a positive attitude helps when things get stressful.
What’s next for Feel The Beat
For the future of Feel The Beat, we want to make the app even better by adding new features. One idea is auto-lyrics, so users can follow along with the song as it plays. We also plan to add a feature that lets users adjust the frequencies of the music, so people with partial hearing loss can boost the higher or lower frequencies they hear best. Plus, we want to make the animations even smoother and more dynamic, so the overall experience feels even better. One more idea is an AI-generated picture in the center that would represent the lyrics in a creative way.
Built With
- django
- react-native
- tensorflow