We wanted to visualize just rap music

What it does

It's an Android app that captures the device's audio output and processes it in real time to produce an OpenGL visualization of whatever is playing through the speakers.

How we built it

We built it using Android, Java, and OpenGL. We also used TensorFlow to build a machine learning model that determines whether a buffer of audio samples is rap or not (unfortunately, this is not yet fully integrated into the app). The device's audio output is captured asynchronously and then converted into OpenGL calls to produce real-time visualizations.
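The per-frame analysis step can be sketched in plain Java: each captured buffer of 8-bit PCM bytes is turned into a magnitude spectrum that the renderer can map to visuals. This is an illustrative sketch, not the app's actual code; the class and method names are hypothetical, and a real implementation would use an FFT rather than this naive DFT.

```java
// Hypothetical sketch: convert one captured 8-bit PCM frame (as delivered
// by an audio capture callback) into a magnitude spectrum for visualization.
final class FrameAnalyzer {
    /** Naive DFT magnitude spectrum of one captured frame. */
    static double[] magnitudes(byte[] pcm) {
        int n = pcm.length;
        double[] mags = new double[n / 2];
        for (int k = 0; k < n / 2; k++) {
            double re = 0, im = 0;
            for (int t = 0; t < n; t++) {
                double sample = pcm[t] / 128.0;           // normalize to roughly [-1, 1)
                double angle = 2 * Math.PI * k * t / n;
                re += sample * Math.cos(angle);
                im -= sample * Math.sin(angle);
            }
            mags[k] = Math.sqrt(re * re + im * im);       // energy in frequency bin k
        }
        return mags;
    }
}
```

Each bin of the returned array can then drive one element of the visualization (bar height, color intensity, and so on).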

Challenges we ran into

The real-time aspect of the processing posed more barriers than expected: many common visualization algorithms rely on having the full audio clip available to analyze beat or tempo. We did not have that at our disposal and had to find workarounds. Working with OpenGL ES also turned out to be nontrivial, even with prior OpenGL experience.
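One kind of workaround for the missing-full-clip problem is to detect beats online, frame by frame, by comparing the current frame's energy against a short rolling history. The sketch below illustrates that idea under our own assumed names and thresholds; it is not the app's actual detector.

```java
import java.util.ArrayDeque;

// Illustrative streaming onset detector: flags a "beat" when the current
// frame's energy spikes well above the recent average, so no access to the
// full clip is ever needed. Names and threshold values are assumptions.
final class OnsetDetector {
    private final ArrayDeque<Double> history = new ArrayDeque<>();
    private final int windowSize;      // how many recent frames to remember
    private final double threshold;    // spike factor over the rolling average

    OnsetDetector(int windowSize, double threshold) {
        this.windowSize = windowSize;
        this.threshold = threshold;
    }

    /** Feed one frame of normalized samples; returns true on a likely beat. */
    boolean onFrame(double[] samples) {
        double energy = 0;
        for (double s : samples) energy += s * s;
        double avg = 0;
        for (double e : history) avg += e;
        boolean beat = !history.isEmpty()
                && energy > threshold * (avg / history.size());
        history.addLast(energy);
        if (history.size() > windowSize) history.removeFirst();
        return beat;
    }
}
```

Because the detector only ever looks at a small sliding window, it runs in constant memory per frame, which is what a live capture pipeline needs.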

Accomplishments that we're proud of

Successfully visualizes music using OpenGL. Runs as a Live Wallpaper in the background on Android. All processing is done asynchronously, off the UI thread.
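The off-UI-thread design can be sketched with standard Java concurrency primitives: the capture callback only enqueues frames and never blocks, while a dedicated worker thread does the heavy analysis. The real app uses Android's own threading facilities; this queue-based version is a stand-in with assumed names.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative pipeline: the capture side submits frames without blocking
// (dropping frames if the worker falls behind), and a single worker thread
// performs the analysis off the UI/render thread.
final class AudioPipeline {
    private final BlockingQueue<byte[]> frames = new ArrayBlockingQueue<>(8);
    private final AtomicInteger processed = new AtomicInteger();
    private volatile boolean running = true;

    /** Called from the capture callback; never blocks the caller. */
    boolean submit(byte[] frame) {
        return frames.offer(frame); // drop the frame if the queue is full
    }

    Thread start() {
        Thread worker = new Thread(() -> {
            try {
                while (running || !frames.isEmpty()) {
                    byte[] frame = frames.poll(50, TimeUnit.MILLISECONDS);
                    if (frame != null) processed.incrementAndGet(); // analysis goes here
                }
            } catch (InterruptedException ignored) {
            }
        });
        worker.start();
        return worker;
    }

    void stop() { running = false; }
    int processedCount() { return processed.get(); }
}
```

Dropping frames under load is a deliberate choice for a visualizer: a slightly stale picture beats a stuttering capture thread.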

What we learned

Android, Java, and OpenGL

What's next for Rapulazer

Polish it and push it to market
