Inspiration
Commercially, numerous music visualizers exist, including desktop applications such as Windows Media Player and the iTunes visualizer. However, many of these visualizers trade off displaying meaningful information from the audio signal against being aesthetically pleasing. For example, the iTunes visualizer operates on the waveform of the audio file and can produce misleading, uninformative visuals: spikes in amplitude make the scene more elaborate and fill more of the screen, while differences in frequency are depicted by the size of the charged particles. How the audio input drives the visualization is unclear, and the visualizer's choices of parameters such as color tend to be arbitrary.
Furthermore, desktop visualizers such as iTunes and Windows Media Player are confined to the computer. For commercial applications such as enhancing concert experiences, a mobile platform makes more sense because it is accessible to a far wider audience. Most mobile devices carry sensors for location detection as well as rudimentary motion analysis such as pedometer functionality; both iPhone and Android devices provide accelerometer and gyroscope hardware along with software-based sensors such as a step counter. Combining these sensors with the GPU's computing power for augmented reality rendering, a far more immersive mobile music visualizer can be created to enable a synesthetic experience, with applications such as helping the deaf experience music, motivating exercise to music, and enhancing live performances.
What it does
This iOS app visualizes music as rings of spheres surrounding the user. A user first selects a song from their iTunes library to visualize. The spheres pulse according to frequency magnitude and change color according to acceleration; the chamfer radius changes with gravity, and the overall shape of the rings changes when a new activity (i.e., walking, running, or stationary) is detected. A user can create custom visualizations by specifying parameters such as the number of rings and their radius. A user can also choose to gamify the visualization, in which case the goal is to "hit" all of the surrounding sphere nodes: when a node is hit, a particle collision is rendered and the node falls to the ground and disappears.
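The ring-of-spheres layout can be sketched as evenly spaced points on a horizontal circle centered on the user. This is an illustrative sketch only: the type, function, and parameter names below are assumptions, not the app's actual API.

```swift
import Foundation

// Hypothetical sketch of the ring layout: `count` sphere positions evenly
// spaced on a horizontal circle of radius `r`, centered on the user.
struct Position { var x, y, z: Double }

func ringPositions(count: Int, radius r: Double,
                   center: Position, height y: Double) -> [Position] {
    (0..<count).map { i in
        let angle = 2.0 * Double.pi * Double(i) / Double(count)
        return Position(x: center.x + r * cos(angle),
                        y: y,
                        z: center.z + r * sin(angle))
    }
}
```

In the real app each position would become an `SCNNode` added to the AR scene; stacking several such circles at different heights yields the rings described above.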
How I built it
The app was built for iOS in Swift. I primarily used Apple's ARKit and SceneKit frameworks for the visualizations, and Apple's Accelerate framework to compute the Fast Fourier Transform (FFT) of the audio.
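To show what the FFT step produces, here is a naive discrete Fourier transform in plain Swift that yields the per-bin magnitudes the visuals respond to. This is illustration only: the app itself uses Accelerate's vDSP routines, which compute the same spectrum far more efficiently.

```swift
import Foundation

// Naive O(n^2) DFT magnitude spectrum, for illustration only.
// Returns magnitudes for the first n/2 frequency bins of `samples`.
func magnitudeSpectrum(_ samples: [Double]) -> [Double] {
    let n = samples.count
    return (0..<n / 2).map { k in
        var re = 0.0, im = 0.0
        for (t, s) in samples.enumerated() {
            let angle = -2.0 * Double.pi * Double(k) * Double(t) / Double(n)
            re += s * cos(angle)
            im += s * sin(angle)
        }
        return sqrt(re * re + im * im)
    }
}
```

Feeding a pure sine wave through this function produces a single dominant bin at the sine's frequency; in the app, each ring of spheres is driven by a band of these bin magnitudes.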
Challenges I ran into
As ARKit is still relatively new, there was some unexpected behavior in how nodes were positioned relative to the user: positions would sometimes drift, and these issues proved difficult to resolve. There were also trade-offs to be made in rendering as many visuals as possible without causing considerable lag.
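One general way to keep positions stable is to place nodes at fixed world coordinates computed once from the camera's pose, rather than re-parenting them to the camera. The helper below sketches only the positioning math in plain Swift; the struct, function names, and sign conventions are assumptions for illustration, not code from the app.

```swift
import Foundation

// Illustrative helper: given the camera's world position and yaw (radians),
// compute a world-space point `distance` meters in front of the user.
// Assumes ARKit's convention that the camera looks down -Z and that yaw
// rotates about the +Y (up) axis.
struct Vec3 { var x, y, z: Double }

func pointInFront(ofCameraAt cam: Vec3, yaw: Double, distance: Double) -> Vec3 {
    Vec3(x: cam.x - distance * sin(yaw),
         y: cam.y,
         z: cam.z - distance * cos(yaw))
}
```

Anchoring content to world coordinates like this lets ARKit's tracking corrections refine the node's apparent position instead of dragging it around with every camera update.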
What I learned
This was one of the first apps I built with ARKit, so I learned a great deal about the framework.
What's next for MotionAR
I am working to improve the gamification aspect of the app as well as build more aesthetically pleasing visualizations.
Built With
- arkit
- swift