Inspiration
After seeing that it was possible to create a VR application fairly simply on the web, we wanted to create something really cool with the Rodin framework, since it allows for cross-platform web-based VR development. The concept we came up with was audio visualization: a three-dimensional circular audio spectrum surrounding the viewer. The backdrop for this experience is one of Rodin's template rooms, which provides a somewhat crude simulation of what this might be like in augmented - rather than virtual - reality.
What it does
The core concept is fairly simple. It builds on one of Rodin's templates to load an audio file and run a fast Fourier transform (FFT) on successive chunks of the audio, yielding the frequency spectrum from moment to moment. A series of rectangles is then updated each frame to match the spectrum, with the end effect being a three-dimensional audio spectrum that surrounds the camera.
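The per-frame update described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the function name `barLayout`, the byte-valued magnitudes (as a Web Audio `AnalyserNode` would produce), and the scaling constants are all assumptions for the example.

```javascript
// Hypothetical sketch: map FFT magnitudes (0-255, as from an
// AnalyserNode's getByteFrequencyData) onto a ring of bars that
// surrounds the camera. Each frame, the bar heights are refreshed
// from the latest spectrum while the positions stay fixed.
function barLayout(magnitudes, radius, maxHeight) {
  const n = magnitudes.length;
  return magnitudes.map((m, i) => {
    const angle = (2 * Math.PI * i) / n;  // spread bars evenly around a circle
    return {
      x: radius * Math.cos(angle),        // bar position in the horizontal plane
      z: radius * Math.sin(angle),
      height: (m / 255) * maxHeight,      // scale byte magnitude to world units
    };
  });
}

// Example: four bars at radius 2, with a maximum height of 1 unit
const bars = barLayout([0, 128, 255, 64], 2, 1);
```

In a real frame loop one would call something like `analyser.getByteFrequencyData(buffer)` first and then apply the resulting heights to the rectangle meshes.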
How we built it
The initial prototype was built using a simple WebVR boilerplate discovered on the Internet, for which I am immensely grateful. This allowed for tremendously fast prototyping using a simple locally hosted web server, which made it very easy to connect to the server from my laptop itself, or from another device on the same network, without having to upload and download constantly from Rodin or fiddle with npm and gulp.
Challenges we ran into
Actually getting the audio to work was rather difficult. Playing an audio file in a standard browser is relatively easy; playing it in a 3D VR context is much more difficult. It was also difficult to use a framework that we had never used before, since most of our experience is in non-web languages like C++ and Python. JavaScript has some syntax similarities to C++, but getting all of the files to play nicely together was quite a challenge.
Accomplishments that we're proud of
The thing that I'm most proud of is getting this working and running decently. On some hardware that's not particularly optimized for VR (my phone in particular) it seems to run a bit sluggishly, but on decent VR-ready hardware - and even my 2010 MacBook - it seems to run fine. It's really cool to see how powerful and extensible JavaScript really is in this context.
What we learned
One of the main things that we've learned is how to work with JavaScript, npm, and the web in general. Working with the Rodin framework itself is also something that we've learned, although I'm sure there's a lot more that we didn't so much as scratch the surface of. Another thing that we learned is just how much it's possible to accomplish in a 24-hour timeframe, since this was our first hackathon and we didn't really expect to make something this cool.
What's next for SpektrumAR
I think there are a lot of places we could go with this. Currently, SpektrumAR just plays one audio file (just something from my music library) and loops it once it reaches the end. There are many better ways of doing this: one could make a playlist available, integrate with SoundCloud or the like, or even draw from the user's local media library. Admittedly, each successive item in that list becomes more difficult in a mobile VR context, since a web app doesn't have a great way of interfacing with the user's local file system. I'd also like to add more ambient effects and the option for the user to customize the visualization a little bit. The pie-in-the-sky goal is to make it a true augmented-reality experience that overlays the user's own environment, so that the user could lie on their bed at home and see music around them.
Built With
- html
- javascript
- rodin
- webaudio
- webvr