rAvR

rAvR is an augmented reality application that lets music performers and fans interact to create a completely unique experience.

rAvR is a one-of-a-kind music application that pushes performers to think about music experiences in a completely new way. rAvR focuses on creating a new visual, sonic, and interactive experience between fans and performers, rather than a one-sided performance.

Music shows are no longer a one-way experience where performers play to a crowd: fans can now actively participate by selecting tracks and seeing visuals that could never be physically possible.

Fans can interact with the music in various ways. For example, by selecting blocks and elements on them, which the performer has created and assigned to tracks, fans can hear different basslines, vocals, or samples.

Performers can also assign dynamic sounds or visuals to specific spaces within a venue. By moving around the venue, fans can physically mix samples into the tracks the performer is currently playing and hear the result live in their headsets.
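The zone idea above can be sketched in a few lines. This is a minimal illustration in plain Python, not the actual rAvR implementation; the zone names, positions, and sample filenames are hypothetical:

```python
import math

# Hypothetical mapping from venue zones to extra sample layers.
# Each zone is a sphere: a center point, a radius, and the sample
# that plays while a fan stands inside it.
ZONES = {
    "bass_corner": {"center": (2.0, 0.0, 5.0), "radius": 3.0, "sample": "bassline_a.wav"},
    "vocal_stage": {"center": (0.0, 0.0, 0.0), "radius": 4.0, "sample": "vocal_chop.wav"},
}

def active_samples(listener_pos):
    """Return the sample layers to mix in for a fan standing at listener_pos."""
    out = []
    for zone in ZONES.values():
        if math.dist(listener_pos, zone["center"]) <= zone["radius"]:
            out.append(zone["sample"])
    return out
```

In a real Unity/HoloLens build this distance check would typically be replaced by trigger colliders on the zone objects, with the headset's tracked position driving which AudioSources play.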

Inspiration

As a group of musicians, graphic designers, and computer scientists, we have been seeking new ways to experience music, especially live music.

rAvR came to us after thinking about how music, performance, and the live-show experience could be approached differently in the context of AR and VR. We also wanted a new layer of interactivity between performer and fans, in a way that would never have been possible without AR or VR.

What it does

rAvR allows musicians not only to be creative musically, but also lets DJs and musicians curate an entire sensory experience by letting fans participate in the music creation itself.

DJs can play music while creating innovative ways for fans and patrons to participate in the show: selecting tracks, playing samples, and interacting with a dynamic environment.

How we built it

During this project we used Unity, the Microsoft HoloLens, and C# to build the various elements of the virtual environment. We used Maya, Blender, Cinema 4D, and Adobe After Effects to create the graphics and animations, and Ableton, Traktor, and FL Studio 12 to edit and create music as well as to draw inspiration for DJ controls and effects.

Challenges we ran into

One of the biggest challenges was deciding which environment and device to build on. We initially wanted to build on a VR platform, but quickly realized that an AR setup would let patrons and musicians experience shows live and in real time, with an added layer of sensory depth. This was also the first time we had built anything on the HoloLens, so there was a huge learning curve for everyone in the group.

Another challenge was creating a networked solution for the HoloLenses. Documentation for HoloLens networking was sparse, so writing custom code to show all the objects on every headset in real time was quite challenging.

On the graphics side, we had to create actionable content within an interesting environment while keeping the actions easy to use and understand.

During production of the sound samples, one major issue we ran into was that FL Studio 12 exported .mp3 files with a few milliseconds of silence at the head and tail of each sample, and changed the length of the audio in the clip. This made it impossible to keep loops in the correct time signature, so we had to scrap all the samples we had made and re-export new ones in .wav format.
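The reason WAV fixed this is that a loop has to be sample-accurate, while MP3 stores audio in fixed-size frames (1152 samples per frame for MPEG-1 Layer III) plus encoder delay, so the decoded clip almost never matches the musical length exactly. A quick sketch of the arithmetic, with an assumed 44.1 kHz sample rate:

```python
SAMPLE_RATE = 44100   # samples per second (CD-quality, assumed here)
MP3_FRAME = 1152      # samples per MPEG-1 Layer III frame

def loop_length_samples(bpm, beats, sample_rate=SAMPLE_RATE):
    """Exact sample count for a loop of `beats` beats at `bpm`."""
    return round(beats * 60.0 / bpm * sample_rate)

# One 4/4 bar at 120 BPM is exactly 2 seconds: 88200 samples.
bar = loop_length_samples(120, 4)

# MP3 rounds the audio up to whole frames (before even counting
# encoder delay), so the decoded clip comes back longer than the loop:
mp3_padded = -(-bar // MP3_FRAME) * MP3_FRAME  # next frame boundary
```

A WAV file stores raw samples, so the exported loop keeps exactly the sample count the DAW rendered, and back-to-back playback stays on the beat.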

Accomplishments that we're proud of

Going from zero experience with VR/AR and Unity to making edits to Unity projects and understanding the workflow behind any game or real-time application was a huge accomplishment for the whole group. None of our team members had used the HoloLens before the hackathon started, so everyone needed to learn on the fly and experiment with the development system and the headsets.

Our software engineers are high school students, 16 and 14 years old, and they wrote all of the HoloLens code.

Creating such an in-depth project in such a short amount of time, despite those limitations, was a huge accomplishment for our engineers.

What we learned

  • Spatial understanding
  • Augmented reality development hurdles, including anchor points and user interaction with GameObjects in AR
  • Optimization on the HoloLens
  • Time management
  • Error-driven development
  • Learning what works in the augmented world
  • Working with a diverse group of individuals

What's next for rAvR

  • Network connectivity: one or more headsets connecting to a base station
  • Refining the networking to allow a seamless multiplayer experience
  • Graphical optimization for various delivery outlets
  • Optimizing code and graphics for HoloLens performance
  • Creating a Google Cardboard version for those who don't own a HoloLens
  • Graphics upgrades
  • Show effects: light shows, virtual cannons, lasers, etc.
  • DJ controls
