With one of our team members being a DJ, we understand how expensive and tedious setting up equipment can be, even more so when you have loads of it. These issues make a portable, on-demand set almost impossible. So we asked ourselves: "But what if you could play killer sets that’ll make crowds both large and small go absolutely crazy with a device that fits in your backpack?" This is where AirTunes comes in.

What it does

When you run the HoloLens app, it first asks you to place a cube about 5 feet away from the Kinect. After this, it projects instruments on the floor around you. You can then choose an instrument and play it. In the top right there is a record button that lets you record one bar and loop it. Once one instrument is looping, you can swap it out for another and play on top of the loop. You can repeat these steps until you have layers of instruments playing on top of each other, forming a song.
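The record-and-loop workflow above can be sketched in a few lines. This is an illustrative model, not the actual AirTunes code (the real app is written in C# for Unity); the `Looper` class and instrument names are assumptions for the example.

```python
# Minimal model of the loop-layering workflow: choose an instrument,
# record one bar, loop it, then swap instruments and play on top.

class Looper:
    def __init__(self):
        self.loops = []       # bars already recorded and looping
        self.current = None   # instrument currently selected

    def choose(self, instrument):
        self.current = instrument

    def record_bar(self, notes):
        """Record one bar on the current instrument and start looping it."""
        self.loops.append((self.current, list(notes)))

    def playing(self):
        """All instruments currently sounding in the layered loop."""
        return [inst for inst, _ in self.loops]

looper = Looper()
looper.choose("drums")
looper.record_bar(["kick", "snare", "kick", "snare"])
looper.choose("piano")   # swap instruments; the drum loop keeps playing
looper.record_bar(["C", "E", "G", "E"])
print(looper.playing())  # ['drums', 'piano']
```

Each recorded bar stays in the mix, so every swap adds another layer to the song.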

How we built it

Both the Kinect and HoloLens apps were built with Unity. The HoloLens app was compiled and run on the HoloLens, while the Kinect app ran on a separate computer. The server was hosted in the cloud and relayed requests from the HoloLens to the Kinect, and vice versa.
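The relay described above can be sketched as a simple message broker: each device posts messages addressed to the other and polls for messages addressed to itself. This is a hedged Python sketch, not the actual server (which spoke to C#/Unity clients); the queue-per-device design and the `Relay` API are assumptions for illustration.

```python
# Sketch of a cloud relay between the HoloLens and the Kinect:
# one pending-message queue per device, send and poll operations.

from collections import defaultdict, deque

class Relay:
    def __init__(self):
        self.inbox = defaultdict(deque)   # device name -> pending messages

    def send(self, sender, recipient, payload):
        """Queue a message from one device for the other."""
        self.inbox[recipient].append((sender, payload))

    def poll(self, device):
        """Return the next message for `device`, or None if none is waiting."""
        return self.inbox[device].popleft() if self.inbox[device] else None

relay = Relay()
relay.send("hololens", "kinect", {"instrument": "drums"})
print(relay.poll("kinect"))    # ('hololens', {'instrument': 'drums'})
print(relay.poll("hololens"))  # None -- nothing sent back yet
```

Because both directions go through the same broker, "and vice versa" falls out for free: the Kinect sends with `relay.send("kinect", "hololens", ...)` and the HoloLens polls its own queue.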

Challenges we ran into

We came into this hackathon without any experience programming the Kinect or HoloLens. None of us knew C#, the language used to build the application, and the game engine used to build the virtual environment, Unity 5, was also new to all of us. So the first challenge was figuring out where to even begin and attempting to learn the APIs without knowing the language. After we got past the language barrier and figured out how the APIs worked, we ran into another challenge. If you ran this on the HoloLens alone, you’d see the instruments and could move them, but you couldn’t interact with them efficiently. If you ran it on the Kinect alone, you could interact with the instruments much better, but you couldn’t see them: you’d have to stare at a screen or guess where they are. The biggest challenge was merging these two technologies so they work together efficiently and give us the best of both worlds.

Accomplishments that we're proud of

Having the HoloLens communicate with the Kinect via our server. Getting the Kinect to recognize our movements and produce sounds. Having our server tell the Kinect which sounds to produce depending on the instrument chosen.
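The last accomplishment, instrument-dependent sound dispatch, boils down to a lookup: given the instrument the server says is selected, the same gesture triggers a different sample on the Kinect side. This sketch is illustrative only; the sample names, gesture names, and mapping are assumptions, not the actual AirTunes assets.

```python
# Sketch of instrument-dependent dispatch: the selected instrument
# decides which sample a recognized gesture should trigger.

SOUNDS = {
    "drums": {"hit_low": "kick.wav", "hit_high": "snare.wav"},
    "piano": {"hit_low": "c3.wav",   "hit_high": "c4.wav"},
}

def sound_for(instrument, gesture):
    """Pick the sample to play for a gesture, or None if unmapped."""
    return SOUNDS.get(instrument, {}).get(gesture)

print(sound_for("drums", "hit_low"))   # kick.wav
print(sound_for("piano", "hit_low"))   # c3.wav
```

Swapping instruments then only means changing the key the server sends; the gesture recognition on the Kinect stays the same.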

What we learned

HoloLens API

Kinect API

What's next for AirTunes

Further develop the application and reach the following milestones:

Enable real-time recording for the instruments

Immediate playback of the recording

Better spatial mapping of the environment

Further refine the Kinect skeleton tracking for more precise control

Ability to save and share your sets

More instruments!
