We wanted to bring the pure ecstasy and adrenaline of jamming out on your favourite instrument to an augmented reality environment. The unbeatable pleasure of creating music and art is no longer exclusive to experienced musicians.
What it does
Through hand gestures, our hack simulates playing instruments in an augmented reality environment. You can play the piano, drums, and bongos to rock out by yourself or with friends, or challenge others in a head-to-head battle.
How we built it
Using the Leap Motion's hand-tracking sensors and Unity's physics engine, we created a collision detection system for virtual instruments that relays each note to a back-end server. Using node.js, we built an event-driven system in which the back-end server receives, responds to, and relays data to the clients.
Challenges we ran into
To build a dynamic back-end server, we needed to establish two-way communication using WebSockets. The lack of documentation online made this difficult to implement in both node.js and Unity, and the network connections to the back-end server had to be debugged and refined extensively to achieve the final product.
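Much of that debugging comes down to agreeing on a message format both sides can parse. As a hedged sketch (the field names and `encodeNote`/`decodeMessage` helpers are hypothetical, not our actual protocol), JSON frames sent over the socket might look like this:

```javascript
// Hypothetical JSON message protocol for the Unity <-> node.js WebSocket.
function encodeNote(player, instrument, pitch) {
  return JSON.stringify({ type: 'note', player, instrument, pitch });
}

function decodeMessage(raw) {
  let msg;
  try {
    msg = JSON.parse(raw);
  } catch (e) {
    return null; // malformed frames are a common debugging culprit
  }
  if (msg.type !== 'note') return null; // ignore unrecognized message types
  return msg;
}

const wire = encodeNote('player1', 'bongos', 'low');
const decoded = decodeMessage(wire);
console.log(decoded.instrument); // bongos
console.log(decodeMessage('not json')); // null
```

Rejecting malformed or unexpected frames on both ends, rather than assuming every message is well-formed, makes mismatches between the Unity client and the node.js server much easier to track down.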
Accomplishments that we're proud of
Implementing Leap Motion, which none of us had experience with before, and using it to drive an optimized collision detection engine.
What we learned
We all learned how to use technology we knew nothing about before. Tyler and Charlie Zhang learned to use Unity's physics engine. Ishan learned to develop a back-end server using node.js. Omar learned how to use the Leap Motion to create custom applications.