Although we learned about the hackathon a bit late and started working on our project even later, we asked ourselves: what would a music student, already late to class, do if he forgot his instrument somewhere?

Then we thought: what about someone who is eager to learn music but cannot afford instruments?

To help everyone out, we wanted to build software that lets users play musical instruments without having to get a physical set.

What it does

So we built a Snapchat lens where users can place musical instruments anywhere in their room and play them with their own hands, with no need for physical instruments.

When with friends, they can collaborate by joining a Connected Lens space and playing the instruments together in a shared environment.

How we built it

We built it by integrating:

  1. Snap AR's Connected Lenses framework
  2. The Hand Tracking framework
  3. Physics body collisions
  4. 3D augmentation
  5. Music notes
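The core of steps 2, 3, and 5 is mapping a tracked hand position onto an instrument's physics body and triggering the right note on contact. The sketch below is a hypothetical, engine-agnostic illustration of that idea (it does not use the actual Lens Studio API; the `DrumPad` type, the axis-aligned box test, and the note names are all our own stand-ins for the physics engine's collision callbacks):

```typescript
// Hypothetical sketch: triggering notes when a tracked fingertip
// enters an instrument's collision volume. Names and shapes are
// illustrative, not the Lens Studio / Snap AR API.

interface Vec3 { x: number; y: number; z: number; }

interface DrumPad {
  note: string;        // note to trigger on hit, e.g. "C4"
  center: Vec3;        // pad position in world space
  halfExtent: number;  // half the side length of its cubic physics body
}

// Axis-aligned box test standing in for the physics engine's
// overlap/collision callback.
function insidePad(tip: Vec3, pad: DrumPad): boolean {
  return Math.abs(tip.x - pad.center.x) <= pad.halfExtent
      && Math.abs(tip.y - pad.center.y) <= pad.halfExtent
      && Math.abs(tip.z - pad.center.z) <= pad.halfExtent;
}

// Called once per tracked frame with the fingertip position;
// returns the notes that should sound this frame.
function notesHit(tip: Vec3, pads: DrumPad[]): string[] {
  return pads.filter(p => insidePad(tip, p)).map(p => p.note);
}

// Two pads placed side by side in the scene.
const pads: DrumPad[] = [
  { note: "C4", center: { x: 0, y: 0, z: 0 }, halfExtent: 0.5 },
  { note: "E4", center: { x: 2, y: 0, z: 0 }, halfExtent: 0.5 },
];
```

In the real lens, the fingertip position comes from the hand-tracking framework each frame, and a hit plays the corresponding audio clip instead of returning a note name.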

Challenges we ran into

The greatest challenge we faced was linking hand interactions with an AR model. On top of that, we could not give the user's hands any touch sensation, since there is no haptic feedback.

Accomplishments that we're proud of

Most projects on Snap AR use AR to digitize the human world, bringing imaginary objects into it. We are proud to have made something that lets humans actually interact with AR creations.

What we learned

Multiple AR modules, how to integrate many of the Snap AR features, and much more.

What's next for Music Jam

We plan to add more instruments and give the lens a better look and improved user interactions.

Built With

  • 3d-augmentation
  • connected-lenses
  • handtracking
  • physics-interactions
  • snapar