Inspiration

Harris didn't have access to an instrument and was desperate to make some music.

What it does

Lets you play air guitar with fully configurable notes and chords, using your webcam to track the position of your hands.
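The project's actual configuration format isn't shown here, so as a hedged illustration, "fully configurable notes and chords" might look something like a mapping from screen zones to chord names (all names below are hypothetical):

```python
# Hypothetical config sketch: map horizontal "fretboard" zones to chords.
# The real project's format may differ; this is illustrative only.

FRET_ZONES = {
    0: "E minor",
    1: "A minor",
    2: "D major",
    3: "G major",
}

def chord_for_zone(zone, zones=FRET_ZONES, default="E minor"):
    """Look up the chord assigned to a fretboard zone, with a fallback."""
    return zones.get(zone, default)
```

Swapping out the dictionary is all it would take to remap the playable chords.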

How we built it

We used Python and OpenCV to send a video stream to a TensorFlow.js pose model running on a Node.js server, which tracked the position of your hands and shoulders. The Python front end then parsed the model output and played notes or chords depending on the velocity and position of your hands.
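The position-and-velocity step could be sketched roughly as follows. This is a minimal guess at the front end's mapping logic, not the project's actual code: it assumes the server returns hand keypoints in pixel coordinates, and the thresholds and chord list are made up for illustration.

```python
# Hypothetical sketch of mapping pose keypoints to guitar actions.
# Assumes hand positions arrive as (x, y) pixel coordinates per frame.

CHORDS = ["E", "A", "D", "G"]  # configurable in the real app

def pick_chord(hand_x, frame_width, chords=CHORDS):
    """Map the fretting hand's x position to one of the configured chords."""
    zone = int(hand_x / frame_width * len(chords))
    return chords[min(zone, len(chords) - 1)]

def should_strum(prev_y, cur_y, dt, threshold=600.0):
    """Trigger a strum when the strumming hand moves fast enough vertically.

    threshold is in pixels per second and is an invented tuning value.
    """
    velocity = abs(cur_y - prev_y) / dt
    return velocity >= threshold
```

In a loop over frames, the front end would call `should_strum` with the strumming hand's successive y positions and, when it fires, play the chord returned by `pick_chord` for the other hand.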

Challenges we ran into

  • Lack of VRAM
  • CORS
  • Sending data to and from the back-end
  • Maths ("I still don't get how the maths works" - Harris Mirza)

Accomplishments that we're proud of

Actually getting this to work.

What we learned

  • CORS is annoying
  • OpenCV is far superior to HTML canvas and WebRTC

What's next for AiRGUITAR

