We love music: we love playing it and listening to it, and we wanted a way to interact with your music without physical limitations. We introduce Oki, a free-form, motion-based 3D DJ system that lets people create music without obstruction.

What it does

Using two motion controllers, the Kinect and the Leap Motion sensor, we add a layer of interaction to music via hand gestures and 3D positioning. Features of Oki include volume control, frequency and playback-rate re-sampling, and squares. Lots of them.

How we built it

We used the OpenCV, Kinect, and Leap Motion libraries on top of Processing (Java) to give the user a visual, interactive experience. We implemented computer vision algorithms such as detecting fingertip locations via convex-hull defect analysis, and used that information to toggle samples through the Kinect. We also implemented volume control, raising and lowering the volume of playing samples based on the z-position of the user's hands. Finally, we used gesture recognition on the Leap Motion to play the toggled samples, fully integrated with the Kinect system.
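The z-based volume control boils down to mapping a hand's depth onto a gain value. A minimal sketch in plain Java, independent of the Kinect API (the depth range constants here are our own assumptions, not values from the project):

```java
public class ZVolume {
    // Assumed usable Kinect depth range, in millimeters (hypothetical values).
    static final float Z_NEAR = 800f;
    static final float Z_FAR  = 2500f;

    /** Map a hand's z position to a gain in [0, 1]: closer hand = louder. */
    static float volumeForZ(float zMillimeters) {
        float t = (Z_FAR - zMillimeters) / (Z_FAR - Z_NEAR); // 1 at near, 0 at far
        return Math.max(0f, Math.min(1f, t));                // clamp to [0, 1]
    }

    public static void main(String[] args) {
        System.out.println(volumeForZ(800f));   // nearest hand -> 1.0 (full volume)
        System.out.println(volumeForZ(2500f));  // farthest hand -> 0.0 (muted)
        System.out.println(volumeForZ(1650f));  // midpoint -> 0.5
    }
}
```

In the real sketch this gain would be applied each frame to the currently playing samples.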

Challenges we ran into

Consistent hand and body detection on the Kinect was prone to failure: hands would often be detected as closed or open when they were the opposite, affecting the usability of the interface. The lack of updated Kinect API resources for Windows 10 was also what drove us to use Processing.

For the Leap Motion, we tried recognizing individual fingers, but we found that the gesture-recognition delay was too long for single-fire sample controls, where latency must be minimized, so we went with a toggled control system based on pinching and fist grasping instead.
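A toggled control of this kind is essentially an edge-triggered latch: a held pinch should flip the sample once, not on every frame it remains held. A sketch of that idea (the class and method names are ours, not from the project's code):

```java
/** Edge-triggered toggle: a pinch flips the sample state once per gesture. */
public class PinchToggle {
    private boolean wasPinching = false;
    private boolean samplePlaying = false;

    /** Feed the per-frame pinch state; returns whether the sample is playing. */
    public boolean update(boolean isPinching) {
        if (isPinching && !wasPinching) {   // rising edge: gesture just started
            samplePlaying = !samplePlaying; // toggle exactly once per pinch
        }
        wasPinching = isPinching;
        return samplePlaying;
    }

    public static void main(String[] args) {
        PinchToggle t = new PinchToggle();
        System.out.println(t.update(true));  // pinch starts  -> sample on
        System.out.println(t.update(true));  // still held    -> stays on
        System.out.println(t.update(false)); // released      -> stays on
        System.out.println(t.update(true));  // next pinch    -> sample off
    }
}
```

Because the latch only reacts to the start of a gesture, recognition delay shifts when a toggle lands but never double-fires it, which is what makes this scheme more forgiving than single-fire controls.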

Accomplishments that we're proud of

  1. It's dope
  2. Individual finger-point geometry detection
  3. Re-sampling music based on distance between two closed-fists
  4. Volume control
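The re-sampling in item 3 comes down to mapping the distance between the two closed fists onto a playback rate. A minimal sketch of that mapping (the distance range and rate bounds below are illustrative assumptions):

```java
public class FistResample {
    // Assumed hand-separation range (mm) and playback-rate bounds (hypothetical).
    static final float D_MIN = 100f, D_MAX = 900f;
    static final float RATE_MIN = 0.5f, RATE_MAX = 2.0f;

    /** Map the distance between two closed fists to a playback rate:
        fists close together = half speed, far apart = double speed. */
    static float rateForDistance(float d) {
        float t = (d - D_MIN) / (D_MAX - D_MIN);
        t = Math.max(0f, Math.min(1f, t)); // clamp to the usable range
        return RATE_MIN + t * (RATE_MAX - RATE_MIN);
    }

    public static void main(String[] args) {
        System.out.println(rateForDistance(100f)); // 0.5 (half speed)
        System.out.println(rateForDistance(900f)); // 2.0 (double speed)
        System.out.println(rateForDistance(500f)); // 1.25
    }
}
```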

What we learned

We learned a lot about computer vision techniques, gesture recognition, and music controllers - in addition to working with Kinect and Leap APIs.

What's next for Oki

BPM detection for easy snapping. Hot-swapping different samples. Expanded music controls in the interface (frequency controls, etc.). Higher-fidelity visuals. Individual finger functionality for the Leap Motion (we tried, but it wasn't responsive enough to be worth it).
