Inspiration

All of our group members were fascinated by technology that can detect motion gestures and interact with sound in some cool way.

What it does

LitLeap uses two devices: a Leap Motion controller and an Arduino. It takes gesture data from the Leap Motion and uses that data to play MIDI notes and trigger LED lights on a circuit board controlled by the Arduino.

How we built it

First, we used the Leap Motion API to bring hand tracking data from the Leap Motion into a Java program. Then we mapped the position of the user's palms to an algorithm that plays MIDI notes on a scale. Finally, we wired a grid of LEDs to a circuit board attached to the Arduino and mapped the gesture data to the lights, so that note pitch and tempo are roughly reflected in the display.
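The palm-to-scale mapping can be sketched roughly like this (a minimal illustration, not our exact code; the class name, the C-major scale, and the normalized-height convention are assumptions for the example):

```java
// Minimal sketch: map a normalized palm height in [0, 1] to a MIDI
// note number on a major scale, as the Java program does with the
// palm position reported by the Leap Motion.
public class PalmToMidi {
    // Scale degrees of a major scale within one octave
    // (semitone offsets from the root note).
    static final int[] MAJOR_SCALE = {0, 2, 4, 5, 7, 9, 11};

    /**
     * Maps a normalized palm height (0.0 = bottom of the interaction
     * box, 1.0 = top) to a MIDI note spanning `octaves` octaves
     * above `baseNote`.
     */
    static int palmToNote(double normalizedY, int baseNote, int octaves) {
        double clamped = Math.max(0.0, Math.min(1.0, normalizedY));
        int totalSteps = MAJOR_SCALE.length * octaves - 1;
        int steps = (int) Math.round(clamped * totalSteps);
        int octave = steps / MAJOR_SCALE.length;
        int degree = steps % MAJOR_SCALE.length;
        return baseNote + 12 * octave + MAJOR_SCALE[degree];
    }

    public static void main(String[] args) {
        // Palm at the bottom -> the base note (here middle C, 60).
        System.out.println(palmToNote(0.0, 60, 2)); // prints 60
        // Palm at the top -> highest scale degree two octaves up.
        System.out.println(palmToNote(1.0, 60, 2)); // prints 83
    }
}
```

Quantizing to scale degrees rather than raw semitones is what keeps the output sounding musical no matter where the hand moves.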

Challenges we ran into

We first tried to use recorded audio samples instead of MIDI data, and that was a huge headache because we couldn't run the audio player and the Leap Motion program at the same time (the audio would always trigger at the end of the program, no matter what). We fixed this by switching to MIDI.
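With MIDI, a note is just a small message rather than an audio stream, which is why it plays nicely alongside the tracking loop. A minimal sketch of the `javax.sound.midi` approach (the channel, note, and velocity values here are illustrative; in the real program the message would be sent to a `Receiver` from `MidiSystem`):

```java
import javax.sound.midi.InvalidMidiDataException;
import javax.sound.midi.ShortMessage;

public class MidiNoteDemo {
    // Build a NOTE_ON message; sending this to a synthesizer's
    // Receiver starts the note immediately and returns right away,
    // so it doesn't block the gesture-tracking loop.
    static ShortMessage noteOn(int channel, int note, int velocity) {
        try {
            return new ShortMessage(ShortMessage.NOTE_ON, channel, note, velocity);
        } catch (InvalidMidiDataException e) {
            throw new IllegalArgumentException(e);
        }
    }

    public static void main(String[] args) {
        ShortMessage msg = noteOn(0, 60, 93); // middle C, medium velocity
        System.out.println(msg.getCommand() + " " + msg.getData1()
                + " " + msg.getData2()); // prints "144 60 93" (0x90 = NOTE_ON)
    }
}
```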

Accomplishments that we're proud of

We're proud that we managed to move beyond our original vision of a gesture-to-music program and incorporate a second piece of hardware, the Arduino.

What we learned

One thing we learned is that gesture tracking is even more versatile than we thought. A surprising number of devices have the capability to track motion.

What's next for LitLeap

Maybe we could map the data to a larger grid of LEDs so that we could display something more interesting, or a more detailed representation of the motions of the hands themselves.
