We came into Hacktech motivated to build a novel and fun project, with an emphasis on fun. Since most of us had no prior experience with hardware hacks, we picked the Kinect for a motion-based application. One team member introduced Sonic Pi, a music coding environment; we played around with it and ultimately decided it was fun enough to integrate with the Kinect, and Kinect the Hedgehog was born.
What it does
Kinect the Hedgehog takes in user motion and produces music. Users can switch between modes (bass, drum, synth) and playstyles (loop, freeplay), adjust pitch and frequency, and select presets within each mode to create music through pure motion.
How we built it
We connected the Kinect to a laptop and used Python to trigger Sonic Pi sounds and program Kinect gestures. In Python, we mapped each gesture to a specific feature, and the behavior of each feature (such as controlling pitch or frequency) was implemented by hand as well.
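The gesture-to-feature mapping could look roughly like the sketch below. The gesture names, modes, and dispatch structure here are illustrative assumptions, not the actual hackathon code; in the real app each applied setting would also trigger the corresponding Sonic Pi sound.

```python
# Hypothetical sketch of mapping recognized Kinect gestures to features.
# Gesture names and mode/playstyle assignments are assumptions for
# illustration, not the project's actual bindings.

GESTURE_MAP = {
    "raise_left_hand":  ("mode",  "bass"),
    "raise_right_hand": ("mode",  "drum"),
    "hands_together":   ("mode",  "synth"),
    "swipe_left":       ("style", "loop"),
    "swipe_right":      ("style", "freeplay"),
}

def dispatch(gesture, state):
    """Apply the setting bound to a recognized gesture.

    Mutates `state` and returns the (setting, value) pair applied,
    or None if the gesture is unmapped.
    """
    action = GESTURE_MAP.get(gesture)
    if action is None:
        return None
    setting, value = action
    state[setting] = value
    return action

player_state = {"mode": "synth", "style": "freeplay"}
dispatch("raise_left_hand", player_state)  # switches mode to "bass"
```

Keeping the bindings in a single table like this makes it easy to re-map gestures while tuning which motions feel natural in front of the sensor.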
Challenges we ran into
Learning the different gestures Kinect can recognize and how to map them to features. Building a UI overlay with Pygame that works alongside the Kinect. Making sure the math checks out so that each axis reflects the intended motion.
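The axis math amounts to mapping a joint's position in the Kinect's coordinate space to a musical parameter. A minimal sketch, assuming a hand-height range and MIDI note bounds that are illustrative rather than the project's actual values:

```python
# Sketch of the axis math: mapping a hand's vertical position (meters,
# relative to the sensor) to a pitch. The coordinate range and note
# bounds are assumptions, not the values used at the hackathon.

def y_to_midi(y, y_min=-0.6, y_max=0.6, note_min=40, note_max=88):
    """Linearly map hand height y to a MIDI note, clamping so
    off-range motion stays within the playable bounds."""
    t = (y - y_min) / (y_max - y_min)   # normalize to [0, 1]
    t = max(0.0, min(1.0, t))           # clamp out-of-range positions
    return round(note_min + t * (note_max - note_min))

y_to_midi(0.0)   # hand at sensor height -> note 64, mid-range
```

Clamping matters in practice: without it, a hand drifting outside the tracked volume would produce wild pitch jumps instead of pinning to the edge of the range.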
Accomplishments that we're proud of
Learning Kinect gestures and programming them to correctly trigger different features.
What we learned
Drawing out the prototype is extremely important; we really started making progress once we all understood the blueprint and knew the main functions of the app.
What's next for Kinect The Hedgehog
We hope to improve the UI so it's more user-friendly and polished. We also hope to add more features (more instruments, the ability to delete sounds) and fine-tune the gestures.