We did the mash. We did the monster mash.
We were inspired by the idea of creating generative 3D monsters based on user input through some sensor.
What it does
The user wears two flex sensors (one on each pointer finger, or some other combination) and bends and flexes their fingers. The Arduino code sends integer values that the Python script translates into formulas, which are then plotted and updated every 50 ms.
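A minimal sketch of that pipeline's Python side, assuming the Arduino prints two comma-separated `analogRead` values (0–1023) per line; the port name and value range are assumptions, not the project's actual code:

```python
def parse_reading(line: bytes):
    """Parse one serial line like b'312,478\r\n' into two ints."""
    a, b = line.strip().split(b",")
    return int(a), int(b)

def to_plot_params(left: int, right: int, lo: int = 0, hi: int = 1023):
    """Map raw flex-sensor values onto formula parameters in [0, 1]."""
    span = hi - lo
    return (left - lo) / span, (right - lo) / span

if __name__ == "__main__":
    # With hardware attached, the loop would look roughly like this
    # (hypothetical port name, ~50 ms refresh via plt.pause):
    #   import serial, matplotlib.pyplot as plt
    #   port = serial.Serial("/dev/ttyACM0", 9600, timeout=0.1)
    #   while True:
    #       left, right = parse_reading(port.readline())
    #       ...update the plotted curve from to_plot_params(left, right)...
    #       plt.pause(0.05)
    print(to_plot_params(*parse_reading(b"312,478\r\n")))
```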
How we built it
It began as a way to test whether the flex sensors would work as input for a Blender model. We built a circuit with LEDs to confirm the flex sensors worked. After initially trying to read data values in MATLAB, we moved to Python and endlessly Googled/asked for help on how to write code that would store the values from the flex sensors for use in another program. After we realized that adding another program into the mix wasn't a great idea for a hackathon, we tried making a variety of live graphs that represented the finger motion visually!
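One common way to store values for a live graph is a rolling buffer that drops old samples automatically; this sketch uses a `deque` with an arbitrary 200-sample window (our choice for illustration, not something from the project):

```python
from collections import deque

WINDOW = 200  # arbitrary window size for the live graph
readings = deque(maxlen=WINDOW)

def record(value: int) -> None:
    """Append one sensor reading; deque discards the oldest past WINDOW."""
    readings.append(value)

# Simulate a stream of 250 sensor values; only the last 200 are kept.
for v in range(250):
    record(v)

print(len(readings), readings[0], readings[-1])
```

The plotting code can then redraw from `readings` on each timer tick without managing list trimming by hand.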
Challenges we ran into
I had never used flex sensors or coded in Python, so figuring out syntax and circuitry was a struggle. Our idea also went through countless iterations as we tried to hack with various existing software (like Blender & Unity), so we had to do a lot of trial and error across different programs.
The initial end goal with Python was to make polar plots that resembled spirographs, but getting the timing/updating/formulas to make smooth curves proved elusive. So we decided to take advantage of all the progress we had made and ran with the angular, jagged motion.
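For a sense of what the polar idea looks like, here is one illustrative spirograph-like curve, a rose r = cos(kθ), whose lobe count k could be driven by a sensor value; the actual formulas the team tried aren't recorded, so this is purely a hedged example:

```python
import math

def rose_points(k: float, n: int = 360):
    """Sample n points of the polar rose r = cos(k*theta) as (x, y) pairs."""
    pts = []
    for i in range(n):
        theta = 2 * math.pi * i / n
        r = math.cos(k * theta)
        # Convert polar (r, theta) to Cartesian for plotting.
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts

pts = rose_points(3.0)  # k = 3 gives a three-lobed rose
```

Getting smooth motion here means re-sampling and redrawing the whole curve each tick as k changes, which is where the timing trouble described above comes in.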
Accomplishments that we're proud of
Learning Python (kinda, at least a little)!!! Using flex sensors!!! Iterating!!!
What we learned
Erin's computer won't let her download the Arduino package for MATLAB :/ How to make a real-time updating Python graph! Learned finicky things about Unity
What's next for Digit Dancing
To the 3rd dimension, bb!