We came into Hack the North with an idea for an array of cameras that could classify ASL letters signed by a human hand. This would enable sign-to-text and then sign-to-speech, which could meaningfully improve the lives of people who are deaf or mute. When we got to the hackathon and looked at the API prizes, we noticed that Xesto offered a platform very similar to that idea, and it seemed easy to use. With no better plan, we quickly formulated a project around Xesto's fluid motion classification: we decided it would be really cool to pretend to grab an item in Autodesk Inventor and rotate it as if we were holding it in our hands. We set out from there.
Grip3D lets users interact with 3D models in an intuitive way: with hand gestures. The Leap Motion senses hand movement, and Grip3D rotates or zooms the Autodesk Inventor model accordingly.
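At its core, the mapping is simple: per-frame palm displacement from the Leap Motion is classified as a rotate or zoom command for the Inventor viewport. A minimal sketch of that mapping, with invented gains, thresholds, and function names rather than the project's actual code:

```python
# Hypothetical sketch: turn per-frame palm motion into viewport commands.
# dx/dy are palm displacement (mm) in the camera plane; dz is motion
# toward/away from the sensor. Gains and deadzone are made-up values.

ROTATE_GAIN = 2.0   # degrees of model rotation per mm of palm travel
ZOOM_GAIN = 0.01    # zoom factor change per mm of depth travel
DEADZONE_MM = 1.5   # ignore tiny jitters from the sensor

def palm_delta_to_command(dx, dy, dz):
    """Map a palm displacement (mm) to a (command, amount) tuple."""
    if max(abs(dx), abs(dy), abs(dz)) < DEADZONE_MM:
        return ("idle", 0.0)
    if abs(dz) > max(abs(dx), abs(dy)):
        # Dominant motion is toward/away from the sensor: zoom.
        return ("zoom", dz * ZOOM_GAIN)
    if abs(dx) >= abs(dy):
        # Left/right sweep rotates the model about its vertical axis.
        return ("rotate_y", dx * ROTATE_GAIN)
    # Up/down sweep rotates about the horizontal axis.
    return ("rotate_x", dy * ROTATE_GAIN)
```

The deadzone keeps small sensor jitter from constantly nudging the model, and picking the single dominant axis per frame keeps the interaction predictable.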
How We Built It
Challenges We Overcame
Accomplishments That We're Proud Of
We all worked very hard on this project, and we persevered through problems that seemed impossible (the Xesto Wave API not working, for example). The whole team can be very proud of that. We had to change direction frequently, but we never gave up. The setbacks slowed us down significantly, but we're still pushing to make the deadline.
What We Learned
Have a backup plan for every system feature. If something you counted on doesn't work, is deprecated, or otherwise falls through, you need to pivot quickly to save time. Assume the worst when it comes to time management.

Take shifts for sleeping. It's hard to keep everyone involved on a coding project, especially when one error is holding the whole thing up, and sleep matters for both personal health and problem-solving. Sleeping can actually be the most productive thing a team member is doing.

We also learned new things from each other, since we came into this competition with fairly different skills. There was a bit of a language exchange going on in our group for sure.
What's Next For Grip3D?
Expanding Grip3D to other programs would not be difficult. We could map gestures to the Adobe suite and to many other programs with a 3D or 2D focus. With a more sophisticated gesture analysis program, we could also vary the intensity of the output: moving your hand just a little would rotate the model ever so slightly, while exaggerating the gesture would spin the model considerably.
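One way to get that intensity scaling is a super-linear response curve: small movements stay near-linear and subtle, while large sweeps are amplified, with a clamp to keep the model from spinning wildly. A sketch, where the gain, exponent, and clamp are illustrative tuning knobs rather than tested values:

```python
def scaled_rotation(delta_mm, gain=2.0, exponent=1.5, max_deg=45.0):
    """Map gesture size (mm of palm travel) to degrees of rotation.

    A small movement produces a subtle, near-linear rotation; an
    exaggerated sweep is amplified by the super-linear exponent.
    The result is clamped so one frame can't spin the model too far.
    """
    sign = 1.0 if delta_mm >= 0 else -1.0
    degrees = gain * abs(delta_mm) ** exponent
    return sign * min(degrees, max_deg)
```

For example, a 1 mm nudge yields 2 degrees while a 4 mm sweep yields 16 degrees, so exaggerating the gesture pays off more than proportionally.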