To be completely honest, when we got to Hack the North we had no idea what we wanted to work on; the Myo was the only thing we had agreed on up to that point. We came to the opening talks just hoping we'd be able to come up with something once we got back to our project area. We talked as a team, trying to decide what would really blow the judges away, and that got us nowhere for a long (long, long, long, long) time. So we started talking about what we each wanted to get out of the hackathon instead, and before we knew it we had an idea: something challenging, something we all really wanted to make happen. That's where the controller idea came from. At work I'd seen a lot of otherwise professional, well-done presentations fail just because of the interfacing with the solid models; it was clumsy and cluttered. So we decided to build an interface that lets the presenter manipulate the essentials of a solid model (for now) in a much more intuitive and, overall, cooler way!
We've written our script using Thalmic Labs' API for the Myo, which gives us a set of very well-defined gestures to work with. We used this API to pan, zoom, and rotate, and we have several features in development, such as cutting faces for demos and fixed-axis/point rotation. We're really happy with how cleanly our project is working, but we'd love to take it further than the Thalmic API alone and combine it with the actual SolidWorks API. That would let us make the functionality far more extensive, and maybe even integrate other modeling software after that. Who knows? This was a heck of a lot of fun; we're definitely going to take this project further!
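The Myo SDK reports a small set of discrete poses (fist, wave-in, wave-out, fingers-spread, double-tap), and the core of a controller like this is the mapping from pose to camera action. Here's a minimal, self-contained Python sketch of that dispatch logic. The pose names match the gestures the Myo actually reports, but the `Camera` class, the `roll_delta` parameter, and the particular pose-to-action mapping are illustrative assumptions, not the team's actual code:

```python
# Sketch of the pose -> camera-action dispatch at the heart of a Myo-based
# model controller. Pose names (fist, waveIn, waveOut, fingersSpread) are
# the discrete gestures the Myo reports; the Camera class and the specific
# mapping below are illustrative assumptions.

class Camera:
    """Toy viewport state: pan offset, zoom factor, rotation angle (degrees)."""
    def __init__(self):
        self.pan_x = 0.0
        self.zoom = 1.0
        self.rot_deg = 0.0

def handle_pose(camera, pose, roll_delta=0.0):
    """Apply one gesture to the camera.

    roll_delta stands in for the arm-orientation change (from the Myo's
    IMU) that continuous actions like rotating or panning would track.
    """
    if pose == "fist":              # fist + arm roll -> rotate the model
        camera.rot_deg = (camera.rot_deg + roll_delta) % 360
    elif pose == "waveIn":          # wave in -> zoom in
        camera.zoom *= 1.1
    elif pose == "waveOut":         # wave out -> zoom out
        camera.zoom /= 1.1
    elif pose == "fingersSpread":   # fingers spread + arm motion -> pan
        camera.pan_x += roll_delta
    # "rest" and unrecognized poses fall through: no change

cam = Camera()
handle_pose(cam, "waveIn")               # zoom in one step
handle_pose(cam, "fist", roll_delta=15)  # rotate by 15 degrees
print(round(cam.zoom, 2), cam.rot_deg)   # -> 1.1 15.0
```

In the real system the per-frame dispatch would be driven by the SDK's pose callback rather than direct calls, but the switch-on-pose structure is the same idea.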