Most user interfaces rely on physical contact or verbal interaction, even though gesturing is one of the primary means of human communication. That leaves a great deal of untapped potential, and this project demonstrates one way to put it to work. The idea could be particularly useful to individuals who struggle with verbal communication, but it is just as applicable to the everyday user.
What it does
Our program reads and processes visual input from the Leap Motion sensor and compares it against a collection of user-trained gestures. It then translates the recognized gesture into simple phrases that drive a simulated home automation system.
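At its core, matching works by comparing a captured gesture against each trained template and picking the closest one. The sketch below is a minimal, hypothetical version of that idea (the names and the summed-Euclidean-distance metric are illustrative, not our exact SDK code): a gesture is a sampled track of 3-D positions, and the phrase of the nearest stored template wins.

```cpp
#include <array>
#include <cmath>
#include <cstddef>
#include <limits>
#include <string>
#include <utility>
#include <vector>

using Vec3 = std::array<double, 3>;
using Gesture = std::vector<Vec3>;  // sampled 3-D positions over time

// Summed Euclidean distance between two gesture tracks,
// compared point-by-point up to the shorter length.
double gestureDistance(const Gesture& a, const Gesture& b) {
    double total = 0.0;
    for (std::size_t i = 0; i < a.size() && i < b.size(); ++i) {
        const double dx = a[i][0] - b[i][0];
        const double dy = a[i][1] - b[i][1];
        const double dz = a[i][2] - b[i][2];
        total += std::sqrt(dx * dx + dy * dy + dz * dz);
    }
    return total;
}

// Return the phrase associated with the closest trained template.
std::string matchGesture(
        const Gesture& input,
        const std::vector<std::pair<std::string, Gesture>>& trained) {
    double best = std::numeric_limits<double>::max();
    std::string phrase = "unknown";
    for (const auto& entry : trained) {
        const double d = gestureDistance(input, entry.second);
        if (d < best) {
            best = d;
            phrase = entry.first;
        }
    }
    return phrase;
}
```

A fuller system would normalize for sampling rate and hand position before comparing, but the nearest-template structure stays the same.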
How we built it
We wrote the project in C++ on top of the low-level Leap Motion SDK, building a user-trained gesture database and a recognition algorithm that compares live 3-D vector data from the sensor against the stored gestures.
Challenges we ran into
It was particularly difficult to implement the gesture database and recognition algorithm. Specific hurdles included complex memory management in the low-level SDK, accurate comparisons in 3-D space, and saving and loading the database without corruption.
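On the corruption problem, one simple safeguard is to prefix the saved data with a record count and verify on load that the file actually contains that many records, rejecting truncated files instead of reading garbage. This is a hypothetical sketch of that pattern (the function names and flat-doubles format are illustrative, not our actual database layout):

```cpp
#include <cstdint>
#include <fstream>
#include <string>
#include <vector>

// Save a flat list of doubles with a leading count.
// Returns false on any I/O error.
bool saveVectors(const std::string& path, const std::vector<double>& data) {
    std::ofstream out(path, std::ios::binary);
    if (!out) return false;
    const std::uint64_t n = data.size();
    out.write(reinterpret_cast<const char*>(&n), sizeof n);
    out.write(reinterpret_cast<const char*>(data.data()),
              static_cast<std::streamsize>(n * sizeof(double)));
    return static_cast<bool>(out);
}

// Load the list, verifying the declared count matches the bytes present.
// On a short or unreadable file, clears `data` and returns false.
bool loadVectors(const std::string& path, std::vector<double>& data) {
    std::ifstream in(path, std::ios::binary);
    if (!in) return false;
    std::uint64_t n = 0;
    if (!in.read(reinterpret_cast<char*>(&n), sizeof n)) return false;
    data.assign(n, 0.0);
    if (!in.read(reinterpret_cast<char*>(data.data()),
                 static_cast<std::streamsize>(n * sizeof(double)))) {
        data.clear();
        return false;  // truncated or corrupted file
    }
    return true;
}
```

A checksum over the payload would catch bit-level corruption as well, but even the count check turns a crash into a clean "retrain needed" error.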
Accomplishments that we're proud of
We're proud of the interface that allows users to calibrate and train the device with ease, as well as the overall functionality of our project. Additionally, our dynamic command management system is an exciting and useful feature.
What we learned
We learned a ton about C++ design and memory management, including operator overloading. Processing, storing, and comparing 3-D vector data from the device was much more complex than we had anticipated.
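As a taste of the operator overloading involved, a small 3-D vector type along these lines (an illustrative sketch, not our exact class) lets the gesture math read like the underlying algebra:

```cpp
#include <cmath>

struct Vec3 {
    double x = 0, y = 0, z = 0;
};

// Overloaded operators so vector expressions read naturally.
Vec3 operator+(const Vec3& a, const Vec3& b) {
    return {a.x + b.x, a.y + b.y, a.z + b.z};
}
Vec3 operator-(const Vec3& a, const Vec3& b) {
    return {a.x - b.x, a.y - b.y, a.z - b.z};
}
Vec3 operator*(double s, const Vec3& v) {
    return {s * v.x, s * v.y, s * v.z};
}

// Dot product and Euclidean length, the workhorses of
// comparing gesture directions and distances.
double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}
double length(const Vec3& v) {
    return std::sqrt(dot(v, v));
}
```

With these in place, a displacement between two sampled palm positions is just `b - a`, and its magnitude is `length(b - a)`.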
What's next for Hand Gesture Home Assistant
We would like to improve the user training interface to use a GUI rather than a CLI, and would also like to improve the accuracy of our gesture recognition to allow free rotation of gestures.