As students we frequently give presentations, and we were tired of returning to the computer every time we needed to change a slide. We built a project that lets the presenter change slides and perform other commands using hand gestures, providing the freedom to move away from the computer.
What it does
Our program tracks the movement of a hand and performs various keyboard commands in a PowerPoint presentation. For instance, with our program running alongside a PowerPoint slideshow, to move ahead one slide you position your hand in front of the webcam (built-in or wired) and move it to the right.
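A minimal sketch of the gesture-to-shortcut mapping this describes. The gesture names and the keys other than the arrow keys are illustrative assumptions, not the project's actual identifiers:

```python
# Hypothetical mapping from recognized gestures to the keyboard
# shortcuts a slideshow responds to. PowerPoint's slideshow mode
# uses the plain arrow keys to move between slides; the other
# entries are example bindings, not the project's real ones.
GESTURE_TO_KEY = {
    "swipe_right": "right",  # advance one slide
    "swipe_left": "left",    # go back one slide
    "swipe_up": "f5",        # example: start the slideshow
    "swipe_down": "esc",     # example: exit the slideshow
}

def key_for_gesture(gesture):
    """Return the key to press for a recognized gesture, or None."""
    return GESTURE_TO_KEY.get(gesture)
```

Keeping the mapping in one table makes it easy to rebind gestures without touching the tracking code.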
How we built it
We built the project on a hand tracking library that lets us initialize the position of a hand. On top of that we wrote the analysis of the hand's movement, the keyboard shortcuts it triggers, and numerous safety measures to keep the program from failing mid-presentation.
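The movement analysis could be sketched like this, assuming the tracking library reports the hand's x-coordinate (in pixels) each frame. The threshold value is an illustrative guess, not the project's tuned number:

```python
def classify_swipe(x_positions, min_travel=120):
    """Classify a buffer of recent x-coordinates as a swipe.

    Returns "swipe_right", "swipe_left", or None. Requiring a
    minimum travel distance (here a guessed 120 px) acts as a
    safety measure against jitter, so small hand tremors during
    the presentation do not accidentally change slides.
    """
    if len(x_positions) < 2:
        return None
    travel = x_positions[-1] - x_positions[0]
    if travel >= min_travel:
        return "swipe_right"
    if travel <= -min_travel:
        return "swipe_left"
    return None
```

After a swipe is recognized, clearing the position buffer prevents one hand motion from firing the same command on several consecutive frames.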
Challenges we ran into
We initially used a hand tracking library that did not rely on machine learning, and it worked well; however, it determined the position of the hand from color differences and edge detection, which made the system rather difficult to calibrate. We therefore attempted to implement a TensorFlow-based system instead.
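A rough sketch of the color-difference idea we moved away from, assuming RGB frames as NumPy arrays. The bounds below are arbitrary example values; needing to recalibrate them for every room and lighting condition is exactly what made the approach fragile:

```python
import numpy as np

def skin_mask(frame, lower=(95, 40, 20), upper=(255, 220, 170)):
    """Return a boolean mask of pixels inside an RGB skin-tone range.

    The lower/upper bounds are illustrative guesses: any change in
    lighting shifts the pixel values, so they must be recalibrated
    per environment, unlike a learned hand detector.
    """
    lower = np.array(lower, dtype=np.uint8)
    upper = np.array(upper, dtype=np.uint8)
    return np.all((frame >= lower) & (frame <= upper), axis=-1)
```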
Accomplishments that we're proud of
As seen in the video at about the 50-second mark, we managed to accurately change slides in a slide deck using hand motions. We are genuinely excited about the progress we made at BluePrint, and whatever the outcome we plan to keep improving the system so we can use it in the classroom. Moreover, half of the team did not know Python before the competition, and now all of us are at least semi-proficient in it.
What we learned
We learned Python and several parts of the OpenCV library, as well as how to make a Python script communicate with another program running at the same time. Our team members can now carry these skills into future projects.
What's next for Advanced Presentation Control Using Hand Tracking
The next logical step is to increase the camera's tracking range so the presenter not only never has to touch the computer, but does not need to stand near it at all. Additional hand gestures can also be implemented to extend the program's functionality.