Inspiration
What it does
It uses machine learning to recognize which hand gesture you are making and opens or closes processes based on that gesture
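The core idea can be sketched as a mapping from a predicted gesture label to a process action. This is a minimal illustration; the gesture names and process names here are hypothetical, not the project's actual set.

```python
# Hedged sketch: map a recognized gesture label to a (verb, process) action.
# The gesture and process names are illustrative placeholders.
GESTURE_ACTIONS = {
    "open_palm":   ("open",  "notepad"),   # launch a process
    "closed_fist": ("close", "notepad"),   # terminate a process
}

def action_for(gesture: str):
    """Return the (verb, process) pair for a recognized gesture, or None."""
    return GESTURE_ACTIONS.get(gesture)
```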
How we built it
The GUI and ML model were developed separately
Challenges we ran into
Determining how to close applications
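One standard-library approach to this challenge is to build a platform-specific kill command: `taskkill` on Windows and `pkill` elsewhere. This is a sketch of one possible solution, not necessarily the one the project settled on.

```python
# Hedged sketch: close an application by name on Windows or POSIX systems
# using only the standard library. Process names here are illustrative.
import platform
import subprocess

def close_command(process_name: str) -> list:
    """Build the command that would terminate a process by name."""
    if platform.system() == "Windows":
        return ["taskkill", "/F", "/IM", process_name]  # e.g. "notepad.exe"
    return ["pkill", "-f", process_name]                # POSIX: kill by name

def close_application(process_name: str) -> None:
    """Run the kill command; check=False so a missing process isn't an error."""
    subprocess.run(close_command(process_name), check=False)
```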
Accomplishments that we're proud of
- Having only one merge conflict, and it was in requirements.txt
- Making the app cross-platform instead of Windows- or Linux-only
What we learned
How to build a GUI with PySimpleGUI
What's next for Hand gestures
Attaching the GUI to the model and letting the app close processes
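That wiring step could look like the loop below: capture a frame, classify it, and dispatch an action. `capture_frame`, `classify`, and `dispatch` are hypothetical stand-ins for the project's camera, model, and process-control code.

```python
# Hedged sketch: glue loop connecting a camera, a classifier, and a dispatcher.
# All three callables are hypothetical placeholders for the real components.
from typing import Callable

def run_pipeline(capture_frame: Callable[[], object],
                 classify: Callable[[object], str],
                 dispatch: Callable[[str], None],
                 steps: int = 1) -> None:
    """Run the capture -> classify -> dispatch loop for a fixed number of steps."""
    for _ in range(steps):
        frame = capture_frame()
        gesture = classify(frame)
        if gesture != "none":   # ignore frames with no recognized gesture
            dispatch(gesture)
```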
Built With
- pysimplegui
- python
- tensorflow