Inspiration

Advances in computer vision have made body keypoint recognition possible: models can detect specific parts of a human's body and report the X and Y coordinates of each of those body parts. This capability can be applied to projects across many different disciplines.
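For instance, a hand keypoint detector such as OpenPose's hand model reports 21 keypoints per hand as (x, y) pixel coordinates. A minimal, illustrative sketch of how that kind of output can be represented (the values below are placeholders, not real detector output):

```python
import numpy as np

# Hypothetical illustration: a keypoint detector such as OpenPose's hand model
# reports 21 hand keypoints, each as an (x, y) pixel coordinate (often with a
# confidence score as well). The values here are placeholders, not real output.
NUM_KEYPOINTS = 21
hand_keypoints = np.zeros((NUM_KEYPOINTS, 2), dtype=np.float32)
hand_keypoints[0] = [320.0, 240.0]  # e.g. the wrist detected at pixel (320, 240)

# Flattening the (21, 2) array into a 42-value vector is one simple way to
# turn a detected hand into a fixed-size feature for a gesture classifier.
features = hand_keypoints.flatten()
```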

Using this technology, we were interested in seeing how accurately unique hand gestures can be detected, which could eventually lead to a system that transcribes signed languages (e.g. American Sign Language) into writing.

What it does

Recognizes 9 unique hand gestures: Front Fist, Back Fist, Front Peace, Back Peace, Front Palm, Back Palm, Thumbs Up, Thumbs Down, and OK Sign.

What we used

Programming language: Python
Software libraries: OpenPose, TensorFlow Keras, OpenCV, scikit-learn, NumPy

How we built it

First, we built a dataset by recording our own hand gestures, for 9,000 labeled samples in total (1,000 per hand gesture). Second, we trained a CNN using TensorFlow Keras. Lastly, we created a demo application that uses the trained model to predict a person's hand gestures, as sketched below.
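As an illustration, here is a minimal sketch of what the training step could look like, assuming each sample is the 21 OpenPose hand keypoints flattened into 42 (x, y) values and the 9 gestures are encoded as integer labels 0 to 8. The random arrays below are placeholders standing in for the recorded dataset, and the architecture and hyperparameters shown are assumptions for illustration rather than our exact setup.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from tensorflow import keras
from tensorflow.keras import layers

NUM_CLASSES = 9
NUM_KEYPOINTS = 21  # OpenPose hand model

# Placeholder data standing in for the recorded dataset:
# X: (num_samples, 42) flattened keypoint coordinates, y: integer gesture labels.
X = np.random.rand(9000, NUM_KEYPOINTS * 2).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size=9000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# A small 1D CNN over the keypoint sequence (illustrative architecture).
model = keras.Sequential([
    layers.Reshape((NUM_KEYPOINTS, 2), input_shape=(NUM_KEYPOINTS * 2,)),
    layers.Conv1D(32, kernel_size=3, activation="relu"),
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)

model.fit(X_train, y_train, epochs=30, batch_size=32, validation_split=0.1)
print(model.evaluate(X_test, y_test))
model.save("gesture_model.h5")
```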
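And a rough sketch of how the demo application's main loop could be structured, using OpenCV for video capture and the saved Keras model for prediction. The extract_hand_keypoints() helper is a hypothetical stand-in for the OpenPose hand detector, whose setup is omitted here.

```python
import cv2
import numpy as np
from tensorflow import keras

GESTURES = [
    "Front Fist", "Back Fist", "Front Peace", "Back Peace",
    "Front Palm", "Back Palm", "Thumbs Up", "Thumbs Down", "OK Sign",
]

model = keras.models.load_model("gesture_model.h5")


def extract_hand_keypoints(frame):
    """Hypothetical placeholder for the OpenPose hand detector.

    In the real demo this would return 21 (x, y) hand keypoints for the
    current frame, or None if no hand is found.
    """
    return None


cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break

    keypoints = extract_hand_keypoints(frame)
    if keypoints is not None:
        # Flatten the keypoints into the (1, 42) shape the model expects.
        features = np.asarray(keypoints, dtype="float32").reshape(1, -1)
        probs = model.predict(features, verbose=0)[0]
        label = GESTURES[int(np.argmax(probs))]
        cv2.putText(frame, label, (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)

    cv2.imshow("Hand Gesture Recognition", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```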

Challenges we ran into

Limited experience with machine learning.

Accomplishments that we're proud of

Building a comprehensive project in 24 hours. Training a machine learning model that works very well, reaching 96% test accuracy with minimal loss.

What we learned

We learned numerous machine learning and data processing techniques, and gained more experience with Python and its supporting libraries.

What's next for Machine Learning Approach for Hand Gesture Recognition

Exploring other machine learning approaches, such as optical-flow-based models. Also, building a system capable of understanding basic signs in American Sign Language, and combining hand gestures with facial expressions for more complex signs.
