Inspiration

I've been suffering from post-concussive syndrome for a couple of years now, and it has really opened my eyes to how disabled people have to navigate society, so I've really wanted to do something useful with that experience and build something that could actually be _used_.

What it does / How I built it

Hand Gesture Recognizer first finds a green band worn by the signer, then isolates a rectangle around it in which to look for the hand. Using contours, it finds the edge of the hand, and by calculating the derivative of that edge it can count the number of raised fingers.
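
The submission's exact code isn't shown here, so the following is only a rough sketch of that pipeline under some assumptions: the band is found with an HSV colour threshold (the green range is a guess), the hand region is a guessed rectangle above the band, and convexity defects stand in for the "derivative of the edge" step when counting finger gaps.

```python
import cv2
import numpy as np

# Hypothetical HSV range for the green band -- would need tuning per camera and lighting.
GREEN_LOW = np.array([40, 60, 60])
GREEN_HIGH = np.array([80, 255, 255])

def count_fingers(frame):
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

    # 1. Find the green band worn on the wrist.
    band_mask = cv2.inRange(hsv, GREEN_LOW, GREEN_HIGH)
    band_contours, _ = cv2.findContours(band_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not band_contours:
        return 0
    x, y, w, h = cv2.boundingRect(max(band_contours, key=cv2.contourArea))

    # 2. Isolate a rectangle above the band where the hand should be (geometry is a guess).
    roi = frame[max(0, y - 3 * h):y, max(0, x - w):x + 2 * w]
    if roi.size == 0:
        return 0

    # 3. Rough hand segmentation (the write-up doesn't say how; Otsu threshold is a placeholder).
    gray = cv2.GaussianBlur(cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY), (7, 7), 0)
    _, thresh = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)

    # 4. Convexity defects approximate the "derivative of the edge": each deep,
    #    narrow valley between two hull points is treated as a gap between raised fingers.
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    gaps = 0
    for start, end, far, depth in defects[:, 0]:
        a = np.linalg.norm(hand[start][0] - hand[end][0])   # fingertip to fingertip
        b = np.linalg.norm(hand[start][0] - hand[far][0])
        c = np.linalg.norm(hand[end][0] - hand[far][0])
        angle = np.arccos((b ** 2 + c ** 2 - a ** 2) / (2 * b * c + 1e-6))
        if angle < np.pi / 2 and depth > 256 * 20:  # sharp valley at least ~20 px deep
            gaps += 1
    # N gaps between fingers means N + 1 raised fingers (0 vs 1 finger needs extra logic).
    return gaps + 1 if gaps else 0
```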

I built a machine-learning algorithm (logistic regression) so it could learn specific hand signals; however, I did not have the time to train it (about a week :p).
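
The write-up doesn't say which features or library the classifier uses, so this is only a guessed-at skeleton: Hu moments of the hand contour as features and scikit-learn's LogisticRegression as the model, to show where the untrained piece would slot in once labelled examples exist.

```python
import cv2
import numpy as np
from sklearn.linear_model import LogisticRegression

def contour_features(hand_contour):
    """Hypothetical feature vector: log-scaled Hu moments of the hand contour."""
    hu = cv2.HuMoments(cv2.moments(hand_contour)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

def train_sign_classifier(X, y):
    """X: one feature row per labelled frame; y: the sign shown in that frame.
    Collecting and labelling this data is the training time mentioned above."""
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, y)
    return clf

def predict_sign(clf, hand_contour):
    """Classify a new hand contour into one of the learned signs."""
    return clf.predict([contour_features(hand_contour)])[0]
```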

It then speaks the number of raised fingers; hopefully it can later grow into an actual sign-to-speech program.
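
The write-up doesn't name the speech library, so pyttsx3 (an offline text-to-speech package) is assumed here purely as an illustration of the speaking step.

```python
import pyttsx3

def speak_count(n_fingers):
    """Say the detected number of raised fingers out loud."""
    engine = pyttsx3.init()
    engine.say(f"{n_fingers} fingers raised")
    engine.runAndWait()
```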

Challenges I've Faced

Computer vision is an extremely difficult field, and tying machine learning into it was just as hard. It took half the day for the computer to even find my hand and the other half to count my fingers, and it still has trouble finding all of them.

What I learned

I've never used OpenCV before, believe it or not!!

I've learned a lot of OpenCV, and much about myself and the journey I'm on through high school.

What's next for Hand Gesture Recognition for ASL (Team #4)

I'll implement the machine learning algorithm and teach it some signing!!
