Inspiration

For the past 11 months, the world has been in quarantine, and many people have switched from in-person learning and meetings to online classes and conferences. However, distance learning and online conferencing come with several friction points that make the experience less intuitive and more stressful and frustrating. I built the Simple Human Action Processing Engine for Videos (S.H.A.P.E Videos) to address these issues and create a more natural online conferencing experience.

What it does

S.H.A.P.E Videos recognizes hand gestures using OpenCV and maps each gesture to a specific action. Instead of hunting for the unmute button mid-call and losing your train of thought, you show a simple hand gesture to the camera and the program performs the corresponding action: raising the volume, raising your hand, muting, unmuting, and so on, creating a more natural video-call experience.
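To give a feel for the idea, here is a minimal sketch of how a recognized gesture (represented as a finger count) could be dispatched to a conferencing action. The gesture numbers, action names, and the use of pyautogui to send hotkeys are illustrative assumptions, not the exact bindings S.H.A.P.E Videos uses.

```python
# Hypothetical gesture-to-action dispatch; the mappings and hotkeys below are
# illustrative only and are not confirmed to be what S.H.A.P.E Videos does.
import pyautogui  # assumed helper for sending keyboard shortcuts

GESTURE_ACTIONS = {
    1: "toggle_mute",
    2: "raise_hand",
    5: "volume_up",
}

def perform_action(finger_count: int) -> None:
    """Trigger the action associated with a detected finger count, if any."""
    action = GESTURE_ACTIONS.get(finger_count)
    if action == "toggle_mute":
        pyautogui.hotkey("alt", "a")    # e.g. Zoom's mute/unmute shortcut
    elif action == "raise_hand":
        pyautogui.hotkey("alt", "y")    # e.g. Zoom's raise-hand shortcut
    elif action == "volume_up":
        pyautogui.press("volumeup")     # system volume-up key
```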

How I built it

I coded my project in Python using OpenCV, NumPy, imutils, and scikit-learn. OpenCV handles video capture and image manipulation, NumPy handles array and data manipulation, and imutils and scikit-learn provide helper functions.
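As a rough illustration of how these libraries fit together, the sketch below shows a common OpenCV hand-segmentation pipeline: capture a fixed region of interest, build a running-average background model, and threshold the difference to isolate the hand. The ROI coordinates, blur size, and threshold are assumed values for the example, not the project's actual settings.

```python
import cv2
import numpy as np

# Assumed region of interest and parameters; not the project's actual values.
TOP, BOTTOM, LEFT, RIGHT = 10, 300, 350, 640
background = None

def segment_hand(frame, threshold=25):
    """Return the thresholded hand mask and its largest contour, or None."""
    global background
    roi = frame[TOP:BOTTOM, LEFT:RIGHT]
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)

    # The first frames build the background model of the empty ROI.
    if background is None:
        background = gray.astype("float")
        return None
    cv2.accumulateWeighted(gray, background, 0.5)

    # Difference against the background, then threshold to get the hand mask.
    diff = cv2.absdiff(background.astype("uint8"), gray)
    _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return mask, max(contours, key=cv2.contourArea)
```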

Challenges I ran into

The most challenging part of the project was detecting how many fingers were shown in a segmented hand image. I had several ideas for how to do this and went through multiple failed attempts before landing on a method that worked.
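The post does not spell out which method finally worked, but one common way to count fingers from a segmented hand is to take the convex hull of the hand contour, estimate the palm center from the hull's extreme points, and count the contour segments that cross a circle drawn around the palm. The sketch below assumes that approach; the 0.8 radius factor and the wrist/noise heuristics are illustrative.

```python
import cv2
import numpy as np
from sklearn.metrics import pairwise

def count_fingers(thresholded, hand_contour):
    """Estimate extended fingers from a binary hand mask and its largest contour."""
    hull = cv2.convexHull(hand_contour)

    # Extreme points of the hull approximate the palm center.
    top    = tuple(hull[hull[:, :, 1].argmin()][0])
    bottom = tuple(hull[hull[:, :, 1].argmax()][0])
    left   = tuple(hull[hull[:, :, 0].argmin()][0])
    right  = tuple(hull[hull[:, :, 0].argmax()][0])
    cx = int((left[0] + right[0]) // 2)
    cy = int((top[1] + bottom[1]) // 2)

    # Draw a circle around the palm; blobs crossing it are candidate fingers.
    distances = pairwise.euclidean_distances([(cx, cy)], Y=[left, right, top, bottom])[0]
    radius = int(0.8 * distances.max())            # 0.8 is an illustrative scale factor
    circumference = 2 * np.pi * radius
    circular_roi = np.zeros(thresholded.shape[:2], dtype="uint8")
    cv2.circle(circular_roi, (cx, cy), radius, 255, 1)
    circular_roi = cv2.bitwise_and(thresholded, thresholded, mask=circular_roi)

    contours, _ = cv2.findContours(circular_roi, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    fingers = 0
    for c in contours:
        _, y, _, h = cv2.boundingRect(c)
        not_wrist = (cy + cy * 0.25) > (y + h)          # skip the blob at the wrist
        not_noise = c.shape[0] < circumference * 0.25   # skip overly long arcs
        if not_wrist and not_noise:
            fingers += 1
    return fingers
```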

Accomplishments that I’m proud of

  • Learning the concepts the program needed, since contours were complicated to me at first
  • Turning that knowledge into a cohesive program
  • Building a working program that detects hand gestures and improves the online learning experience

What I learned

I learned to perform complicated video transformations using NumPy and OpenCV.

What's next for S.H.A.P.E Videos

  • Track more types of hand gestures
  • Improve recognition accuracy
  • Add real-time ROI refreshing

Built With

  • Python
  • OpenCV
  • NumPy
  • imutils
  • scikit-learn
