Inspiration

Virtual meetings have become an integral part of everyday life. From college lectures to dance tutorials, all kinds of activities are now conducted over online streaming services, and Zoom is one of the most widely used. Rather than sticking to the conventional Zoom GUI, we decided to add some fun to daily Zoom meetings.

What it does

MeetingAI is a computer-vision script that detects hand gestures such as palm, OK, L, and peace. The script runs a gesture-recognition model on the webcam feed and, using the highest-probability prediction, controls the Zoom application running on the computer. It can mute or unmute the user with nothing more than a hand gesture.
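As a rough illustration of the "highest-probability prediction" step, a minimal sketch is shown below; the gesture list, action mapping, and confidence threshold are all hypothetical stand-ins, not the project's actual values:

```python
import numpy as np

GESTURES = ["palm", "ok", "l", "peace", "fist"]  # illustrative class order
ACTIONS = {"ok": "toggle_mute"}                  # gesture -> Zoom action

def pick_action(probs, threshold=0.8):
    """Return the action mapped to the top prediction, or None.

    Acts only when the model is confident, so stray frames between
    gestures do not trigger spurious mute/unmute toggles.
    """
    idx = int(np.argmax(probs))
    if probs[idx] >= threshold:
        return ACTIONS.get(GESTURES[idx])  # None if gesture is unmapped
    return None
```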

How we built it

We used transfer learning, building on a pre-trained deep neural network to detect hand gestures. The model achieved an overall F1 score of 0.74. Input to the model is captured from the computer's webcam with the OpenCV module, and the predicted gesture triggers the corresponding action in the script. For example, detecting the 'OK' gesture makes the Python script mute or unmute the meeting, which is done with the 'pyautogui' module.
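A sketch of the capture-and-control loop under stated assumptions: the Keras-format model file ("gestures.h5"), the 64x64 RGB input size, and the class order below are all illustrative, not the project's actual artifacts. The pyautogui call sends Zoom's real keyboard shortcut for mute/unmute:

```python
import time

import cv2
import numpy as np
import pyautogui
from tensorflow.keras.models import load_model

GESTURES = ["palm", "ok", "l", "peace", "fist"]  # assumed class order
model = load_model("gestures.h5")                # hypothetical model file
cap = cv2.VideoCapture(0)                        # default webcam
last_fire = 0.0

while True:
    grabbed, frame = cap.read()
    if not grabbed:
        break
    # Preprocess to the model's assumed input: 64x64 RGB, scaled to [0, 1].
    rgb = cv2.cvtColor(cv2.resize(frame, (64, 64)), cv2.COLOR_BGR2RGB)
    probs = model.predict(rgb[np.newaxis] / 255.0, verbose=0)[0]
    idx = int(np.argmax(probs))
    # Fire only on a confident 'OK', and rate-limit so a gesture held for
    # several frames does not toggle mute repeatedly.
    if GESTURES[idx] == "ok" and probs[idx] >= 0.8 and time.time() - last_fire > 2.0:
        pyautogui.hotkey("alt", "a")  # Zoom mute/unmute on Windows (Cmd+Shift+A on macOS)
        last_fire = time.time()
    cv2.imshow("MeetingAI", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```

Note that Zoom's keyboard shortcuts apply when the Zoom window has focus unless global shortcuts are enabled in Zoom's settings.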

Challenges we ran into

OpenCV had installation errors on one of the team members' computers, and the lighting in the room caused a few detection errors during testing.

Accomplishments that we're proud of

Using hand gestures to control on-screen buttons has many use cases and advantages, and integrating one of those use cases was a great learning experience for us. Keyboard-less control of on-screen applications has broad potential beyond meetings alone.

What we learned

We learned about real-world applications of OpenCV. We also learned how to control on-screen buttons using Python modules.

What's next for MeetingAI

The next step would be to map additional gestures to the remaining controls on the application.

Built With

opencv
pyautogui
python
