Inspiration

My first inspiration for 'Sentimeter' came from the online lectures I am currently attending owing to the COVID-19 outbreak. I would often see professors and teaching assistants struggling to read the responses of their students. Scanning 50 faces in small boxes is more than any human brain can process at once. This leads to a disconnect between the students and the professor, where the professor is often clueless about whether the students understand the concepts being taught. I realized that the core of this problem lay in the speaker's inability to perceive non-verbal feedback from the audience. The problem exists not only in education but in any setting where one person must interact virtually with a large group.

What it does

Sentimeter bridges the gap in non-verbal feedback, most often conveyed through facial expressions, by giving the speaker or performer a sense of the aggregate emotional state of the room. This restores the lost non-verbal channel and rebuilds the broken chain of communication.

How I built it

I used the opencv-python package to capture frames from a live streaming video. Then, I sent real-time requests to the Google Cloud Vision API to detect faces in the frames and identify key features in each face, including the likelihood of facial expressions (joy, sorrow, anger, surprise). By parsing the response I received from Google Cloud and assigning discrete numerical values to the varying levels of likelihood for each expression, I derived a numerical score for each expression on each face. I aggregated these scores and displayed the results visually as lateral bars in the top-left corner of the screen.
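Roughly, the mapping and aggregation step looks like this. The likelihood labels below are the ones the Cloud Vision API returns for face annotations; the discrete scores and the helper names (`LIKELIHOOD_SCORE`, `aggregate_scores`) are an illustrative sketch, not necessarily the exact values I used:

```python
# Map Vision API likelihood labels to discrete numeric scores
# (the labels are from the API's Likelihood enum; the numbers are illustrative).
LIKELIHOOD_SCORE = {
    "VERY_UNLIKELY": 0,
    "UNLIKELY": 1,
    "POSSIBLE": 2,
    "LIKELY": 3,
    "VERY_LIKELY": 4,
    "UNKNOWN": 0,
}

EXPRESSIONS = ("joy", "sorrow", "anger", "surprise")

def aggregate_scores(faces):
    """Average the per-face scores for each expression across all faces.

    `faces` is a list of parsed face annotations, e.g.
    {"joy": "VERY_LIKELY", "sorrow": "VERY_UNLIKELY", ...}.
    """
    if not faces:
        return {expr: 0.0 for expr in EXPRESSIONS}
    return {
        expr: sum(LIKELIHOOD_SCORE[f[expr]] for f in faces) / len(faces)
        for expr in EXPRESSIONS
    }

# Two hypothetical faces detected in one frame:
faces = [
    {"joy": "VERY_LIKELY", "sorrow": "VERY_UNLIKELY",
     "anger": "VERY_UNLIKELY", "surprise": "POSSIBLE"},
    {"joy": "LIKELY", "sorrow": "UNLIKELY",
     "anger": "VERY_UNLIKELY", "surprise": "VERY_UNLIKELY"},
]
print(aggregate_scores(faces))
# → {'joy': 3.5, 'sorrow': 0.5, 'anger': 0.0, 'surprise': 1.0}
```

The aggregate scores per expression are what drive the lateral bars drawn over the video.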

Challenges I ran into

I had initially planned to train my own program to identify these expressions using an open-source facial database. But upon realizing that Google Cloud would offer much higher accuracy because of the vast resources at its disposal, I switched to using Google Cloud. Since I was using Google Cloud for the first time, it was somewhat challenging to figure out how to create a pipeline for my photos and communicate with Google Cloud. I eventually figured it out by following a Google help guide.
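The pipeline essentially boils down to base64-encoding each JPEG frame into a Vision `images:annotate` request body. A minimal sketch, assuming the public Vision REST API request shape (in the real pipeline, the frame bytes would come from something like `cv2.imencode(".jpg", frame)` rather than the placeholder below):

```python
import base64
import json

def build_face_detection_request(jpeg_bytes, max_faces=50):
    """Build the JSON body for a Cloud Vision images:annotate call.

    The field names follow the public Vision REST API; `jpeg_bytes`
    stands in for a JPEG-encoded webcam frame.
    """
    return {
        "requests": [{
            "image": {"content": base64.b64encode(jpeg_bytes).decode("ascii")},
            "features": [{"type": "FACE_DETECTION", "maxResults": max_faces}],
        }]
    }

# Placeholder bytes with JPEG start/end markers; not a real image.
body = build_face_detection_request(b"\xff\xd8fake-jpeg\xff\xd9")
print(json.dumps(body)[:80])
```

The body is then POSTed to the `images:annotate` endpoint (or handed to the client library), and the response's face annotations carry the per-expression likelihoods used for scoring.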

Accomplishments that I'm proud of

I am proud that I finished my project and its prototype successfully, and that I was able to change a major part of my plan midway while under pressure.

What I learned

I learned about the Google Cloud APIs and how to use them. I also learned about some neat open-source libraries that helped me in crucial parts of my project. On a more abstract note, I learned how perseverance and determination are crucial in seeing a project through to the end.

What's next for Sentimeter

I have big plans for Sentimeter. The biggest are: first, creating a robust UI for Sentimeter, and second, integrating NLP so that it can not only analyze feedback but also predict it.

Built With

python, opencv-python, google-cloud-vision