Inspiration
Having now experienced several semesters of online life, we know how much harder it is for professors and speakers to gauge the feelings of their audience without the usual body language. In response, we have created EnGAUGE.
What it does
EnGAUGE is a web app that provides its user with real-time data about the engagement levels of their audience during any online presentation. It also gives the presenter helpful feedback on their speaking pace so they can adapt live.
How we built it
For the AI sentiment monitoring, we used a pre-trained multi-task cascaded convolutional neural network (MTCNN) to detect faces in the video feed, then classified each face's expression with a second neural network trained on the public FER-2013 dataset, which labels faces with 7 emotion classes. The words-per-minute feature analyses the speaker's audio as they present, transcribing it with the Google Cloud API in Python and counting the words live. Lastly, the web app was built with React.js.
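As a sketch of how the live pacing feedback can be computed, here is a minimal rolling words-per-minute estimator over per-word timestamps of the kind a streaming speech-to-text service returns. The function and parameter names are illustrative assumptions, not EnGAUGE's actual code:

```python
from collections import deque

def rolling_wpm(word_end_times, window=60.0):
    """Estimate speaking pace from per-word end timestamps (seconds).

    word_end_times: sorted timestamps, one per recognized word.
    Returns the rolling words-per-minute after each word, measured
    over the trailing `window` seconds.
    """
    recent = deque()
    rates = []
    for t in word_end_times:
        recent.append(t)
        # Drop words that have fallen out of the trailing window.
        while recent[0] < t - window:
            recent.popleft()
        # Elapsed time actually covered, clamped to at least 1 s so the
        # very first words do not produce a huge spike.
        span = max(min(t, window), 1.0)
        rates.append(len(recent) * 60.0 / span)
    return rates
```

Feeding this one timestamp per recognized word keeps the estimate live: the presenter sees their pace update word by word instead of only at the end of the talk.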
Challenges we ran into
We initially worked on a different project before realizing it was not going to pan out. Starting from scratch much later in the hackathon was tough, but the risk paid off! The Zoom app API was not very user-friendly, so reworking our approach to avoid relying on it was a challenge we are proud of overcoming.
Accomplishments that we're proud of
We managed to fit a lot of data processing in fairly compact code.
What we learned
We developed our Python skills and became much more familiar with AI libraries and techniques.
What's next for EnGAUGE
We would love to implement further features, such as speech notes that auto-scroll as the user presents, or a post-presentation summary of the best and worst moments for engagement. We could also introduce eye-movement monitoring to determine how often attendees are engaged and looking at their screen.
Built With
- convolutional-neural-network
- fer2013
- python
- react.js