E-learning presents a number of challenges for teachers, one of which is keeping students engaged in class. However, it is difficult to measure engagement, let alone encourage it. To address this issue, teachers need a way to gather information on their students’ attention during class. This data can then be used to encourage engagement through gamification elements and address individual students without involving the whole class.
What it does
Teacher's Pet is an open-source web application that lets teachers track student engagement during Zoom calls.
Students run the app alongside their Zoom class; it collects information about whether they're looking at the screen and what emotions they're showing. The teacher can then use this data to measure engagement and gather general feedback, such as how many students showed confusion or happiness throughout class.
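As a sketch of what the client-side app might send for each captured frame: the writeup doesn't specify the request format, so this assumes a JSON body carrying a base64-encoded screenshot, and the field names (`student_id`, `frame`) are hypothetical.

```python
import base64
import json


def build_frame_payload(image_bytes: bytes, student_id: str) -> str:
    """Package raw screenshot bytes as a JSON string suitable for a
    POST request body (field names are illustrative assumptions)."""
    return json.dumps({
        "student_id": student_id,
        # base64-encode so the binary image survives JSON transport
        "frame": base64.b64encode(image_bytes).decode("ascii"),
    })


def read_frame_payload(payload: str) -> tuple[str, bytes]:
    """Server-side inverse: recover the student id and raw image bytes."""
    data = json.loads(payload)
    return data["student_id"], base64.b64decode(data["frame"])
```

The server can decode the bytes back into an image (e.g. with OpenCV's `cv2.imdecode`) before running the analysis described below.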
How we built it
The video-analysis portion was built with a combination of OpenCV, dlib, and Keras. OpenCV and dlib determine whether or not the user is looking at the screen, while Keras performs emotion detection to classify what emotion each user is currently showing. We combined the emotion-analysis code and the eye-tracking code and built a Flask server around them that accepts images as POST requests and returns the emotion and eye-tracking analysis, which let us easily deploy and integrate all of our code. Our client-side app takes a screenshot and sends it to the Flask server as a POST request. On the server side, we experimented with OpenCV on various Zoom screenshots to extract individual participant frames. We later converted this code to work with Flask so that it segments every incoming image from the client, stores the pieces in a bucket or a local folder, and sends them on for further processing via a Redis queue.
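The segmentation step above can be sketched with some simple tile arithmetic. This is a minimal, hedged version: it assumes Zoom's gallery view is a uniform grid (in reality the layout varies, e.g. the last row may be centered when the participant count isn't a perfect multiple), and the function name is ours, not from the project.

```python
import math


def zoom_gallery_tiles(width: int, height: int, n_participants: int):
    """Approximate the (x, y, w, h) bounding box of each participant
    tile in a Zoom-style gallery screenshot, assuming a uniform grid
    of ceil(sqrt(n)) columns."""
    cols = math.ceil(math.sqrt(n_participants))
    rows = math.ceil(n_participants / cols)
    tile_w, tile_h = width // cols, height // rows
    tiles = []
    for i in range(n_participants):
        row, col = divmod(i, cols)
        tiles.append((col * tile_w, row * tile_h, tile_w, tile_h))
    return tiles
```

Each box can then be cropped out of the screenshot with plain NumPy slicing (`frame[y:y+h, x:x+w]`) before the crops are queued for the emotion and eye-tracking models.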
Challenges we ran into
One challenge was that we wanted to use BentoML to deploy our Keras model, but we did not have enough time to finish that component. Additionally, we had trouble finding good training data for detecting higher-level emotions such as confusion.
Accomplishments that we're proud of
We're proud of using Keras and dlib to accurately recognize eye location and gaze direction as well as emotions.
What we learned
We learned how to build ML models that have real use cases as well as how to integrate machine learning models into our larger codebase.
What's next for Teacher's Pet
In the future, we'd like to expand Teacher's Pet to other video-chat clients like Google Hangouts and Skype. Additionally, we would like to expand our training data to detect higher-level emotions.