Inspiration
In digital education and online lectures, presenters often struggle with the absence of non-verbal feedback from their listeners. Such presentations can easily become (extraordinarily) one-sided and boring for both the students and the lecturer. One current solution is to stream live video footage of the lecturer as well as the students. This comes with a lot of problems though (bandwidth, clarity, and the occasional cat in the background). So, what exactly are the problems we run into with the current solutions?
What it doesn't do
Current solutions to enhance online lectures mainly involve activating participants' webcams, so that the lecturer and others can use non-verbal communication cues to adapt to the learners' needs. This might be a viable solution for small seminars, and provided that everyone can still see their hairdresser regularly. But since the corona crisis took that opportunity away and we've all been working from home for a few weeks, we sometimes feel we're in no state to present ourselves in online meetings anymore. Furthermore (but probably less importantly), live video streaming of the participants' homes provides a unique insight into private affairs that some would like to remain private. And even if no concerns are voiced, this solution often proves somewhat unreliable.
What it does
Our application accesses the students' webcams and detects their emotional state (e.g. happiness, boredom, confusion, sleepiness) using deep learning networks. Anonymized data containing each student's emotional state is then transmitted to the lecturer, who sees a dashboard overview and a general mood indication. This gives the lecturer the opportunity to adapt the presentation in real time to the students' state of mind, just as they would in their real-life lectures.
Advantages for lecturers
Good teaching depends heavily on the lecturer's ability to adapt to the audience, e.g. by adjusting talking speed and giving further explanations on the subject. Lecturers not only use explicit feedback from students, such as questions, but also rely on non-verbal communication cues. ClassAct provides this information in a practical, easy-to-use way.
Advantages for students
It is one thing to attend a real-life lecture and a totally different beast to have to live-stream oneself via video during one. We believe (and have experienced) that this can cause feelings of discomfort and unease, which does not exactly facilitate learning and understanding. At the same time, seeing that others also look confused or bored helps us identify with our classmates and feel a little less lost. Our application combines the best of both worlds: e-learning from the comfort of your own home while still feeling connected to your classmates.
What's next for ClassAct
- Implementation and use in a TUM lecture
- Stability tests
- Downloadable datasets synchronized with videos, to enable lecturers to improve their presentations even more
- Context sensitive notifications that help you keep your audience alert and interested
Built With
- css3
- deep-learning-networks
- firebase
- html5
- javascript
- matlab