Inspiration

Online lectures have impacted all of us, and as a team of STEM students we have noticed in particular that lectures have become very monotonous and non-interactive. A recorded lecture is essentially a monologue with a PowerPoint in the background.

We think that any change that makes a lecture more interesting, no matter how minor, would have a positive impact on students' engagement with it. Therefore we came up with the following idea:

What it does

The mere act of a lecturer writing on a blackboard is enough to increase students' engagement: their mirror neurons are activated, which in turn increases their retention of the lecture content.

Our plan is to replicate this in a virtual environment: a figurine representing the lecturer would write out equations as they are loaded from a PowerPoint-like presentation interface. The figurine would have the face of the lecturer "transplanted" onto it, while its hand writes out each equation or text block stroke by stroke as it appears on screen, instead of using basic swipe/appear/fade animations.

How we built it

Most of the project was written in Python, as it was our most familiar language. Vishnu made a LaTeX-to-PNG translator that takes in LaTeX source code and creates a PowerPoint-like text output one PNG at a time, rather than a whole PDF or document at once. Ocean created a PNG-to-animation module that examines the PNG for where the black pixels are and determines a trajectory for the lecturer figurine's hand to "draw" over them, revealing each stroke of each letter as the hand passes over it. Joanna attempted to use MATLAB to extract the speaker's face from the video feed, which would then be attached to the figurine's head as it moves across the virtual lecture hall writing its equations on the blackboard.
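As an illustration of that first step, a minimal sketch of rendering one LaTeX block to its own PNG could look like the following. It uses matplotlib's built-in mathtext renderer; the actual translator in our repo may use a different toolchain, and the function name latex_block_to_png is just for this example.

```python
# Sketch: render a single LaTeX math block to its own PNG "slide".
# Assumes matplotlib's mathtext subset is enough for the expression;
# a full LaTeX toolchain could be substituted for complex input.
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt


def latex_block_to_png(latex: str, path: str) -> None:
    """Draw one math expression, black on white, and save it as a PNG."""
    fig = plt.figure(figsize=(8, 2))
    fig.patch.set_facecolor("white")
    # mathtext expects the expression wrapped in $...$
    fig.text(0.05, 0.5, f"${latex}$", fontsize=28, va="center", color="black")
    fig.savefig(path, dpi=150)
    plt.close(fig)


if __name__ == "__main__":
    latex_block_to_png(r"\int_0^1 x^2\,dx = \frac{1}{3}", "eq1.png")
```

Emitting one PNG per block keeps every equation as an independent frame that the animation module can reveal on its own.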

Challenges we ran into

Unfortunately we didn't have enough time to finish the project, as our workforce was spread too thin working on ideas that didn't pan out; for example, the MATLAB implementation proved futile due to hardware issues. But in the end we created the core component: the animation of text revealed stroke by stroke.

Accomplishments that we're proud of

Over a single night, I (Ocean) came up with a general algorithm that can "draw" any line-based picture, using an idea I came up with on my own: the black pixels are like trees on a plain, and the mask that initially lies on top of them is wiped away much like a wildfire burning across a forest.

A "fire" will be started by igniting the top-left most black pixel, and will spread across to its nearest adjacent or diagonal neighbour by applying a convolution with a 2D square kernel on this filled-in area.

The fire may end up with more than one front as lines in the picture branch out. To accurately simulate handwriting, only one active front can move forward at a time, so one front is chosen as the active front while the rest are declared dormant and will not advance unless the active front dies off.

This gives a very nice emulation of handwriting from a general algorithm, as seen in our videos.
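As an illustration, the core loop could be sketched as follows. The helper name wildfire_order, the Pillow-based image loading, and the SciPy binary dilation (a stand-in for the square-kernel convolution) are assumptions for this example rather than the exact code in our repo.

```python
# Sketch of the "wildfire" reveal: burn across the black (ink) pixels,
# advancing a single active front and waking dormant branches only when
# the active front dies out.
import numpy as np
from PIL import Image
from scipy.ndimage import binary_dilation, label

KERNEL = np.ones((3, 3), dtype=bool)  # adjacent + diagonal neighbours


def wildfire_order(png_path: str, threshold: int = 128):
    """Yield boolean masks, one per frame, revealing the ink stroke by stroke."""
    gray = np.asarray(Image.open(png_path).convert("L"))
    ink = gray < threshold                      # black pixels are the "trees"
    burned = np.zeros_like(ink)

    rows, cols = np.nonzero(ink)
    if rows.size == 0:
        return
    front = np.zeros_like(ink)
    front[rows[0], cols[0]] = True              # ignite the top-left-most ink pixel
    burned |= front
    yield burned.copy()

    while True:
        # Spread the active front to unburned ink neighbours (dilation with a
        # 3x3 square structuring element, i.e. the square-kernel convolution).
        spread = binary_dilation(front, structure=KERNEL) & ink & ~burned
        if not spread.any():
            # Active front died: wake a dormant front next to the burned area,
            # or jump to a fresh, disconnected stroke if nothing is adjacent.
            spread = binary_dilation(burned, structure=KERNEL) & ink & ~burned
            if not spread.any():
                remaining = ink & ~burned
                if not remaining.any():
                    return                      # everything has been revealed
                r, c = np.argwhere(remaining)[0]
                spread = np.zeros_like(ink)
                spread[r, c] = True

        # If the fire branched, keep only one connected piece as the active
        # front; the other pieces stay unburned (dormant) until reached again.
        labels, _ = label(spread, structure=KERNEL)
        front = labels == 1
        burned |= front
        yield burned.copy()
```

Each yielded mask can be composited over the blackboard background to produce one frame of the reveal animation.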

What's next for Animated Lecturing

With more time we can fix the hardware issue that is stopping us from using the camera; then we can animate the lecturer's face onto the figurine as they speak and tap through the presentation slides, and move the figurine's virtual hand over the text to "write" the LaTeX-generated expressions onto the virtual blackboard background.

Did we enjoy it?

Absolutely! It was a fantastic experience to collaborate with strangers and friends alike on an interesting topic. Each of us did our best to contribute to the project, and we have all walked away with no regrets.

Media

More demo files can be found on our GitHub page in the video/ folder, specifically eq4.mp4 to eq6.mp4.
