An Accelerated Learning Solution Platform for Online Classes was developed with a single goal in mind: to help students learn and stay motivated in an online learning environment.

According to a study conducted by Pearson in 2014, more than six in ten college and high school students agreed that tablets and technology help students study more efficiently and perform better in class. Virtual engagement and the use of digital devices for reading, taking notes, and other school-related activities have been on the rise, and are considered a game changer in learning, particularly when it comes to improving student engagement. Now more than ever, we have the opportunity to leverage these platforms to make students feel more connected and focused, and to make distance learning a smoother and more enjoyable experience.



The inspiration for this project came from all of our group members feeling the impact of the Covid-19 pandemic as our classes moved to an online learning environment. Every struggle we mention is one we have personally faced, especially since some of us have ADHD and find it hard to pay attention to long video lectures.

Students around the world who struggle with distance learning, especially those with ADD, ADHD, or difficulty concentrating, stand to benefit greatly from a platform that leverages several technologies to increase engagement, boost attention span and productivity, and track progress over time, making home learning a smoother and more enjoyable experience.

Our purpose is to help students learning from home by providing productivity-boosting technology. Students can add videos directly to our website, pin specific points in a video, write notes directly on the platform, and track their own engagement over time with the help of facial recognition and sentiment analysis. This analysis is then graphed to show the student where they stopped paying attention and to highlight gaps in knowledge.

Project Features & Technologies

  • Webcam-based facial recognition with our very own statistical index to track user engagement. API built with Rust and Rocket, integrated with the Azure Face API.
  • Full workflow to transcribe a YouTube video to text using a single URL as an input, complete with timestamps. Built with C# and integrated with Azure Speech to Text API.
  • Text summarization of a YouTube video transcription to condense lengthy lectures. Built with Python.
  • Keyword highlighting of a summarized video transcription to assist users who want to learn more about topics mentioned in the video. Built with Python and integrated with Azure Text Analytics API.
  • Online workspaces for lectures, complete with video transcriptions, transcription summaries, keyword highlighting, engagement scores, and timestamp-based notetaking.
  • Fully online and accessible through a web browser with no downloadable executables required. Data for facial recognition is not stored and is discarded after calculating engagement scores for a session.
  • Deployment environment managed by Docker.
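The engagement pipeline depends on the head-pose, smile, and emotion attributes returned by the Azure Face API. As a minimal sketch, the snippet below extracts those values from a detection result; the dict mimics the documented `faceAttributes` shape returned when detecting with the `headPose`, `smile`, and `emotion` attributes requested, but the numeric values are invented samples.

```python
# Extract the raw inputs to the engagement index from an Azure Face API
# detection result. `sample_detection` mimics the documented faceAttributes
# shape; the numbers themselves are made-up sample values.
sample_detection = {
    "faceAttributes": {
        "headPose": {"pitch": -4.2, "roll": 1.1, "yaw": 12.5},
        "smile": 0.7,
        "emotion": {
            "anger": 0.0, "contempt": 0.0, "disgust": 0.0, "fear": 0.0,
            "happiness": 0.7, "neutral": 0.25, "sadness": 0.05, "surprise": 0.0,
        },
    }
}

def extract_features(detection):
    """Pull out the four raw values the engagement index consumes."""
    attrs = detection["faceAttributes"]
    return {
        "yaw": attrs["headPose"]["yaw"],
        "pitch": attrs["headPose"]["pitch"],
        "smile": attrs["smile"],
        "neutral": attrs["emotion"]["neutral"],
    }

features = extract_features(sample_detection)
print(features)  # → {'yaw': 12.5, 'pitch': -4.2, 'smile': 0.7, 'neutral': 0.25}
```

Keeping the extraction step separate means the scoring logic never touches raw webcam frames, which is what lets facial-recognition data be discarded after each session.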

Blake's Engagement Index (BEI) - Maths Breakdown

The mathematics behind the engagement index calculated by our facial recognition feature.

BEI combines four features, scaled by individual weights, to produce an engagement score, where y (yaw) and p (pitch) are normalized/adjusted head positions, s (smile) is a normalized smile value, and e (emotion) is a normalized, logarithmic value for emotion.

(Figure: engagement score formula)

Lower bound l and upper bound u are the yaw/pitch values recorded when a user looks at the edges of their screen from the webcam, so someone looking at their screen scores higher than someone who is not. C is a constant that adjusts the sensitivity of the normalized yaw/pitch values.

(Figures: head yaw and head pitch normalization)

The emotion parameter e pays attention to one emotion in particular, "neutral" (n). The assumption is that if you are not very neutral, you must be emotional, and are therefore likely to be engaged. Because the lower end of the neutral range is where emotion shows, more weight is placed there by taking advantage of logarithmic properties.

(Figure: emotion normalization)
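Since the original formula images are not reproduced here, the index can be sketched in code. This is an illustrative reconstruction from the description above, not the exact BEI formula: the weights, the yaw/pitch bounds, the sensitivity constant, and the log base are all assumptions.

```python
import math

# Illustrative reconstruction of Blake's Engagement Index (BEI).
# Weights, bounds, constant C, and the log base are assumed values.
W_YAW, W_PITCH, W_SMILE, W_EMOTION = 0.3, 0.3, 0.2, 0.2
LOWER, UPPER = -30.0, 30.0  # assumed yaw/pitch (degrees) at the screen edges
C = 1.0                     # sensitivity constant for yaw/pitch normalization

def normalize_angle(angle, lower=LOWER, upper=UPPER, c=C):
    """Map a head angle to [0, 1]: 1 when facing the screen center,
    falling to 0 at (or beyond) the screen-edge bounds l and u."""
    if angle <= lower or angle >= upper:
        return 0.0
    center = (upper + lower) / 2.0
    half_range = (upper - lower) / 2.0
    return max(0.0, 1.0 - c * abs(angle - center) / half_range)

def normalize_emotion(neutral):
    """Low 'neutral' values mean the user is emotional, hence engaged.
    A logarithm emphasizes the low end of the neutral range."""
    neutral = min(max(neutral, 1e-6), 1.0)  # clamp to avoid log(0)
    return min(1.0, -math.log10(neutral))

def engagement_score(yaw, pitch, smile, neutral):
    """Weighted combination of the four normalized features."""
    y = normalize_angle(yaw)
    p = normalize_angle(pitch)
    s = max(0.0, min(smile, 1.0))  # Azure's smile value is already in [0, 1]
    e = normalize_emotion(neutral)
    return W_YAW * y + W_PITCH * p + W_SMILE * s + W_EMOTION * e

# Facing the screen, smiling, clearly emotional -> near-maximal score.
print(engagement_score(0.0, 0.0, 1.0, 0.1))
# Looking away, no smile, fully neutral -> zero score.
print(engagement_score(45.0, -45.0, 0.0, 1.0))
```

A linear weighted sum keeps the score interpretable: each feature's contribution is bounded by its weight, so a per-session graph can attribute dips in engagement to specific causes (e.g. head turned away vs. flat affect).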

The Team

  • Liana Severo
  • Joseph Orlando
  • Justin Bang
  • Blake Wyatt

What's Next

  • The goal for this hackathon was to complete the back-end infrastructure that runs the application, which we have successfully done. Next, the front-end requires further development to complete the integration with our APIs.
  • The next step is to build a platform where teachers can upload their own videos and send them directly to their students.

Built With

  • Rust and Rocket
  • C#
  • Python
  • Azure Face API, Azure Speech to Text, and Azure Text Analytics
  • Docker