In the age of digital learning, with its immense advantages of convenience and flexibility, we often forget the importance of basic human interaction. The absence of immediate response and feedback lowers the quality of online courses, limits clarity, and leaves both students and educators dissatisfied. To bridge this gap, we are introducing _seEmotion_: a system that lets learners communicate their level of understanding with minimal effort, through emotion, and helps lecturers reassess their teaching methods. By advocating a more humane approach to the virtual classroom, we hope to support students and teachers in their shared pursuit of high-quality education.

What it does

This project benefits both students and educators, improving the clarity of the material taught in class and ultimately the quality and usefulness of the course. With the student's explicit permission, the plugin captures their facial expression every 5 seconds throughout the lecture and interprets it with Google's Cloud Vision API to gauge their reaction to the presented material. Confusion is detected from the four emotions the API reports - joy, sorrow, anger and surprise - by looking for minimal joy combined with high sorrow, anger and surprise scores. Our algorithm processes this emotional interpretation and outputs a coefficient of the student's confusion, which is shown on a live-updating graph on the instructor's screen, revealing the class's overall level of comprehension.

By using _seEmotion_, a lecturer gains better insight into the quality of their explanation, the adequacy of the learning pace, and whether a concept needs to be repeated for complete mastery. This interactive technology can not only help progressive academic institutions such as Stanford Online High School deliver first-class education to their students, but can also be used in the physical classroom, helping pupils overcome social anxiety and lecturers grow professionally.
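
The scoring idea above can be sketched in a few lines. The Cloud Vision API reports each emotion as a likelihood level (`VERY_UNLIKELY` … `VERY_LIKELY`); the numeric mapping and the exact weighting below are our illustrative assumptions, not the precise seEmotion formula:

```python
# Minimal sketch of the confusion coefficient: low joy combined with
# high sorrow/anger/surprise yields a high score. The 0..1 mapping of
# the Vision API likelihood levels is an assumption for illustration.
LIKELIHOOD = {
    "VERY_UNLIKELY": 0.0,
    "UNLIKELY": 0.25,
    "POSSIBLE": 0.5,
    "LIKELY": 0.75,
    "VERY_LIKELY": 1.0,
}

def confusion_index(joy: str, sorrow: str, anger: str, surprise: str) -> float:
    """Combine the four emotion likelihoods into a 0..1 confusion score."""
    negative = max(LIKELIHOOD[sorrow], LIKELIHOOD[anger], LIKELIHOOD[surprise])
    return round((1.0 - LIKELIHOOD[joy]) * negative, 3)

# A clearly puzzled face: no joy, strong surprise.
print(confusion_index("VERY_UNLIKELY", "POSSIBLE", "UNLIKELY", "VERY_LIKELY"))  # 1.0
```

Scaling the negative emotions by the absence of joy keeps a smiling-but-surprised student from registering as confused.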

How we built it

To tackle the limitations of the remote classroom setting, _seEmotion_ was developed incrementally, with a feasibility check of each individual component. We first set up Google's Cloud Vision API to use its trained face and emotion recognition models. We then built the capture software in Python with the OpenCV library, taking images from an ordinary webcam. From the four emotions the API reports for each facial expression (joy, sorrow, surprise and anger), our algorithm identifies the absence of joy combined with a mixture of the other three, returning a confusion index. At a 5-second interval, the index is graphed with Matplotlib and streamed live to the educator's computer.
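
The graphing side can be sketched as a rolling buffer that the live Matplotlib plot redraws from. Every 5 seconds each student's confusion index arrives; here they are averaged into one class-wide point per round. The class name, window size and use of the standard library are our illustrative assumptions:

```python
from collections import deque
from statistics import mean

WINDOW = 60  # keep the last 60 samples (~5 minutes at one sample / 5 s)

class ConfusionTimeline:
    """Fixed-length history of class-average confusion, ready to plot."""

    def __init__(self, window: int = WINDOW):
        self.points = deque(maxlen=window)  # old samples fall off the left

    def add_sample(self, per_student_indices):
        """Average one round of per-student indices into a class score."""
        self.points.append(mean(per_student_indices))

    def series(self):
        """Return the history as a list, e.g. for ax.plot(timeline.series())."""
        return list(self.points)

timeline = ConfusionTimeline(window=3)
timeline.add_sample([0.25, 0.75])  # class average 0.5
timeline.add_sample([0.5, 1.0])    # 0.75
timeline.add_sample([0.25, 0.25])  # 0.25
timeline.add_sample([0.0, 0.5])    # 0.25 -> oldest point (0.5) drops out
print(timeline.series())  # [0.75, 0.25, 0.25]
```

The bounded `deque` keeps memory constant over a long lecture while still giving the instructor a recent-history view.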

Challenges we ran into

For all of us, this was our first experience working with Google's Cloud Vision API, so integrating it into the software stood out as a big challenge for our team. Throughout the hackathon, the project's development was adjusted for technical practicality and users' convenience: for instance, our idea of a live video stream evolved into continuous analysis of still images taken every few seconds, easing both the system's load and the implementation. We also took an incremental approach at every stage of _seEmotion's_ development, carefully working through pivotal steps such as image analysis with the Python and Matplotlib libraries and the file integration between them, as well as miscellaneous challenges such as operating the camera in real time.
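
The stills-instead-of-livestream decision boils down to a simple sampling loop: one capture per interval rather than a continuous stream. This is a hedged sketch, not the hackathon code; `capture_fn`, the interval and the injectable `sleep` are illustrative:

```python
import time

def sampling_loop(capture_fn, interval=5.0, rounds=3, sleep=time.sleep):
    """Call capture_fn once per interval, collecting its results.

    Passing a custom sleep makes the loop testable without real waiting.
    """
    results = []
    for _ in range(rounds):
        results.append(capture_fn())  # e.g. grab one webcam frame here
        sleep(interval)
    return results

# Example with a stub "camera" and no real waiting:
frames = sampling_loop(lambda: "frame", interval=5.0, rounds=3, sleep=lambda s: None)
print(frames)  # ['frame', 'frame', 'frame']
```

One API call per interval keeps both the network load and the Vision API bill far below what per-frame video analysis would cost.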

Accomplishments that we’re proud of

Despite the initial confusion and shy first steps in _seEmotion's_ development, we are satisfied with our final result, which could be developed into a full plugin for major online course hubs such as edX and Coursera. Our initial lack of knowledge and experience was largely bridged by persistence and enthusiasm throughout the project's development. By choosing such a challenging project, we explored our strengths and weaknesses through extensive collaboration, polishing every issue we encountered and assisting each other at every step.

What we learned

The main insight we gained as programmers during _seEmotion's_ development was in-depth use of the Google Cloud Vision API's services, face and emotion recognition in particular, drawing on the trained AI models generously made available to us. Beyond elevating our Python skills, we gained valuable experience feeding captured image files through our own tailored algorithm and processing its detailed analysis as output. Overall, we greatly enjoyed experimenting with Google's emotion recognition intelligence and the broad stack of Python libraries available to us, and there is no doubt that this technical depth will show in our future projects.

What's next for seEmotion

Although _seEmotion_ is easily deployed on the web, we foresee wider horizons for the project, including a web plugin extension for major educational websites and leading universities that would greatly expand what online classes can offer all over the world. We have also considered extending _seEmotion's_ reach by turning the feature into an app compatible with existing educational platforms, expanding access to interactive remote education in practically any setting. In a wider perspective, we foresee great benefits from integrating seEmotion into the online programs of leading academic institutions, with beneficiaries ranging from Harvard Online Business School to Stanford Online High School.

Built With

- google-cloud-vision
- matplotlib
- opencv
- python