Equal access to education is widely seen as the key to progress; however, the quality of education varies greatly with socioeconomic factors, particularly for primary and secondary students in the UK. This is largely a result of limited classroom resources, which lead to lower-quality teaching from over-worked teachers. In addition, every student has a unique learning style, and some have learning and/or hearing disabilities. Traditional teaching methods do not accommodate these challenges, and large classes only exacerbate the problem, as students who need extra support are likely to be left behind. The goal of our hack was to create an inclusive classroom tool that improves the education experience by providing lesson planning and feedback, and by encouraging student engagement through customisable lesson experiences.
What it does
The teacher's Class.IO web app provides a live transcript of the lesson, visible on the teacher's lesson dashboard. This feature was designed for hearing-impaired students, and it also serves as a general lesson reference for students who did not fully understand a concept when it was first explained. The dashboard also gives the teacher live feedback on the average sentiment of the class as a keyword such as 'confused' or 'happy', helping the teacher identify topics the class has not understood well so they can provide extra resources or improve their explanations accordingly. Teachers can design 3D animations in the animation-making feature and deploy them to the corresponding student apps to explain difficult concepts such as mechanics.
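The exact aggregation behind the dashboard keyword isn't spelled out above; a minimal sketch in Python, assuming Rekognition-style per-face emotion scores (the `face_details` structure and sample values here are illustrative, not our production data):

```python
from collections import defaultdict

def class_sentiment_keyword(face_details):
    """Pick a single keyword summarising the class mood.

    `face_details` mimics the FaceDetails list returned by AWS
    Rekognition's detect_faces call: each face carries a list of
    emotions with a Type (e.g. 'CONFUSED', 'HAPPY') and a Confidence.
    """
    totals = defaultdict(float)
    for face in face_details:
        for emotion in face["Emotions"]:
            totals[emotion["Type"]] += emotion["Confidence"]
    if not totals:
        return "NO FACES DETECTED"
    # The keyword shown on the dashboard is the emotion with the
    # highest summed confidence across all detected students.
    return max(totals, key=totals.get)

faces = [
    {"Emotions": [{"Type": "CONFUSED", "Confidence": 80.0},
                  {"Type": "HAPPY", "Confidence": 20.0}]},
    {"Emotions": [{"Type": "CONFUSED", "Confidence": 55.0},
                  {"Type": "CALM", "Confidence": 45.0}]},
]
print(class_sentiment_keyword(faces))  # CONFUSED
```

Summing confidences rather than counting dominant emotions keeps the keyword stable when faces are detected with low certainty.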
How we built it
The backend of our web app serves as the platform where all of our services connect. It is implemented in Django, with Angular for the frontend. The live transcript uses Google Speech-to-Text, and the results are streamed to the front-end over a TCP connection. The augmented reality simulations were built with Google's AR library. The student app is partially implemented on iOS and Android: the interactive simulations are currently available on Android, while the rest of the app is written in Swift. To compute the average class sentiment, images are captured at regular intervals from a Cisco Meraki camera and passed to AWS Rekognition, which performs facial recognition and sentiment analysis.
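The writeup doesn't describe the wire format for the transcript stream; one common approach for pushing chunks over raw TCP is length-prefixed JSON framing, sketched below (the helper names and message fields are hypothetical, not from our codebase):

```python
import json
import struct

def frame_transcript(text, speaker="teacher"):
    """Serialise one transcript chunk as a 4-byte big-endian length
    prefix followed by a JSON payload, so the dashboard can split
    individual messages out of a continuous TCP byte stream."""
    payload = json.dumps({"speaker": speaker, "text": text}).encode("utf-8")
    return struct.pack("!I", len(payload)) + payload

def read_transcripts(stream):
    """Parse a buffer of framed chunks back into message dicts."""
    messages = []
    offset = 0
    while offset + 4 <= len(stream):
        (length,) = struct.unpack_from("!I", stream, offset)
        offset += 4
        messages.append(json.loads(stream[offset:offset + length]))
        offset += length
    return messages

wire = (frame_transcript("Newton's second law: F = ma")
        + frame_transcript("Any questions?"))
for msg in read_transcripts(wire):
    print(msg["text"])
```

Framing matters here because TCP is a byte stream with no message boundaries; without a length prefix (or a delimiter), two Speech-to-Text results can arrive fused together or split mid-message.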
Challenges we ran into
Accomplishments that we're proud of / What we learned
Creating a complex network of microservices deployed on a web app in just 24 hours!!! Particularly because they each began as entirely separate entities. We used around 10 APIs, so we learnt how to deal with complex data flows. We gained a massive appreciation for cloud computing: it was our first time using AWS and Google Cloud services, which was a very exciting and rewarding experience that allowed us to do a lot of cool stuff. Full-stack development was new to most of our team members, and we're proud to have produced a working demo. But of course, the most important thing we learnt is how to stay up all night without becoming zombies! Kinda.
What's next for Class.IO
What's NOT next for Class.IO?! We jest. The priority would be a fully functional student app that lets students view the subtitles in their own time. The sentiment analysis was intended to be per student, so that a teacher could identify which students are struggling; it would be great to implement this, and also to map the class sentiment onto the transcript so that, when reviewing the lesson, the teacher can easily see which areas were generally understood well and which were not. Furthermore, we think the web app could be adapted for use in developing countries, enabling students to access a higher quality of education through an online class.