Inspiration
Online learning platforms and coding challenges are a great way to improve your skills or learn to code from scratch. They help you master the intricacies of a programming language and even prepare for job interviews. But as we all know, they offer a one-directional learning experience, and students are easily distracted during an online tutorial.
That raises the question: what can we do to boost learning outcomes by providing an interactive learning experience? So we, a team from the Hong Kong Institute of Vocational Education, built "Miss MAH", which uses AWS AI services, a serverless architecture and Amazon Sumerian, and gathers data to understand what students are doing during an online course, providing a bidirectional learning experience.
What it does
"Miss MAH" can be applied to all IT related courses. It is just like a virtual tutor, it can see, can hear, can think also can speak. It provides 3 main features :
- Miss MAH's Eye
- Miss MAH's Brain
- Interactive Miss MAH
Miss MAH's Eye [Hacker]
This is a Python application that runs locally on a student's computer and periodically sends the required information to AWS. It provides the following functions (hedged sketches follow this list):
- Capturing all keyboard and mouse cursor events – ensures students are really working on the exercise, as it is NOT possible to complete a coding task without using the keyboard and pointer!
- Monitoring and controlling PC processes – stops students from running programs that are NOT relevant to the tutorial. We can kill all browser tabs and communication software when a test is taking place.
- Capturing screens – detects videos or content that are NOT appropriate using Amazon Rekognition. The extracted text can trigger an Amazon Sumerian host to talk to a student automatically when certain conditions are met (e.g. watching porn or movies during lab exercises).
- Uploading source code to AWS whenever students save their code – lets us review all participants' performance and advise students who have difficulty following the tutorial exercises.
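A minimal sketch of the event-capture and process-control part of Miss MAH's Eye is shown below. It is illustrative only: it assumes the `pynput`, `psutil` and `boto3` libraries, and the bucket name, payload layout and blocked-process list are placeholders rather than our production values.

```python
"""Minimal sketch of Miss MAH's Eye: count keyboard/mouse activity,
kill non-whitelisted processes, and upload a summary to S3 periodically."""
import json
import time
import boto3
import psutil
from pynput import keyboard, mouse

S3_BUCKET = "miss-mah-activity"              # hypothetical bucket name
BLOCKED = {"chrome.exe", "telegram.exe"}     # processes to kill during a test
counts = {"keys": 0, "clicks": 0}

def on_press(key):                           # count keystrokes (key content not stored here)
    counts["keys"] += 1

def on_click(x, y, button, pressed):         # count mouse clicks
    if pressed:
        counts["clicks"] += 1

def kill_blocked_processes():
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() in BLOCKED:
            proc.kill()

def upload_activity(s3):
    payload = {"ts": int(time.time()), **counts}
    s3.put_object(Bucket=S3_BUCKET,
                  Key=f"activity/{payload['ts']}.json",
                  Body=json.dumps(payload))
    counts.update(keys=0, clicks=0)          # reset counters for the next window

def main():
    s3 = boto3.client("s3")
    keyboard.Listener(on_press=on_press).start()
    mouse.Listener(on_click=on_click).start()
    while True:                              # report every 30 seconds
        kill_blocked_processes()
        upload_activity(s3)
        time.sleep(30)

if __name__ == "__main__":
    main()
```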
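The screen check could be wired up roughly as below: grab a screenshot, ask Amazon Rekognition for moderation labels and on-screen text, and hand the results to the backend. This is a sketch under assumptions (Pillow for the screenshot, an 80% confidence threshold); it is not the exact production pipeline.

```python
"""Sketch of the periodic screen check using Amazon Rekognition."""
import io
import boto3
from PIL import ImageGrab

rekognition = boto3.client("rekognition")

def check_screen():
    # Grab the current screen and compress it so it stays within Rekognition's image byte limit.
    shot = ImageGrab.grab()
    buf = io.BytesIO()
    shot.convert("RGB").save(buf, format="JPEG", quality=70)
    image = {"Bytes": buf.getvalue()}

    # Flag inappropriate content shown on the student's screen.
    moderation = rekognition.detect_moderation_labels(Image=image, MinConfidence=80)
    flagged = [label["Name"] for label in moderation["ModerationLabels"]]

    # Extract on-screen text (e.g. a movie title) so Miss MAH can react to it.
    text = rekognition.detect_text(Image=image)
    words = [d["DetectedText"] for d in text["TextDetections"] if d["Type"] == "WORD"]
    return flagged, words
```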
Miss MAH's Brain [AI]
It collects the captured data and exposes an API to Interactive Miss MAH. Students' programming exercises are marked immediately every time they save their code.
All data is constantly saved to an Amazon S3 data lake, and Amazon Athena can be used to analyse it.
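As an illustration of the kind of analysis this enables, the snippet below runs a hypothetical Athena query over the activity data in the data lake. The database, table and column names are assumptions for the sketch, not our real schema.

```python
import time
import boto3

athena = boto3.client("athena")

# Hypothetical query: average keystrokes per student across the captured activity data.
QUERY = """
SELECT student_id, AVG(keys) AS avg_keystrokes
FROM miss_mah.activity
GROUP BY student_id
"""

def run_query():
    qid = athena.start_query_execution(
        QueryString=QUERY,
        QueryExecutionContext={"Database": "miss_mah"},
        ResultConfiguration={"OutputLocation": "s3://miss-mah-athena-results/"},
    )["QueryExecutionId"]
    # Poll until Athena finishes (a real implementation should also handle FAILED states).
    while athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"] in ("QUEUED", "RUNNING"):
        time.sleep(1)
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
```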
Interactive Miss MAH [MR - Mixed Reality]
This is an Amazon Sumerian application that can alert students to work on their online tutorial exercises.
It sends a camera image to Amazon Rekognition and gets back a student ID (a live camera feed is shown on the left projector screen).
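One way to implement this lookup is Rekognition's face search against a pre-indexed face collection, as sketched below. The collection name and the use of `ExternalImageId` to carry the student ID are assumptions about how it could be wired up, not a description of the exact production setup.

```python
import boto3

rekognition = boto3.client("rekognition")

def identify_student(frame_jpeg_bytes):
    """Return the student ID matched from a single camera frame, or None."""
    resp = rekognition.search_faces_by_image(
        CollectionId="miss-mah-students",   # hypothetical collection of indexed student faces
        Image={"Bytes": frame_jpeg_bytes},
        FaceMatchThreshold=90,
        MaxFaces=1,
    )
    matches = resp["FaceMatches"]
    # ExternalImageId would hold the student ID assigned when the face was indexed.
    return matches[0]["Face"]["ExternalImageId"] if matches else None
```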
A Sumerian host, Linda, uses Amazon Polly to speak to students when the following situations happen (a hedged Polly example follows this list):
- A student appears in the camera view: Miss MAH greets them and takes attendance at the backend.
- A student passes a unit test: Miss MAH says congratulations.
- A student watches a movie: Miss MAH scolds them, naming the movie actor, such as Chow Yun Fat.
- A student watches inappropriate content: Miss MAH scolds them.
- A student does something wrong, such as forgetting to set up the Python interpreter: Miss MAH reminds them to set it up.
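Inside Sumerian the host's speech component invokes Polly for us, but for illustration the equivalent direct Polly call looks roughly like this (the voice and message are example values, not the scene's actual configuration):

```python
import boto3

polly = boto3.client("polly")

def speak(message, voice="Joanna"):
    """Synthesize one of Miss MAH's lines; the Sumerian host normally handles this internally."""
    resp = polly.synthesize_speech(Text=message, VoiceId=voice, OutputFormat="mp3")
    with open("speech.mp3", "wb") as f:
        f.write(resp["AudioStream"].read())

speak("Congratulations, you passed the unit test!")
```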
Students can also ask Miss MAH questions, for example to check their overall class/training progress. The Amazon Sumerian host connects to an Amazon Lex chatbot, and the conversations with students are then saved to DynamoDB together with the sentiment analysis result from Amazon Comprehend (a minimal sketch follows).
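The sketch below shows how a single conversation turn could be scored with Comprehend and stored in DynamoDB; the table name and item layout are assumptions for illustration.

```python
import time
import boto3

comprehend = boto3.client("comprehend")
table = boto3.resource("dynamodb").Table("MissMahConversations")  # hypothetical table name

def save_utterance(student_id, text):
    """Store one conversation turn together with its Comprehend sentiment."""
    sentiment = comprehend.detect_sentiment(Text=text, LanguageCode="en")
    table.put_item(Item={
        "student_id": student_id,
        "timestamp": int(time.time()),
        "text": text,
        "sentiment": sentiment["Sentiment"],   # e.g. POSITIVE / NEGATIVE / NEUTRAL / MIXED
    })
```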
As you can see, a student's screen, as well as his/her face from the front camera, is shown on the projector screens inside the Sumerian application.
Miss MAH: “Stop watching dirty things during the lab!”
The scene is built based on an actual classroom in Hong Kong Institute of Vocational Education (Lee Wai Lee) - Cloud Innovation Centre.
View1 - Real:
View1 - 3D:
View2 - Real:
View2 - 3D:
How we built it
The original scene model was built with Maya.
The following shows a simplified architecture diagram:
Challenges we ran into
Capturing the following user behaviours, in order to get a near-real-time view of every student's activity for analysis:
- Capturing all keyboard and mouse events
- Monitoring and controlling PC processes
- Capturing screens
Projecting a student's screen into Amazon Sumerian.
Making the interaction between users and Miss MAH feel natural, and running sentiment analysis on every student conversation: speech is transformed to text so it can easily be extracted and analysed.
Accomplishments that we're proud of
By combining the power of Amazon Sumerian with other AWS services, students are encouraged to concentrate on their lab exercises. Thanks to the real-time interaction and encouragement from Miss MAH, learning IT becomes much more fun.
What we learned
The integration of Maya models with Amazon Sumerian
Deep Dive into the Amazon Sumerian scene adjustment
What's next for Miss MAH
In future we plan to enhance Miss MAH's movement so that she can move throughout the whole virtual classroom, and to build a machine learning model that predicts students' final grades from their class behaviour.
Built With
- amazon-polly
- amazon-rekognition
- amazon-sumerian
- amazon-comprehend
- api-gateway
- athena
- codecommit
- cognito
- dynamodb
- glue
- kinesis-data-analytics
- kinesis-stream
- lambda
- python
- s3