Maia is a web application that has reinvented remote learning. Instead of hosting online classes through the traditional layout where participants appear in separate boxes, Maia brings the classroom to life by creating live animated versions of teachers and students. This design brings the liveliness and human connectivity of a traditional in-person classroom to virtual classrooms across the country so that students no longer have to sacrifice the quality of their education when they are learning online.
Purpose and Motivation
We are two sisters who were first inspired to create this product by our mom, a high school teacher for the last 30 years. Despite her best efforts, she has been struggling to get through to students during the COVID-19 pandemic. Remote learning creates a dull, impersonal environment, and even as our mom is screaming, singing, and dancing in front of her camera, her students are unengaged and unable to learn.
There are millions of teachers across America facing the same problem as our mom. Researchers estimate that the shift to remote learning caused students to lose about a third of expected learning, equating to roughly $133 billion in losses against the U.S. public education budget. This setback is especially detrimental to primary school students, who are the most vulnerable learners: disruptions to their expected learning now could have a lasting impact on their long-term growth and development. This fact motivated us to build a better remote learning solution, so that the 1.5 billion students who have been impacted by COVID-19 may continue to learn through these unprecedented times.
How this Application Works
Maia is an animated virtual classroom designed for elementary students to enhance engagement and interactivity in online learning. Teachers and students can create customized avatars that will mirror their facial expressions as they speak. The virtual classroom includes a collaborative whiteboard where teachers can write, display slideshows, and selectively allow student participation. Maia is a fully gamified experience, as students are awarded points for participation and attention at the completion of each lesson. Using Maia’s user-friendly dashboard, teachers can manage assignments, schedule lessons, and track student progress.
How this Application was Developed
The first step we took to build our alpha prototype was to research and write the code for the facial tracking and motion capture tool. We implemented this through computer-vision-based, sensorless motion capture, meaning that users need only an ordinary camera and no physical sensors. After the face is detected, prominent facial features are identified using the Histogram of Oriented Gradients (HOG) method, which extracts features from pixel gradients. Once the features are identified on the human face, the corresponding features on the user’s avatar are linked and move synchronously. More information regarding the technical implementation of this project can be found here.
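To make the HOG idea concrete, the sketch below computes a simplified orientation histogram with NumPy. This is only a toy illustration of the feature, not our production detector: a real HOG face detector adds block normalization and a trained sliding-window classifier on top of these per-cell histograms, and the cell size and bin count here are illustrative defaults.

```python
import numpy as np

def hog_cell_histogram(patch, n_bins=9):
    """Histogram of Oriented Gradients for one image cell.

    Gradient orientations are binned over 0-180 degrees, weighted by
    gradient magnitude, then L2-normalized.
    """
    patch = patch.astype(float)
    gy, gx = np.gradient(patch)                      # pixel gradients
    magnitude = np.hypot(gx, gy)
    angle = np.rad2deg(np.arctan2(gy, gx)) % 180.0   # unsigned orientation
    hist, _ = np.histogram(angle, bins=n_bins, range=(0, 180),
                           weights=magnitude)
    norm = np.linalg.norm(hist)
    return hist / norm if norm > 0 else hist

def hog_features(image, cell=8):
    """Concatenate per-cell histograms into one feature vector."""
    h, w = image.shape
    cells = [hog_cell_histogram(image[r:r + cell, c:c + cell])
             for r in range(0, h - cell + 1, cell)
             for c in range(0, w - cell + 1, cell)]
    return np.concatenate(cells)
```

For example, an image whose intensity ramps left to right has purely horizontal gradients, so every cell's histogram concentrates in the first (near-zero-degree) orientation bin.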
Once this tool was working, we choreographed the facial movements of our animated character by pairing the motion capture tool with Adobe Animate. Next, we animated the character’s legs, arms, and torso by meticulously moving each body part through every second of the clip. Although choreographing the character’s body movements was a labor-intensive process for the alpha prototype, the final prototype will automate it using natural language processing (NLP) and sentiment analysis to determine the content and tone of the character’s speech.
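The planned automation step can be sketched as follows. This is a deliberately toy illustration: the word lexicons and animation preset names are placeholders we invented for this example, and the final prototype would use a proper sentiment model rather than word counting.

```python
# Placeholder lexicons and preset names, for illustration only.
POSITIVE = {"great", "good", "excellent", "fun", "correct", "yes"}
NEGATIVE = {"wrong", "bad", "no", "quiet", "stop", "incorrect"}

ANIMATIONS = {
    "positive": "arms_raised_celebrate",
    "negative": "arms_crossed",
    "neutral": "idle_talking",
}

def choose_animation(utterance: str) -> str:
    """Score an utterance against the lexicons and pick a body-animation preset."""
    words = [w.strip(".,!?") for w in utterance.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    tone = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return ANIMATIONS[tone]
```

For instance, "Great job, that is correct!" would trigger the celebratory preset, while a neutral instruction falls back to the idle talking loop.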
After animating the character, the user interface of the web application was designed. The last step required to make the user interface was to design the interactive whiteboard tool. We conducted research regarding how to maintain consistency on a drawing tool when multiple people are simultaneously editing. We also found an open-source API, BigBlueButton, which offers an interactive whiteboard as well as a video conferencing tool. We will use this API in our prototype development.
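One consistency strategy we studied during this research can be sketched as a grow-only set of strokes, a simple state-based CRDT: replicas that merge each other's state converge on the same board regardless of message order. BigBlueButton implements its own synchronization, so this sketch (with hypothetical class and field names) only illustrates the idea, not the API we will use.

```python
class WhiteboardReplica:
    """One client's copy of the shared whiteboard (illustrative sketch)."""

    def __init__(self, client_id):
        self.client_id = client_id
        self.counter = 0
        self.strokes = {}   # (counter, client_id) -> stroke data

    def draw(self, stroke):
        """Record a local stroke under a unique (counter, client) key."""
        self.counter += 1
        self.strokes[(self.counter, self.client_id)] = stroke

    def merge(self, other):
        """Union of stroke sets: commutative, associative, idempotent."""
        self.strokes.update(other.strokes)
        self.counter = max(self.counter, other.counter)

    def render_order(self):
        """Deterministic display order, identical on every replica."""
        return [self.strokes[k] for k in sorted(self.strokes)]
```

Because merging is a set union keyed by unique identifiers, two users drawing simultaneously never overwrite each other, and both replicas render the strokes in the same order once they have exchanged state.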
How to Use this Application
To use this application, teachers and students must all create accounts through Maia’s website. Next, teachers can set up classes by sending a class code out to all of their students and using Maia’s calendar tool to schedule a recurring class time. At the scheduled time, teachers and students log on to Maia’s web application and join the classroom. Teachers can begin writing on the whiteboard or upload a slideshow to present. Students can click their character at any time to raise their hand. Teachers may start a poll, quiz, or collaboration at any time. If the teacher starts a collaboration, students are grouped into breakout rooms where they can have smaller discussions. Once class is over, the teacher can end the meeting, and students receive a pop-up window indicating how many points they earned for their attention and participation. They can use these points to buy accessories for their avatar from Maia’s virtual store. Teachers can use the student dashboard to see an overview of their students’ progress, participation, and attendance in each class.
Difficulties & Challenges
The biggest challenge we faced was our tight constraints on time and labor. Since we had a team of only two people, we quickly realized that we had to narrow our scope and focus on the most important aspects of our design.
Additionally, we ran into several challenges while trying to create the facial tracking and motion capture tool. Although we had some experience writing facial recognition code in the past, capturing motion and translating that motion to an animated figure proved to be difficult. We overcame this challenge by thoroughly researching the technical development of various computer vision algorithms. Beyond online resources, we contacted experts in the field to learn how they created similar projects. Whereas a week ago we had nearly no knowledge of motion tracking beyond facial recognition, today we are close to a working implementation of facial tracking and motion capture in Maia.
Go-to Market Evaluation
Maia’s initial target market is fully online private and charter elementary schools: 38,000 such schools serve 2.75 million students in the United States. We will then expand to fully online public schools, of which there are 99,000 serving 50.1 million students.
We will market this to customers using a “bottom-up” and “top-down” approach simultaneously. The bottom-up approach consists of first targeting teachers by offering a month-long free demo. Once we have a strong following and market validation through the free demo, we will use a top-down approach to begin targeting schools and school districts in regions where teachers are most passionate about the product. We chose this strategy because teachers have a dire need for the product, which will build passionate enthusiasm around it. With a strong base of support, it will be much easier to pitch the product to investors and to eventually sell it to schools.
Our product will be sold at the teacher, school, and school district levels. Teachers can buy access to Maia for their own classes for $200 annually; the individual teacher version will not include any analytics or management features. Subscription fees for schools and school districts follow a tiered model based on the number of students served, designed so that the software costs an average of $40 per student per year.
At $40 per student, the initial target market of 2.75 million students in fully online private and charter elementary schools is worth $110 million in the first year. Selling to 165,000 students, a 6% market capture, would earn $6.6 million. After three years we will expand to fully online public schools, growing the addressable market to roughly $2 billion; selling to 10 million students, a 20% market capture, would earn $400 million.
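The market-sizing figures above follow directly from the $40-per-student price; the short calculation below reproduces them.

```python
PRICE_PER_STUDENT = 40  # dollars per student per year

# Year 1: fully online private and charter elementary schools.
initial_students = 2_750_000
initial_market = initial_students * PRICE_PER_STUDENT     # $110 million
year1_students = 165_000
year1_revenue = year1_students * PRICE_PER_STUDENT        # $6.6 million
year1_capture = year1_students / initial_students         # 6%

# Year 3+: expansion to fully online public schools.
expanded_students = 50_100_000
expanded_market = expanded_students * PRICE_PER_STUDENT   # ~$2 billion
target_students = 10_000_000
target_revenue = target_students * PRICE_PER_STUDENT      # $400 million
target_capture = target_students / expanded_students      # ~20%
```

Note that the roughly $2 billion expanded market is exactly 50.1 million students times $40, i.e. $2.004 billion, and the 20% capture figure is 10 million of 50.1 million students, rounded.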
As we scale, it will be vital to consider the widespread reliability and security of our product. To that end, we have begun interviewing potential future technical co-founders with expertise in site reliability engineering. We will host our service on Amazon Web Services (AWS) and use the Unity Real-Time Development Platform to render the live animations. Overall, ensuring a secure interface with high-quality interactions is of the utmost importance to scaling our product successfully.