Around 1 in 54 children are diagnosed with autism spectrum disorder. Children on the spectrum often have difficulty recognizing social cues such as emotions and expressions. I decided to make an app to help them learn to overcome this challenge.
What it does
Emocia uses a CoreML machine learning model to recognize different emotions. The user points the camera at a partner, who makes different expressions, and tries to guess which emotion the partner is displaying. An AR assistant in the top-right corner gives hints about what the emotion is. This matters because the visual nature of AR tends to spark more curiosity and engagement in children with autism spectrum disorder. Once the user is confident, they select the emotion they believe their partner is showing, and a correct answer earns five points.
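The recognition step described above can be sketched with Apple's Vision framework wrapping a CoreML classifier. This is a minimal illustration, not Emocia's actual code: `EmotionClassifier` is a hypothetical name for the bundled model class, which the write-up does not specify.

```swift
import Vision
import CoreML

// Classify the emotion visible in a camera frame.
// `EmotionClassifier` is an assumed name for the app's CoreML image classifier.
func classifyEmotion(in pixelBuffer: CVPixelBuffer,
                     completion: @escaping (String, Float) -> Void) {
    guard let model = try? VNCoreMLModel(for: EmotionClassifier().model) else { return }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // Take the top classification result: an emotion label and its confidence.
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        completion(top.identifier, top.confidence)
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```

In a real app this would run on each frame from the camera feed, with the resulting label driving the AR assistant's hints.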
Challenges we ran into
The machine learning model I used still needed more training to improve its accuracy. Because of this, it was difficult to pinpoint the right confidence threshold for deciding whether the model had actually detected an emotion. Eventually, I found a threshold at which the app ran reasonably smoothly.
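The thresholding decision can be sketched as a small filter over the model's predictions. The 0.7 cutoff below is purely illustrative; the write-up does not state the final value that worked.

```swift
// Minimum confidence before a prediction is accepted as a real emotion.
// 0.7 is an assumed example value, not the threshold Emocia actually ships with.
let minimumConfidence: Float = 0.7

// Return the top emotion label only when the model is confident enough;
// otherwise return nil, treating the frame as "no emotion detected".
func emotionLabel(for observations: [(label: String, confidence: Float)]) -> String? {
    guard let top = observations.max(by: { $0.confidence < $1.confidence }),
          top.confidence >= minimumConfidence else { return nil }
    return top.label
}
```

Raising the threshold reduces false hints from the assistant at the cost of the app staying silent on ambiguous expressions, which is the trade-off described above.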
Accomplishments that we're proud of
I'm proud of integrating augmented reality with machine learning to enhance the user experience. This was the part that took the longest, from creating the 3D models for the assistant to making them work in augmented reality with ARKit.
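Placing the assistant in the AR view can be sketched with ARKit and SceneKit. This is a hypothetical sketch: `"assistant.scn"` is a placeholder asset name, and the position values are assumptions, since the actual scene setup is not part of this write-up.

```swift
import ARKit
import SceneKit

// Attach the assistant's 3D model so it stays pinned near the corner of the view.
// "assistant.scn" is a placeholder for the app's actual model asset.
func addAssistant(to sceneView: ARSCNView) {
    guard let scene = SCNScene(named: "assistant.scn"),
          let assistant = scene.rootNode.childNodes.first else { return }
    // Offset up and to the right, half a meter in front of the camera (example values).
    assistant.position = SCNVector3(0.15, 0.1, -0.5)
    // Parenting to the camera's pointOfView keeps the assistant fixed
    // in the top-right corner as the device moves.
    sceneView.pointOfView?.addChildNode(assistant)
}
```

Parenting the node to the camera, rather than anchoring it in world space, is one way to keep the assistant visible in the corner regardless of where the user points the device.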
What's next for Emocia
I plan to train the CoreML model on more data to improve its accuracy, and to refine the AR assistant's animations.