Inspiration

Around 1 in 54 children are diagnosed with autism spectrum disorder. Children on the spectrum often have difficulty recognizing social cues such as emotions or expressions.

What it does

Emocia uses a CoreML machine learning model to recognize different emotions. The user points the camera at a partner, who makes different expressions for the user to guess. An AR assistant in the top-right corner gives hints about what the emotion is. This matters because the visual nature of AR draws out more curiosity and engagement from those with autism spectrum disorder. Once the user is confident, they select the emotion they believe it is, and a correct answer earns five points.
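The flow above could be sketched roughly as follows. This is a minimal illustration, not Emocia's actual code: the model class name `EmotionClassifier` and the five-point reward are assumptions based on the description.

```swift
import Vision
import CoreML

// Hypothetical wrapper around the app's CoreML emotion model.
final class EmotionRecognizer {
    private let model: VNCoreMLModel
    private(set) var score = 0

    init() throws {
        // "EmotionClassifier" is a placeholder for the project's .mlmodel class.
        let coreMLModel = try EmotionClassifier(configuration: MLModelConfiguration()).model
        model = try VNCoreMLModel(for: coreMLModel)
    }

    // Classify one camera frame and hand back the top emotion label.
    func classify(_ pixelBuffer: CVPixelBuffer, completion: @escaping (String?) -> Void) {
        let request = VNCoreMLRequest(model: model) { request, _ in
            let top = (request.results as? [VNClassificationObservation])?.first
            completion(top?.identifier)
        }
        try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer).perform([request])
    }

    // Award five points when the user's guess matches the recognized emotion.
    func submitGuess(_ guess: String, actual: String) {
        if guess == actual { score += 5 }
    }
}
```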

Challenges I ran into

The machine learning model I used still needs more training to improve its accuracy, so it was difficult to pinpoint the right confidence threshold for deciding whether an expression counts as a given emotion.
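The thresholding problem comes down to rejecting low-confidence predictions. A minimal sketch, where the 0.7 cutoff is an assumed value rather than the one the app ships with:

```swift
import Vision

// Hypothetical cutoff; the right value depends on how well-trained the model is.
let confidenceThreshold: VNConfidence = 0.7

// Return the top emotion only when the model is confident enough,
// otherwise refuse to label the expression yet.
func bestEmotion(from results: [VNClassificationObservation]) -> String? {
    guard let top = results.first, top.confidence >= confidenceThreshold else {
        return nil
    }
    return top.identifier
}
```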

Accomplishments that I'm proud of

I'm proud of being able to integrate augmented reality with machine learning to enhance the user experience. This was the part that took the longest, from creating the different 3D models for the assistant to making them work in augmented reality with ARKit.
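Placing the assistant so it stays near the top-right corner can be done by attaching its node to the camera's point of view. A rough sketch, assuming a SceneKit-based ARKit setup; the asset name `assistant.scn` and the offsets are illustrative:

```swift
import ARKit
import SceneKit

// Hypothetical helper: load the assistant's 3D model and pin it near the
// top-right of the screen by parenting it to the camera node.
func addAssistant(to sceneView: ARSCNView) {
    guard let scene = SCNScene(named: "assistant.scn"),  // placeholder asset name
          let assistant = scene.rootNode.childNodes.first else { return }
    assistant.scale = SCNVector3(0.05, 0.05, 0.05)
    // Offset right (+x), up (+y), and slightly in front of the camera (-z),
    // so the assistant follows the view instead of sitting in world space.
    assistant.position = SCNVector3(0.15, 0.12, -0.5)
    sceneView.pointOfView?.addChildNode(assistant)
}
```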

What's next for Emocia

I plan on improving the accuracy of the CoreML model so it performs more reliably, as well as improving the animation of the AR assistant.
