In the United States, 1 in 68 children has ASD. Children with autism spectrum disorder (ASD) often find it hard to recognize and regulate emotions, but they can improve these skills with practice. We therefore built HowAmIFeeling AR, an educational AR game that helps children with ASD interpret emotions and thereby understand and respond more appropriately to the people around them.

What it does

HowAmIFeeling AR is an educational game that trains children with autism spectrum disorder (ASD) to recognize emotions. A training session is conducted between a child with ASD and a trainer. In each round of a session, the trainer describes how they are feeling, and based on the trainer's facial expressions and language, the child chooses one emotion from "happy", "sad", "neutral", "surprise", and "angry". The child's selections are compared with the normalized results generated by Microsoft's Face API and Text Analytics API, and the results are sent to a server hosted on a Microsoft Azure Virtual Machine for processing. After a training session, a detailed report analyzing the child's performance is sent to designated email addresses. We implemented the game in AR because children with ASD often have difficulty identifying and focusing on the people communicating with them: when the child turns away, the game prompts them to turn back and face their trainer. The AR environment also makes the game more interactive and fun.
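
To make the round flow concrete, here is a minimal sketch of how a single round could be represented and scored; the Emotion and RoundResult types below are illustrative, not our exact implementation.

```swift
// Illustrative sketch of one training round; type names and fields
// are examples, not the app's actual code.
enum Emotion: String, CaseIterable {
    case happy, sad, neutral, surprise, angry
}

struct RoundResult {
    let childChoice: Emotion
    /// Normalized per-emotion scores combined from the Face and Text Analytics APIs.
    let apiScores: [Emotion: Double]

    /// The emotion the combined APIs rated highest.
    var apiTopEmotion: Emotion? {
        apiScores.max { $0.value < $1.value }?.key
    }

    /// Whether the child's selection agrees with the APIs' top-rated emotion.
    var isMatch: Bool { childChoice == apiTopEmotion }
}
```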

How we built it

Client-side: We built and animated all the models in Maya and exported them to Xcode. Sentiment analysis combines Microsoft's Face API and Text Analytics API through a custom weighting algorithm. Using the Intent API of Microsoft's Language Understanding service (LUIS), we integrated natural-language user commands into the app. Augmented reality is implemented with ARKit, and speech-to-text is handled by Siri.
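
As a rough illustration of the weighting step (reusing the hypothetical Emotion enum from the sketch above), something like the following blends the Face API's per-emotion scores with the Text Analytics sentiment score; the 0.7 face weight and the sentiment-to-emotion mapping here are simplified assumptions, not our production values.

```swift
// Simplified take on the weighting idea: faceWeight and the mapping
// from a single sentiment score onto five emotions are assumptions.
func combinedScores(faceScores: [Emotion: Double],
                    textSentiment: Double,        // 0.0 (negative) to 1.0 (positive)
                    faceWeight: Double = 0.7) -> [Emotion: Double] {
    // Spread the one-dimensional sentiment score across the emotion labels.
    let textScores: [Emotion: Double] = [
        .happy:    textSentiment,
        .sad:      (1 - textSentiment) / 2,
        .angry:    (1 - textSentiment) / 2,
        .neutral:  1 - abs(textSentiment - 0.5) * 2,
        .surprise: 0
    ]
    var blended: [Emotion: Double] = [:]
    for emotion in Emotion.allCases {
        blended[emotion] = faceWeight * (faceScores[emotion] ?? 0)
                         + (1 - faceWeight) * (textScores[emotion] ?? 0)
    }
    return blended
}
```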

Server-side: We hosted a Node.js backend on a Microsoft Azure Virtual Machine instance and used Azure SQL to store historical training data. We wrote algorithms that analyze the data passed from the client, generate an extensive report with text and charts, and send it to designated email addresses through the Microsoft SendGrid Email API.
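
From the client's point of view, reporting a finished session is just a JSON POST to the VM; a minimal Swift sketch follows, where the host, endpoint path, and payload fields are placeholders rather than our real API contract.

```swift
import Foundation

// Placeholder endpoint and payload shape: the real backend contract differs.
func uploadSession(_ rounds: [[String: Any]],
                   completion: @escaping (Bool) -> Void) {
    // Hypothetical Azure VM host.
    guard let url = URL(string: "https://our-azure-vm.example.com/api/sessions") else { return }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(withJSONObject: ["rounds": rounds])

    URLSession.shared.dataTask(with: request) { _, response, _ in
        // Treat HTTP 200 as success.
        completion((response as? HTTPURLResponse)?.statusCode == 200)
    }.resume()
}
```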

Challenges we ran into

Integrating Microsoft APIs in Swift was difficult, since there was no sample code; the process involved a lot of painful testing and debugging of raw HTTP requests. When we tried to integrate the SendGrid Email API, the newest version had no documentation yet and we couldn't find any related code snippets, so we rolled back to an older version. Finally, since we had a front end as well as two back ends, integrating everything and making it all work well together was a challenge in itself.
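
For anyone facing the same integration, a hand-rolled Face API request in Swift looks roughly like the sketch below; the region, key, and error handling are simplified placeholders.

```swift
import Foundation

// Rough shape of a raw Face API request; region and key are placeholders.
func detectEmotion(in imageData: Data,
                   completion: @escaping ([String: Double]?) -> Void) {
    var request = URLRequest(url: URL(string:
        "https://westus.api.cognitive.microsoft.com/face/v1.0/detect?returnFaceAttributes=emotion")!)
    request.httpMethod = "POST"
    request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")
    request.setValue("YOUR-FACE-API-KEY", forHTTPHeaderField: "Ocp-Apim-Subscription-Key")
    request.httpBody = imageData

    URLSession.shared.dataTask(with: request) { data, _, _ in
        // The API returns an array of detected faces; take the first face's
        // per-emotion confidence scores.
        let faces = data.flatMap {
            try? JSONSerialization.jsonObject(with: $0) as? [[String: Any]]
        }
        let attributes = faces?.first?["faceAttributes"] as? [String: Any]
        completion(attributes?["emotion"] as? [String: Double])
    }.resume()
}
```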

Accomplishments that we are proud of

We're proud of how we distributed tasks across every team member and still integrated everything together at the end. Working with Microsoft's APIs in Swift was an interesting experience, and it was our first time making animations in Maya and importing them into ARKit.

What we learned

After working with a comprehensive list of Microsoft APIs, we learned how different services and endpoints connect and work together as a whole. Since each API has its own pros and cons, building this project taught us how to balance the weights of different API results to create a better user experience. We also gained a better understanding of ARKit and the still largely untapped potential of augmented reality.

What's next for HowAmIFeeling

  • User accounts
  • Interactive data reports
  • AR/VR hardware integration such as HoloLens or Oculus/Vive

Built With

  • microsoft-text-analytics-api
  • microsoft-face-api
  • microsoft-language-understanding-api
  • microsoft-azure-sendgrid-email-api
  • microsoft-azure-sql-database-server
  • microsoft-azure-virtual-machine
  • microsoft-sentiment-analysis
  • arkit
  • uikit
  • autodesk
  • siri
  • node.js
  • swift