Aura, teaching emotion recognition through computer vision.
Having grown up with parents in the education field, I was often exposed to some of the difficulties that people with autism encounter. Through further research, our team learned about alexithymia, a condition that inhibits a person's ability to recognize and describe emotions and affects roughly 10% of the population. Research showed that people with alexithymia can learn to understand emotions, but they often aren't comfortable with eye contact. We wanted to create an experience that could help children with alexithymia learn about and understand emotion.
What it does
Aura is a clean, modern web application that is easy for users to learn and navigate. We drew on our team's interaction design experience to create a UI that is easy to digest. The app lets users capture a picture of someone with their camera, then presents a sentiment analysis of the subject through a friendly-looking mascot. With Aura, users can explore emotions and learn the intricacies of facial expressions.
How we built it
We planned, designed, and prototyped our idea, using research to support our decisions. Once we were happy with the design, we moved on to the development phase.
We created a website to serve as our proof of concept. We built the front end with HTML, CSS, and JavaScript, and used JavaScript to connect the site to the Google Cloud Vision API.
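A minimal sketch of how the connection might look. The endpoint and the face-annotation likelihood fields come from the Cloud Vision REST API; the helper names (`buildVisionRequest`, `summarizeEmotions`, `analyzeFace`) and the API-key handling are illustrative, not Aura's actual code:

```javascript
// Build a Cloud Vision images:annotate request body asking for face detection.
function buildVisionRequest(base64Image) {
  return {
    requests: [
      {
        image: { content: base64Image },
        features: [{ type: "FACE_DETECTION", maxResults: 1 }],
      },
    ],
  };
}

// Cloud Vision reports emotions as likelihood strings (e.g. "VERY_LIKELY");
// collapse them into booleans the mascot can present.
function summarizeEmotions(faceAnnotation) {
  const positive = ["POSSIBLE", "LIKELY", "VERY_LIKELY"];
  return {
    joy: positive.includes(faceAnnotation.joyLikelihood),
    sorrow: positive.includes(faceAnnotation.sorrowLikelihood),
    anger: positive.includes(faceAnnotation.angerLikelihood),
    surprise: positive.includes(faceAnnotation.surpriseLikelihood),
  };
}

// Send the request and summarize the first detected face.
async function analyzeFace(base64Image, apiKey) {
  const res = await fetch(
    `https://vision.googleapis.com/v1/images:annotate?key=${apiKey}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(buildVisionRequest(base64Image)),
    }
  );
  const data = await res.json();
  return summarizeEmotions(data.responses[0].faceAnnotations[0]);
}
```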
Challenges we ran into
We ran into many development-related challenges during this event. The main one was our team's lack of JavaScript experience: we had trouble figuring out how to call the API, and again when we attempted to send pictures taken by our application to an external server.
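Part of the picture-sending hurdle comes down to format: `canvas.toDataURL()` produces a data URL, while an API like Cloud Vision expects only the raw base64 payload. A sketch of the conversion, assuming a browser capture flow (`dataUrlToBase64` is an illustrative helper name):

```javascript
// canvas.toDataURL() returns "data:image/png;base64,<payload>"; strip the
// prefix so only the raw base64 payload is sent to the server.
function dataUrlToBase64(dataUrl) {
  const commaIndex = dataUrl.indexOf(",");
  return dataUrl.slice(commaIndex + 1);
}

// In the browser, a webcam frame could be captured roughly like this:
//   const video = document.querySelector("video");
//   video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
//   const canvas = document.createElement("canvas");
//   canvas.width = video.videoWidth;
//   canvas.height = video.videoHeight;
//   canvas.getContext("2d").drawImage(video, 0, 0);
//   const base64 = dataUrlToBase64(canvas.toDataURL("image/png"));
```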
We also tried training our own CNN using the VGG-16 model pre-trained on the ImageNet dataset, but the training was computationally expensive and not feasible within the timeframe of the hackathon.
Accomplishments that we're proud of
We are proud of our entire team's effort in this hackathon. We overcame significant obstacles involving languages we were inexperienced with, and we are very proud to be able to show off a live working demo of our solution.
What we learned
We learned about Google Cloud Platform and how to push, pull, and parse data using APIs, JavaScript, and JSON. We learned a lot about front-end development and back-end JavaScript development. Along the way, we also learned about training machine learning models and neural networks using Python, and how we could improve our process with frameworks such as Vue.js.
What's next for Aura
We have many plans for Aura. Accessibility is something we share a passion for, so we would like to add options to increase legibility, and possibly add screen-reading support for users who can't read.
Another feature we are considering is an emotion-guessing mini-game built around pre-composed groups of photos. Users would be shown a photo and asked which emotions are being displayed in it, helping them learn to differentiate emotions and the facial expressions that go along with them.