Inspiration
There are no applications that fully utilize the Microsoft Cognitive Emotion/Face APIs for qualitative and quantitative analytics in education and healthcare. For example, suppose you are a doctor: knowing your patients' moods could help you better serve and monitor their well-being. Additionally, teachers could use this technology to analyze students' emotions and facial expressions to help identify a lack of progress in certain students, whether from disabilities or problems outside of school. These APIs can have a powerful, positive impact in both of these fields, which work to enhance our everyday lives.
What it does
Everything but nothing. Let's say there is a child with a learning disability such as autism; this application can help teachers identify children who aren't engaged because of their disability.
How we built it
Utsav designed the UX and wireframes using Adobe Illustrator. The front ends of the apps were built natively for Android and iOS. The back end was built using Firebase and Cloudinary. Hammers, nails, coffee.
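The image pipeline described above — capture a photo on the device, host it through Cloudinary, then hand its URL to the analysis API — can be sketched as below. This is a hedged illustration, not the project's actual code: the cloud name "demo-cloud" and the preset "unsigned_preset" are placeholder values.

```swift
import Foundation

// Hedged sketch of the upload step: the client posts a captured frame to
// Cloudinary's unsigned-upload endpoint, and the returned public URL is later
// passed to the emotion-analysis API. Cloud name and preset are placeholders.

/// Builds the Cloudinary unsigned-upload endpoint for a given cloud name.
func cloudinaryUploadURL(cloudName: String) -> String {
    return "https://api.cloudinary.com/v1_1/\(cloudName)/image/upload"
}

/// Form fields for an unsigned upload; the preset must be enabled in the
/// Cloudinary dashboard beforehand.
func cloudinaryUploadFields(preset: String, imageDataBase64: String) -> [String: String] {
    return [
        "upload_preset": preset,
        "file": "data:image/jpeg;base64,\(imageDataBase64)"
    ]
}

let uploadURL = cloudinaryUploadURL(cloudName: "demo-cloud")
let uploadFields = cloudinaryUploadFields(preset: "unsigned_preset",
                                          imageDataBase64: "AAAA")
```

An unsigned upload keeps the API secret off the mobile client, which matters when the front end is a native app rather than a trusted server.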
Challenges we ran into
The Microsoft APIs didn't support Swift. The documentation for the Microsoft Cognitive Services APIs lacked detail. Cloudinary didn't directly support Swift either. Setting up an iOS application from scratch.
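Without an official Swift SDK, the Emotion API can still be reached as a plain REST endpoint from Swift. A minimal sketch of building that request, assuming the `westus` regional host and a JSON image-URL payload; the subscription key and image URL below are illustrative placeholders:

```swift
import Foundation

// Hedged sketch: with no Swift SDK available, the Emotion API is called as a
// bare REST endpoint. The westus host, key, and image URL are placeholders.

struct EmotionRequest {
    let urlString: String
    let headers: [String: String]
    let body: String
}

/// Builds a POST request against the v1.0 recognize endpoint, passing an
/// image URL (e.g. the Cloudinary URL from the upload step) as JSON.
func makeEmotionRequest(subscriptionKey: String, imageURL: String) -> EmotionRequest {
    return EmotionRequest(
        urlString: "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize",
        headers: [
            "Ocp-Apim-Subscription-Key": subscriptionKey,
            "Content-Type": "application/json"
        ],
        body: "{\"url\": \"\(imageURL)\"}"
    )
}

let request = makeEmotionRequest(subscriptionKey: "YOUR_KEY",
                                 imageURL: "https://res.cloudinary.com/demo/face.jpg")
```

In an app, this would be fed to a `URLSession` data task; the response is a JSON array of faces, each with per-emotion confidence scores.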
Accomplishments that we're proud of
The number of great ideas we were able to generate. Our abundance of creativity and vision was almost a hindrance.
What we learned
We have enough ideas for 20 teams. How to work with APIs across many different languages and frameworks.
What's next for EmotionAnalyzer
Series B funding. More caffeine.