We wanted to work with some of the new and unfamiliar hardware made available to us, so we chose Google Glass. Emolytics is our way of surfacing analytics about the sentiments of the people around you on a much more accessible platform. The application can be used on a small scale to determine how a conversational partner is feeling, or on a larger scale to gauge the general mood of a group for later analysis in our browser application.
What it does
Emolytics is an application built for Google Glass that can take a picture of somebody and, within seconds, provide an emotional analysis of that person. It is paired with a web app that visually displays the most common emotions captured recently.
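The "most common emotions" view boils down to a tally over recently stored results. A minimal sketch of that aggregation, assuming each stored record has an `emotion` string field (`tallyEmotions` and the record shape are hypothetical, not our actual schema):

```javascript
// Minimal sketch of the web app's aggregation step, assuming each record
// stored by the backend carries an `emotion` string (hypothetical schema).
function tallyEmotions(records) {
  const counts = {};
  for (const { emotion } of records) {
    counts[emotion] = (counts[emotion] || 0) + 1;
  }
  // Sort descending by count so the most common emotion comes first.
  return Object.entries(counts).sort((a, b) => b[1] - a[1]);
}

console.log(tallyEmotions([
  { emotion: 'happiness' }, { emotion: 'neutral' }, { emotion: 'happiness' },
]));
// [ [ 'happiness', 2 ], [ 'neutral', 1 ] ]
```

In practice this kind of grouping could also be pushed into a MongoDB aggregation pipeline rather than done in application code.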
How we built it
We built the application with Google's Glass Development Kit (GDK) in Android Studio. For emotional analysis, we used Microsoft's Cognitive Services API. The backend is written in Node.js with Express, with MongoDB storing persistent data, and is hosted on Microsoft Azure.
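The analysis step on the server reduces to picking the highest-confidence emotion from the API's response. A sketch under the Cognitive Services Emotion API's documented response shape (an array of detected faces, each with a `scores` object mapping emotion names to confidences); `dominantEmotion` is a hypothetical helper name, not part of the API:

```javascript
// Pick the highest-scoring emotion for one detected face, assuming the
// Cognitive Services Emotion API response shape: each face carries a
// `scores` object mapping emotion names to confidence values in [0, 1].
function dominantEmotion(face) {
  return Object.entries(face.scores)
    .reduce((best, cur) => (cur[1] > best[1] ? cur : best))[0];
}

// Example face entry in the shape the API returns:
const face = {
  faceRectangle: { left: 68, top: 97, width: 64, height: 64 },
  scores: { anger: 0.01, contempt: 0.0, disgust: 0.0, fear: 0.0,
            happiness: 0.93, neutral: 0.05, sadness: 0.01, surprise: 0.0 },
};

console.log(dominantEmotion(face)); // "happiness"
```

The actual request carries the JPEG bytes from Glass with the subscription key in a header; only the winning label and timestamp need to be persisted in MongoDB for the web app to query.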
Challenges we ran into
The round-trip time between sending a picture and receiving a response from Microsoft's API was initially far too long. Getting a working debugging setup in Android Studio was also difficult.
Accomplishments that we're proud of
Learning a brand-new piece of hardware with very little Android Studio experience.
What we learned
How to host a backend on Microsoft Azure, how to use Microsoft's Cognitive Services API, and how to develop applications for Google Glass.
What's next for Emolytics
Live, continuous image analysis, and data analytics over the emotions captured in the database.