Emotions are a vital part of our lives. Being aware of the emotions of the people we interact with shapes our everyday social communication. Facial expressions carry much of this emotional information, and without vision they cannot be read.

Our mission is to remove this barrier for people with vision loss. To make this possible, we developed a new application that combines the Google Glass camera with Affectiva's Emotion Aware technology. The app is activated by a voice command to Google Glass. Video frames from the Glass camera are passed to the application, where emotions are detected with the Affectiva SDK. The detected emotions are then whispered into the blind person's ear through Google Glass.
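The core of the pipeline described above is simple: take the per-emotion confidence scores the detector reports for a face, pick the dominant emotion above some threshold, and turn it into a phrase for text-to-speech. The sketch below illustrates that selection step only; the function names, the score scale (0 to 100, as Affectiva-style detectors commonly use), and the threshold are our assumptions, not the Affectiva SDK's actual API.

```python
def dominant_emotion(scores, threshold=50.0):
    """Pick the strongest emotion if it clears the threshold, else None.

    `scores` maps emotion names to confidence values on an assumed
    0-100 scale, like the per-emotion metrics a face detector reports.
    """
    if not scores:
        return None
    name, value = max(scores.items(), key=lambda kv: kv[1])
    return name if value >= threshold else None


def phrase_for(emotion):
    """Build the sentence to be whispered through the Glass speaker."""
    if emotion is None:
        return None
    return f"The person in front of you seems to feel {emotion}."


# Example scores for one detected face (hypothetical values).
scores = {"joy": 87.0, "anger": 3.0, "surprise": 12.0}
print(phrase_for(dominant_emotion(scores)))
# -> The person in front of you seems to feel joy.
```

In the real app this phrase would be handed to the text-to-speech engine; thresholding keeps the wearer from being read low-confidence noise on every frame.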

As a future plan, we could combine this system with Pavlok's haptic feedback to convey the intensity of the detected emotions. Another direction is to use BeyondVerbal's voice-analysis technology to enable emotion awareness for deaf people.
