While we were talking about how we could use the available APIs to build something interesting, we stumbled on Microsoft's Cognitive Services Emotion API and thought we could use it to enrich the web experience (videos, chat, and pretty much everything else).
What it does
Our product detects facial emotion during chat and video, allowing for a richer online experience.
Text is often criticized as impersonal. We set out to change that by detecting spontaneous emotional reactions to chat messages. Reactions are colour coded, giving senders direct feedback on how their friends responded.
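As an illustration, the colour coding could be as simple as picking the highest-scoring emotion from the API's per-face score object and mapping it to a colour. The field names below follow the Emotion API's documented emotion categories; the palette and the helper itself are hypothetical, not our exact implementation:

```javascript
// Hypothetical colour palette for the dominant detected emotion.
const EMOTION_COLOURS = {
  happiness: '#f1c40f', // yellow
  sadness: '#3498db',   // blue
  anger: '#e74c3c',     // red
  surprise: '#9b59b6',  // purple
  neutral: '#95a5a6'    // grey
};

// Pick the highest-scoring known emotion and return its colour.
function colourForScores(scores) {
  let best = 'neutral';
  let bestScore = -Infinity;
  for (const [emotion, score] of Object.entries(scores)) {
    if (score > bestScore && EMOTION_COLOURS[emotion]) {
      best = emotion;
      bestScore = score;
    }
  }
  return EMOTION_COLOURS[best];
}
```

The chat UI would then tint each message bubble with the colour returned for the reader's reaction.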
Our video plugin for Chrome detects emotions and associates them with the user's progress through videos online. This gives content creators feedback about which parts of their videos are interesting, funny, or controversial.
How we built it
The video plugin was written as a Chrome extension. This lets it access videos running in the user's tabs and interface with the user's webcam, so emotion capture can run in parallel with video playback.
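One detail worth sketching is how capture can run alongside playback without firing on every frame: sample the webcam at a fixed interval and tag each frame with the video's current playback position. The helper below is an assumed design, not our exact code; the webcam/canvas plumbing it would plug into is only described in the comments:

```javascript
// Assumed design: a rate limiter that decides when the next webcam frame
// should be grabbed (e.g. via getUserMedia + a canvas in the extension's
// content script) while the page's <video> element plays.
function makeSampler(intervalMs) {
  let lastSampleMs = -Infinity;
  return function shouldSample(nowMs) {
    if (nowMs - lastSampleMs >= intervalMs) {
      lastSampleMs = nowMs;
      return true; // capture a frame and record video.currentTime
    }
    return false; // too soon since the last capture
  };
}
```

Keeping this decision in a pure function makes the capture loop easy to test independently of the browser APIs.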
The chat application was built with Node.js (plus HTML and CSS) and hosted on Cloud9. We used Socket.IO for real-time messaging between users and Bootstrap for aesthetics.
Challenges we ran into
The two significant challenges we ran into were sending data to Microsoft in the right format and deciding when to trigger emotion-recognition calls.
Accomplishments that we're proud of
We managed to correlate video timestamps with the user's real-time emotions. We also engineered our own chat interface from scratch to allow integration with the Emotion API.
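The timestamp correlation can be sketched as bucketing (timestamp, emotion) samples into fixed-width video segments and reporting the dominant emotion per segment; this is a hypothetical reconstruction of the idea, not our exact code:

```javascript
// Hypothetical sketch: given samples like { timestampSec, emotion },
// bucket them into segments of segmentSec and return the most frequent
// emotion per segment index.
function dominantEmotionBySegment(samples, segmentSec) {
  const counts = new Map(); // segment index -> { emotion -> count }
  for (const { timestampSec, emotion } of samples) {
    const seg = Math.floor(timestampSec / segmentSec);
    if (!counts.has(seg)) counts.set(seg, {});
    const c = counts.get(seg);
    c[emotion] = (c[emotion] || 0) + 1;
  }
  const result = {};
  for (const [seg, c] of counts) {
    // Sort emotions by descending count and keep the winner.
    result[seg] = Object.entries(c).sort((a, b) => b[1] - a[1])[0][0];
  }
  return result;
}
```

A per-segment summary like this is what a creator dashboard could plot against the video timeline.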
What we learned
This hackathon was a very good exercise in using a machine learning API: we learnt the value of building on existing technologies like the Microsoft Cognitive suite to create fun and promising applications.
What's next for VEmotion
We all liked the app we developed and think it expresses our views on how web experiences can be improved.