What does it mean to be human? What differentiates humans from objects? How do we build better relationships for better shared experiences? Why don't computers learn who we are over time and invest the same energy in those relationships that we invest in people and objects?
What it does
Eva learns who you are and keeps a "memory bank" of experiences, sentimental history, and visual, audio, and text data from the user. This information helps objects in the IoT space work with you, understand your identity, and proactively adapt based on this "shared" history of you, which is built up by everyone involved.
Eva works by listening to your speech, converting it to text, extracting the sentiment behind it, and saving the result to your "shared" profile.
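The speech step above can be sketched roughly like this. The helper name `addSpeechMemory` and the profile shape are illustrative assumptions, not Eva's actual code; in the real app the transcript would come from Watson Speech to Text and the sentiment score from the Alchemy sentiment service, which are omitted here.

```javascript
// Sketch: transcript -> sentiment -> entry in the "shared" profile.
// addSpeechMemory and the profile/memory shapes are assumptions for illustration.
function addSpeechMemory(profile, transcript, sentiment) {
  profile.memories.push({
    type: 'speech',
    text: transcript,
    sentiment: sentiment,   // e.g. { label: 'negative', score: -0.6 } from a sentiment API
    timestamp: Date.now()
  });
  return profile;
}

// Example: record one spoken memory for a demo user.
const profile = { userId: 'demo-user', memories: [] };
addSpeechMemory(profile, 'I had a rough day at work', { label: 'negative', score: -0.6 });
```

Each utterance becomes one timestamped entry, so the profile accumulates a sentimental history rather than only the latest mood.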
Eva also captures data from video sources, predicts how you feel from what it sees, and saves this to your "shared" profile.
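An emotion-recognition API of this kind typically returns per-emotion confidence scores for each face. A minimal sketch, assuming a scores object like `{ happiness: 0.8, sadness: 0.1, ... }`, of picking the dominant emotion to store in the profile (the helper name is hypothetical):

```javascript
// Sketch: pick the highest-scoring emotion from API-style per-emotion scores.
// The scores object shape is an assumption modeled on typical emotion APIs.
function dominantEmotion(scores) {
  return Object.entries(scores)
    .reduce((best, cur) => (cur[1] > best[1] ? cur : best))[0];
}

dominantEmotion({ happiness: 0.05, sadness: 0.82, neutral: 0.13 }); // 'sadness'
```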
A Jilia board connected to the Raspberry Pi receives this information and makes changes in the physical environment as a result. For example, if you're sad, a red LED lights up on the board, and a music service could be connected to play uplifting music. Physical objects now understand and develop an "Emotional Intelligence" based on this profile, and it can be carried forward to any number of tasks by IoT devices.
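The mood-to-action step could be a simple lookup table like the sketch below. The device names, colors, and playlist are assumptions taken from the LED/music example above, not the actual board wiring:

```javascript
// Sketch: map the profile's current mood to physical actions for IoT devices.
// Device names and the default color are illustrative assumptions.
function actionsForMood(mood) {
  const table = {
    sad: [
      { device: 'led', color: 'red' },               // red LED on the board
      { device: 'music', playlist: 'uplifting' }     // optional music service
    ],
    happy: [{ device: 'led', color: 'green' }]
  };
  return table[mood] || [{ device: 'led', color: 'white' }]; // neutral fallback
}
```

Keeping the mapping in one table makes it easy for any device on the profile to add its own reaction to a mood.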
How we built it
We built the software with Node.js tied to Watson, running on Bluemix. We use webcam.js to capture a history of visual data and the Microsoft Emotion API to score the sentiment of each picture. We also use Watson Speech to Text combined with the Alchemy API for speech sentiment data. Everything is pushed to Cloudant to create a "shared" profile of the individual across the platform.
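Since Cloudant stores plain JSON documents, the "shared" profile could look something like the sketch below; the field names and `_id` scheme are assumptions, and the actual insert call (e.g. via a CouchDB/Cloudant client's document-insert method) is omitted:

```javascript
// Sketch: build the "shared" profile document that gets pushed to Cloudant.
// Field names and the _id convention are illustrative assumptions.
function buildProfileDoc(userId, speechMemories, visualMemories) {
  return {
    _id: 'profile:' + userId,          // one document per user across the platform
    type: 'shared-profile',
    speech: speechMemories,            // entries from the speech pipeline
    visual: visualMemories,            // entries from the webcam pipeline
    updatedAt: new Date().toISOString()
  };
}

const doc = buildProfileDoc('demo-user', [{ text: 'rough day', label: 'negative' }], []);
```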
Challenges we ran into
Deploying an app that combines this many services and APIs is difficult; scaling it, and moving data between the pieces of the system, can be tricky.
Accomplishments that we're proud of
A combined method for giving computers Emotional Intelligence.
What we learned
What's next for Eva