We wanted to show off the new capabilities of computers to detect human emotion.

What it does

It interfaces with the IBM Watson and Microsoft Project Oxford APIs to detect facial expressions in images and emotional tone in sentences.
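As a rough illustration, calling a text-emotion service like Watson's Tone Analyzer boils down to assembling an authenticated HTTP request. The sketch below builds such a request in Node.js; the endpoint path, version date, and auth scheme are assumptions for illustration, not verified project values.

```javascript
// Sketch of building a request to a Watson-style tone endpoint.
// The URL, version date, and "apikey" Basic-auth convention are assumptions.
function buildToneRequest(text, apiKey) {
  return {
    url: 'https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone?version=2017-09-21',
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Watson services commonly used HTTP Basic auth with "apikey" as the username.
      'Authorization': 'Basic ' + Buffer.from('apikey:' + apiKey).toString('base64')
    },
    body: JSON.stringify({ text: text })
  };
}

const req = buildToneRequest('I am thrilled with this demo!', 'MY_KEY');
```

The returned object can be handed to any HTTP client (XMLHttpRequest in the PhoneGap webview, `https` in Node).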

How we built it

We developed with PhoneGap so we could build for the web browser, iOS, and Android all at once from a single codebase.
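A PhoneGap project of this shape is driven by a single config.xml that declares the app and the native plugins it needs. The fragment below is a minimal sketch; the app id, name, and plugin choice are illustrative assumptions, not the project's actual values.

```xml
<!-- Minimal PhoneGap config.xml sketch (id, name, and plugin list are
     illustrative assumptions). -->
<widget id="com.example.emotiondemo" version="1.0.0"
        xmlns="http://www.w3.org/ns/widgets">
  <name>EmotionDemo</name>
  <description>Detects emotion via Watson and Project Oxford.</description>
  <content src="index.html" />
  <!-- Camera access, e.g. for capturing faces to send to the emotion API -->
  <plugin name="cordova-plugin-camera" />
  <!-- Allow outbound requests to the external REST APIs -->
  <access origin="*" />
</widget>
```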

Challenges we ran into

RESTful APIs can be challenging: each service has its own authentication scheme, request format, and error responses.
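Much of that difficulty is in handling responses defensively: non-200 statuses and oddly nested payloads have to be dealt with explicitly. The sketch below shows the kind of guard code involved; the payload shape mirrors a Watson-style tone response but is an illustrative assumption.

```javascript
// Sketch of defensive REST response handling.
// The error cases and payload shape here are illustrative assumptions.
function parseToneResponse(status, bodyText) {
  if (status === 401) throw new Error('Bad credentials');
  if (status !== 200) throw new Error('API error ' + status + ': ' + bodyText);
  const data = JSON.parse(bodyText);
  // Tone scores are nested; guard against missing fields rather than crashing.
  return (data.document_tone && data.document_tone.tones) || [];
}

const tones = parseToneResponse(
  200,
  '{"document_tone":{"tones":[{"tone_id":"joy","score":0.8}]}}'
);
```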

Built With

PhoneGap, IBM Watson, Microsoft Project Oxford
