In the smart city of the future, where everything is connected and it's "normal" to talk to a machine... we completely miss out on emotional connection. For those who travel, the planet just seems to get lonelier and lonelier.

With Emotional Urban Advisor, you can actually talk to a machine that is powered by real-time computer vision to be emotionally responsive. It's like a pocket best friend that can take one look at the expression on your face and tell how you feel.

For this hour-long hackathon during the Amazon re:Invent conference, I made a simple tech demo with the iPhone X face-tracking ARKit SDK that detects when you're happy or sad. Alexa and Amazon Polly's natural voice are then used to converse with you and suggest things to do nearby that might make you happier (in this case, just movies from the Alexa and IMDB/Fandango APIs), or to recommend a therapist who might help you debug your sadness.
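
The voice half of that flow is essentially one call to Amazon Polly. Below is a rough Python/boto3 sketch of that piece only, not the actual hackathon code: the emotion label is assumed to arrive from the ARKit face tracking on the phone, and `suggestion_for()` is just a stand-in for the Alexa/IMDB/Fandango lookup.

```python
import boto3

# Sketch of the "talk back" side of the demo. The emotion label is assumed
# to come from the face tracking on the device; the movie lookup is stubbed
# where the Alexa / IMDB / Fandango calls would go.
polly = boto3.client("polly")

def suggestion_for(emotion: str) -> str:
    # Placeholder for the real nearby-activity lookup.
    if emotion == "happy":
        return "You seem happy! There's a comedy playing nearby tonight."
    return "You seem a bit down. How about a feel-good movie, or a chat with someone who can help?"

def speak(text: str) -> bytes:
    # Polly returns an MP3 audio stream for the given text.
    response = polly.synthesize_speech(
        Text=text,
        OutputFormat="mp3",
        VoiceId="Joanna",
    )
    return response["AudioStream"].read()

if __name__ == "__main__":
    audio = speak(suggestion_for("sad"))
    with open("reply.mp3", "wb") as f:
        f.write(audio)
```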

Note: although I am using an iPhone X, happy and sad are simple enough expressions that pretty much any phone can detect them with very high accuracy using OpenCV and dlib's 68-point facial landmarks.
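
For reference, here is a minimal OpenCV/dlib sketch of that kind of check (the file paths, landmark heuristic, and comparison are illustrative, not taken from the demo): it compares the height of the mouth corners against the center of the lips to get a crude happy/sad signal from a single photo.

```python
import cv2
import dlib

# Crude happy/sad heuristic from dlib's 68-point facial landmarks
# (indices 48/54 are the mouth corners, 51/57 the top/bottom of the lips).
# Assumes the standard shape_predictor_68_face_landmarks.dat model file.
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def happy_or_sad(image_path: str) -> str:
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    faces = detector(gray)
    if not faces:
        return "no face"
    landmarks = predictor(gray, faces[0])
    corners_y = (landmarks.part(48).y + landmarks.part(54).y) / 2
    center_y = (landmarks.part(51).y + landmarks.part(57).y) / 2
    # Mouth corners pulled up above the lip center -> smile; pulled down -> frown.
    return "happy" if corners_y < center_y else "sad"

print(happy_or_sad("selfie.jpg"))
```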
