Alexithymia, a condition marked by difficulty identifying and interpreting emotions, occurs in roughly 10% of the population, while only about 1.1% of the population has autism. Difficulty reading emotion is therefore far more common than most people realize, and it extends well beyond those on the autism spectrum. Even for people who can read emotions easily, these altered photos may make the habitual Facebook scrolling process more humorous, or at least make it pass more quickly.

What it does

Detects the emotions in faces in people's Facebook photos and overlays each face with an emoji mapped to the detected emotion, so that people who have trouble reading emotions can better understand their friends.
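The core of that mapping can be sketched as a small pure function. This is illustrative, not the extension's exact code: the Emotion API returns per-face confidence scores for eight emotion labels, and the specific emoji chosen here are assumptions.

```javascript
// Map the Emotion API's eight emotion labels to emoji.
// These particular emoji choices are illustrative.
const EMOTION_TO_EMOJI = {
  anger: "😠",
  contempt: "😒",
  disgust: "🤢",
  fear: "😨",
  happiness: "😄",
  neutral: "😐",
  sadness: "😢",
  surprise: "😲",
};

// Given a scores object like { happiness: 0.92, neutral: 0.05, ... },
// return the emoji for the highest-confidence emotion.
function emojiForScores(scores) {
  const [topEmotion] = Object.entries(scores).reduce((best, cur) =>
    cur[1] > best[1] ? cur : best
  );
  return EMOTION_TO_EMOJI[topEmotion] || "😐";
}
```

The extension would call something like this once per detected face and position the emoji over the face's bounding rectangle.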

How I built it

I made a Google Chrome extension in JavaScript that uses the Microsoft Project Oxford (Cognitive Services) Emotion API to decide which emoji to use.
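The Project Oxford Emotion API (since folded into Azure Cognitive Services) accepted an image URL via POST and returned per-face emotion scores. A minimal sketch of building that request, assuming the v1.0 endpoint and `Ocp-Apim-Subscription-Key` header as the API existed at the time; `apiKey` is a placeholder:

```javascript
// Build the HTTP request for the Project Oxford Emotion API (v1.0).
// The endpoint URL and header name are assumptions based on the
// original Project Oxford documentation; the key is a placeholder.
function buildEmotionRequest(imageUrl, apiKey) {
  return {
    url: "https://api.projectoxford.ai/emotion/v1.0/recognize",
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Ocp-Apim-Subscription-Key": apiKey,
    },
    body: JSON.stringify({ url: imageUrl }),
  };
}

// In the extension's content script, the request would be sent with fetch(),
// and the response parsed into per-face results, each carrying a
// faceRectangle (for positioning the emoji) and a scores object:
//   fetch(req.url, { method: req.method, headers: req.headers, body: req.body })
//     .then((res) => res.json())
//     .then((faces) => overlayEmoji(faces)); // overlayEmoji is hypothetical
```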

Challenges I ran into

JavaScript debugging and lots of Chrome extension issues.

Accomplishments that I'm proud of

What I learned

I learned how to make a Google Chrome extension and use a new emotion analysis API.

What's next for Emojify

In the future, I would like to replace objects (not just faces) with their corresponding emoji.
