I recently had to solve a CAPTCHA and was frustrated by how many tries it took to prove I wasn't a robot. Then it hit me: why are we squinting at squiggly text when nothing is more human than emotion? I had just looked through indico.io's list of APIs and saw that they offered facial emotion detection, and it looked pretty doable to implement, so I thought: why not create a proof of concept for the first emotional CAPTCHA?
How it works
Emotcha shows the user an image of a human face along with that face's emotional profile. Next to it is a live webcam view of the user and a streaming readout of their own emotional profile. Once the two profiles match, the user is allowed to log in to the mock website.
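The profile-matching step might be sketched like this, assuming the emotion API returns an object mapping emotion names to confidence scores in [0, 1] (the function name, emotion labels, and tolerance value here are illustrative, not the actual implementation):

```javascript
// Compare a target emotional profile against the live one from the webcam.
// Profiles "match" when every emotion's score is within `tolerance`.
function profilesMatch(target, live, tolerance = 0.15) {
  return Object.keys(target).every(
    (emotion) => Math.abs(target[emotion] - (live[emotion] ?? 0)) <= tolerance
  );
}

const target = { happy: 0.8, sad: 0.05, angry: 0.05, neutral: 0.1 };
const close  = { happy: 0.72, sad: 0.1, angry: 0.02, neutral: 0.16 };
const far    = { happy: 0.2, sad: 0.1, angry: 0.1, neutral: 0.6 };

console.log(profilesMatch(target, close)); // true  -> unlock the login
console.log(profilesMatch(target, far));   // false -> keep emoting
```

A tolerance band like this is friendlier than requiring exact equality, since the live stream of scores fluctuates frame to frame.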
Challenges I ran into
The indico API expects images containing just a face, so I had to add live facial localization to the pipeline. Doing that while fighting to keep the webcam view as smooth as possible was quite challenging.
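One common way to keep the preview smooth is to decouple rendering from analysis: draw every webcam frame, but only send a frame off for localization and emotion scoring every so often. A minimal throttle helper, as a sketch of that idea (the loop below is illustrative pseudocode, not the project's actual render code):

```javascript
// Invoke `fn` at most once per `intervalMs`, so expensive work
// (cropping a face and posting it to the API) doesn't run on every tick.
function throttle(fn, intervalMs) {
  let last = 0;
  return (...args) => {
    const now = Date.now();
    if (now - last >= intervalMs) {
      last = now;
      fn(...args);
    }
  };
}

// Hypothetical browser loop using the helper:
// const analyzeFrame = throttle(sendFrameToEmotionApi, 500);
// function renderLoop() {
//   drawWebcamFrame();      // runs every frame, keeps the view smooth
//   analyzeFrame(canvas);   // runs at most twice per second
//   requestAnimationFrame(renderLoop);
// }
```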
Accomplishments that I'm proud of
I'm proud of the overall user experience of the proof of concept: it just works.
What I learned
I learned how to work with a variety of powerful tools: indico.io's APIs, Materialize's beautiful CSS framework, and facial detection.
What's next for Emotcha
Next up is making it responsive (because it really isn't; curse of being a backend guy), and perhaps packaging it up on npm so that other developers can actually use it in the future.