FaceIt

Inspirations

As instant messaging becomes ever more popular, replying to multiple messages at once makes it difficult to convey genuine emotion. So we decided to create a platform that better translates and preserves our emotions while messaging.

What it does

Using pictures of the user's face, we run facial detection algorithms to determine the user's emotions and translate them into text and emojis.
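
As a rough illustration (in the same Node.js-style JavaScript as our backend), here is a minimal sketch of how a set of emotion scores could be mapped to an emoji and a phrase. The score names mirror the emotion categories the recognition service reports; the specific emoji and phrases are placeholders, not the app's actual mapping.

```javascript
// Illustrative sketch: map emotion scores (0..1) to an emoji and a short phrase.
// The emoji/phrase choices below are placeholders for demonstration only.
const EMOJI_MAP = {
  happiness: { emoji: '😄', text: "I'm really happy about this!" },
  sadness:   { emoji: '😢', text: "Honestly, this makes me sad." },
  anger:     { emoji: '😠', text: "This is frustrating." },
  surprise:  { emoji: '😮', text: "Wow, I did not see that coming!" },
  fear:      { emoji: '😨', text: "This worries me a bit." },
  disgust:   { emoji: '🤢', text: "Not a fan of this at all." },
  contempt:  { emoji: '😒', text: "I'm not impressed." },
  neutral:   { emoji: '🙂', text: "Sounds good." },
};

function emotionToMessage(scores) {
  // Pick the emotion with the highest confidence score.
  const [topEmotion] = Object.entries(scores)
    .sort(([, a], [, b]) => b - a)[0];
  return EMOJI_MAP[topEmotion] || EMOJI_MAP.neutral;
}

// Example: scores for one detected face.
console.log(emotionToMessage({ happiness: 0.92, neutral: 0.05, sadness: 0.03 }));
// → { emoji: '😄', text: "I'm really happy about this!" }
```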

How we built it

We created a custom Android keyboard that lets you take a picture of your face and send it to our custom-built API. We used Node.js and Express.js to build a backend that uses the Cortana Intelligence Suite's Emotion Recognition algorithm to analyze the photos from the Android device, as sketched below.
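
This is a minimal sketch of what that backend route could look like, assuming an Express endpoint at POST /analyze that receives the keyboard's photo as multipart form data (the field name "photo" is an assumption) and forwards it to the emotion recognition service. The endpoint URL and subscription-key header follow the Cognitive Services Emotion API of that era, which has since been retired, so treat them as illustrative too.

```javascript
const express = require('express');
const multer = require('multer');
const axios = require('axios');

const app = express();
const upload = multer({ storage: multer.memoryStorage() });

// Illustrative endpoint for the (now retired) Emotion API.
const EMOTION_API_URL =
  'https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize';

app.post('/analyze', upload.single('photo'), async (req, res) => {
  if (!req.file) {
    return res.status(400).json({ error: 'No photo uploaded' });
  }
  try {
    // Forward the raw image bytes to the emotion recognition service.
    const { data } = await axios.post(EMOTION_API_URL, req.file.buffer, {
      headers: {
        'Content-Type': 'application/octet-stream',
        'Ocp-Apim-Subscription-Key': process.env.EMOTION_API_KEY,
      },
    });
    // The service returns one entry per detected face, each with a "scores" map.
    const scores = data.length ? data[0].scores : null;
    res.json({ scores });
  } catch (err) {
    res.status(502).json({ error: 'Emotion analysis failed' });
  }
});

app.listen(3000, () => console.log('FaceIt backend listening on port 3000'));
```

The route returns only the scores map for the first detected face, which the keyboard can then turn into text and emojis.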

Challenges we ran into

The first problem we ran into was with the Android keyboard service: Android does not allow a keyboard to display and use the camera directly, so we had to work around that restriction. Another issue was that the machine learning API places restrictions on how it can be used in an app, which narrowed our options.

What we learned

We learned the intricacies of the Android API and, most importantly, of facial detection algorithms. It was really interesting to see how those algorithms work and how they are embedded into different projects and systems.

What's Next for FaceIt

The next step would be more personalized emojis: since emoji preferences differ from person to person, the app's settings should too.
