Last year, I tried to build an app that generated emojis from facial expressions. I struggled with face landmarks and an SVM, but it never worked well. This time I decided to use deep learning and computer vision, with the help of Benjamin Planche.

What it does

It's an app that detects emotions in real time on iOS, Android, and in the browser! I built this project to showcase the full on-device ML pipeline, from model training to shipping in apps!

How I built it

It's using:

  • TensorFlow 2 + Keras for training
  • TensorFlow Lite + Google Face Detection for Android
  • Core ML + Face detection for iOS
  • TensorFlow.js in the browser
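To make the training side concrete, here is a minimal TF2/Keras sketch of an emotion classifier of the kind described above. The architecture, the 48×48 grayscale input size, and the emotion labels are my assumptions for illustration, not the project's actual model:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Hypothetical label set; the project's actual classes may differ.
EMOTIONS = ["angry", "happy", "neutral", "sad", "surprised"]

def build_model(input_shape=(48, 48, 1)):
    """Small CNN over cropped grayscale face images (assumed preprocessing)."""
    model = tf.keras.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(len(EMOTIONS), activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model()
```

Training a single Keras model like this is what makes the multi-platform story work: the same saved model can then be converted to TensorFlow Lite, Core ML, and TensorFlow.js formats.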

Challenges I ran into

Making the native face detection APIs work well with TensorFlow Lite was tricky, especially when debugging. I also had to fork tf-coreml to make it work with TensorFlow 2.
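For the Android side, TF2 ships a built-in converter (`tf.lite.TFLiteConverter.from_keras_model`) that turns a Keras model into a `.tflite` flatbuffer. A minimal sketch, using a tiny stand-in model so the snippet is self-contained (the real model and filename would of course differ):

```python
import tensorflow as tf

# Tiny stand-in for the trained emotion model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(48, 48, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(5, activation="softmax"),
])

# Convert the Keras model to a TFLite flatbuffer for on-device inference.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional default quantization
tflite_bytes = converter.convert()

with open("emotion_model.tflite", "wb") as f:
    f.write(tflite_bytes)
```

The converted file is then bundled in the Android app and fed the face crops produced by the native face detector.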

Accomplishments that I'm proud of

It runs well and in real time on all devices!

What's next for Run TensorFlow 2 on Any Device (Emotion Recognition Demo)

Run on Edge TPU

Built With

  • core-ml
  • keras
  • tensorflow-2
  • tensorflow-lite
  • tensorflow.js