Inspiration

It’s well known that Facebook can infer a lot about us from our social media activity. Yet many of us don’t react to everything we see. I propose a machine learning model that takes a picture of our face while we are on social media, so the platform can better understand how we feel even when we don't engage with a post.

What it does

Classifies a photo of a face into one of the 7 basic emotions, using an image model fine-tuned with PyTorch.
How I built it

I modified the transfer learning tutorial from PyTorch (https://pytorch.org/tutorials/beginner/transfer_learning_tutorial.html), retraining the image model on a face dataset from Kaggle labeled with the 7 basic emotions.

Challenges I ran into

The framework's learning curve (at first).

Accomplishments that I'm proud of

60% accuracy with only 50 epochs, trained in over 3 hours on Colab, yeah!!!

What I learned

PyTorch is fun and easy to use; I would like to see how easy it is to deploy it in a production environment.

What's next for Pytorch Emotion Classifier

Create an app that records the user's emotion while they scroll Facebook, storing each post the user sees together with the emotion associated with it.

Notebook: https://colab.research.google.com/drive/1lS7uaeLFRcKzpaUfse7cIMcBKKzycepN

Model: https://drive.google.com/file/d/1YoBb3QtYEeBd5xwJ18oZAYZFmFbOJzAg/view?usp=sharing

Dataset: https://drive.google.com/file/d/1_4gjgm8XMRwTKc-z0yaqhAVq0tDB_QV2/view?usp=sharing

YouTube Video: https://youtu.be/ILd8C1v7rPU

Built With

pytorch