Inspiration

After learning about BERT, I wanted to try my hand at implementing a simple sentiment analysis script using PyTorch and HuggingFace's Transformers. When I saw this hackathon, I decided to add other Transformer models such as ALBERT and DistilBERT, and to rewrite the whole code from beginning to end to make it more readable.

What it does

You can enter an English phrase, and it will predict whether the sentiment is positive or negative, along with its confidence in that prediction. This can be done directly on the command line or through the client and server implemented with Vue.js and Flask.
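Roughly, the prediction step looks like this (a minimal sketch, not the project's actual CLI; it assumes the uploaded checkpoint can be loaded with the standard sequence-classification head, which may differ from the repository's own model class):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Model name taken from the Hugging Face page mentioned below; loading it
# this way is an assumption, the repo may define its own model class.
MODEL_NAME = "barissayil/bert-sentiment-analysis-sst"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)
model.eval()

phrase = "This movie was surprisingly good!"
inputs = tokenizer(phrase, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits          # shape (1, 2): negative / positive
probs = torch.softmax(logits, dim=-1)[0]

label = "positive" if probs[1] > probs[0] else "negative"
print(f"{label} (confidence: {probs.max().item():.2%})")
```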

How I built it

It uses Stanford's Sentiment Treebank as the dataset, which contains movie reviews labeled as positive or negative. Thanks to PyTorch I was able to easily create a dataloader for it, as well as implement the training and evaluation loops. I also used HuggingFace's Transformers for the three different Transformer models I used. A rough sketch of the setup is shown below.
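Here is a simplified sketch of that setup: a small Dataset over SST sentences with binary labels, plus a standard fine-tuning loop. The file format, column layout, and hyperparameters are illustrative assumptions rather than the project's exact configuration; swapping the model string for "distilbert-base-uncased" or "albert-base-v2" is the idea behind supporting the other Transformers.

```python
import torch
from torch.utils.data import Dataset, DataLoader
from transformers import AutoTokenizer, AutoModelForSequenceClassification

class SSTDataset(Dataset):
    """SST sentences with binary labels (assumes a TSV with 'sentence\\tlabel')."""
    def __init__(self, path, tokenizer, max_len=64):
        self.examples = []
        with open(path) as f:
            next(f)  # skip header
            for line in f:
                sentence, label = line.rstrip("\n").split("\t")
                self.examples.append((sentence, int(label)))
        self.tokenizer = tokenizer
        self.max_len = max_len

    def __len__(self):
        return len(self.examples)

    def __getitem__(self, idx):
        sentence, label = self.examples[idx]
        enc = self.tokenizer(sentence, truncation=True, max_length=self.max_len,
                             padding="max_length", return_tensors="pt")
        return {"input_ids": enc["input_ids"].squeeze(0),
                "attention_mask": enc["attention_mask"].squeeze(0),
                "label": torch.tensor(label)}

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased",
                                                           num_labels=2)
loader = DataLoader(SSTDataset("train.tsv", tokenizer), batch_size=32, shuffle=True)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for batch in loader:  # one epoch shown; evaluation loop works the same way
    optimizer.zero_grad()
    out = model(input_ids=batch["input_ids"],
                attention_mask=batch["attention_mask"],
                labels=batch["label"])
    out.loss.backward()
    optimizer.step()
```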

Challenges I ran into

It was hard to figure out how to upload a model I had trained myself, so that people could demo my implementation by simply downloading my model instead of having to train their own.
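With a recent version of Transformers, sharing a fine-tuned model can be as simple as the sketch below. This uses the current push_to_hub API and an illustrative repo name; it is not necessarily the exact procedure I followed at the time.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the locally fine-tuned checkpoint (path is illustrative).
model = AutoModelForSequenceClassification.from_pretrained("./my-finetuned-model")
tokenizer = AutoTokenizer.from_pretrained("./my-finetuned-model")

# Requires authenticating first, e.g. via `huggingface-cli login`.
model.push_to_hub("bert-sentiment-analysis-sst")
tokenizer.push_to_hub("bert-sentiment-analysis-sst")
```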

Accomplishments that I'm proud of

After I shared my repository on LinkedIn, it quickly reached 100+ stars, and on the model's page ( https://huggingface.co/barissayil/bert-sentiment-analysis-sst ) I can see that it has already been downloaded 3000+ times! So some people are actually using it, which is amazing.

What I learned

Implementing AI models and sharing them with people is fun!

What's next for Sentiment Analysis

I'll add other Transformers such as T5. I will also run some experiments with the multilingual version of BERT. I already tried it on my own and saw that, after training it only on SST, which is in English, it already performs quite well in French (but not in Turkish).
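The multilingual experiment is essentially the same fine-tuning code with a different starting checkpoint, roughly as in this sketch (model name and label count are the only changes; the rest is assumed to match the training loop above):

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Swap in multilingual BERT, then train on English SST as before and
# evaluate on phrases in other languages (e.g. French).
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=2
)
```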

Built With

Python, PyTorch, HuggingFace Transformers, Flask, Vue.js
