What it does

We created an AI, packaged as a Google Chrome Extension, that generates a full article from a headline in the style of either CNN or Breitbart. The goal was to take a CNN headline and generate an article in the style of Breitbart, and vice versa, using two models trained on datasets of Breitbart and CNN articles respectively.

Inspiration

We aimed to "unbias the news" floating around on the internet and to provide a good source of entertainment for the reader. We believe laughter is the best medicine, and this project was our way of making the world a merrier, more stress-free place. Our second aim was to neutralise politically biased news.

How we built it

For our generative AIs, we used GPT-2, a text-generation model based on the Transformer architecture and pre-trained on massive amounts of text from across the internet. The two AIs were further fine-tuned on ~25,000 Breitbart and ~15,000 CNN articles respectively, obtained from a dataset we found on Kaggle.

"From a text-generation perspective, the included demos were very impressive: the text is coherent over a long horizon, and grammatical syntax and punctuation are near-perfect." ~GitHub repository of GTP-2

For training the models we used a GPU from Google Colab. We also use a Google Cloud virtual machine to host the server that runs our trained AI models and communicates with our Chrome Extension over HTTPS. As we opted for the large version of GPT-2 with 774M parameters, our AIs take up a large amount of storage space (~3 GB each) and require a massive amount of computation per generation (it takes ~8 minutes for our server, with 8 CPU cores running at 2.2 GHz, to generate an article from a headline it has never seen before).
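For context, here is a minimal sketch of how a generation endpoint like ours could be exposed to the extension over HTTPS. It assumes a Flask app, a hypothetical generate_article() helper wrapping the fine-tuned models, and certificate files on the VM; it is not our exact server code.

```python
# Hypothetical sketch of the generation server; our actual code may differ.
from flask import Flask, request, jsonify

app = Flask(__name__)

def generate_article(headline: str, style: str) -> str:
    """Placeholder for the real GPT-2 generation call (see the fine-tuning sketch)."""
    raise NotImplementedError

@app.route("/generate", methods=["POST"])
def generate():
    payload = request.get_json()
    headline = payload["headline"]
    style = payload.get("style", "breitbart")    # which fine-tuned model to use
    article = generate_article(headline, style)  # ~8 minutes on our 8-core VM
    return jsonify({"article": article})

if __name__ == "__main__":
    # Serve over HTTPS so the Chrome Extension can reach the VM;
    # cert.pem / key.pem are assumed to exist on the machine.
    app.run(host="0.0.0.0", port=443, ssl_context=("cert.pem", "key.pem"))
```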

We believe that this is the best possible use of Google Cloud Platform. The best possible use. Period.

Challenges we ran into

HTTPS...

Built With

GPT-2 (774M), Google Colab, Google Cloud Platform, Google Chrome Extensions, a Kaggle news-articles dataset