Inspiration

The outrage over the release of OpenAI's GPT-2 deep learning model, and the wild articles circulating around the internet, made it a mysterious and interesting thing to look at.

What it does

Given a couple of sentences as context, the AI writes the paragraphs that follow. The results are sometimes good for a laugh, but still very impressive for a neural net.
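The generation scheme behind this is autoregressive: the model repeatedly predicts a distribution over the next token given the context so far, samples one token, appends it, and repeats. A toy sketch of that loop, with a hypothetical `next_token_probs` standing in for the real neural network:

```python
import random

def next_token_probs(context):
    """Hypothetical stand-in for the model's next-token distribution.

    A real model conditions on the whole context; this toy just
    returns a fixed weighted vocabulary so the loop has something
    to sample from.
    """
    vocab = ["the", "oracle", "speaks", "in", "riddles", "."]
    weights = [3, 2, 2, 1, 1, 1]
    return vocab, weights

def generate(context, n_tokens, seed=0):
    # Autoregressive loop: sample the next token, append, repeat.
    rng = random.Random(seed)
    tokens = context.split()
    for _ in range(n_tokens):
        vocab, weights = next_token_probs(tokens)
        tokens.append(rng.choices(vocab, weights=weights)[0])
    return " ".join(tokens)

print(generate("Once upon a time", 8))
```

The real GPT-2 model replaces `next_token_probs` with a transformer forward pass, but the surrounding loop is the same idea.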

How I built it

Originally, I wanted to salvage the original model rather than this smaller released version. In the end, I ran the model on Google Cloud, connected it to Flask, polished the output, and the whole shebang came together.
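The Flask side of that wiring can be sketched minimally as below; the route name and the `generate_text` helper are assumptions for illustration, not the project's actual code.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def generate_text(prompt):
    # Placeholder for the call into the GPT-2 model running on the
    # Google Cloud instance.
    return prompt + " ... (model output here)"

@app.route("/generate", methods=["POST"])
def generate():
    # Accept a JSON body like {"prompt": "..."} and return the
    # model's continuation.
    prompt = request.get_json()["prompt"]
    return jsonify({"completion": generate_text(prompt)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

A client would then POST a prompt to `/generate` and read the continuation out of the JSON response.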

Challenges I ran into

Interpreting the model's Python code written in TensorFlow, and connecting Flask to Google Cloud. Getting a GPU on Google Cloud was a pain: I was first told it would take two business days, but after emailing support they granted it right away.

Accomplishments that I'm proud of

Finally digging into a real TensorFlow model, and understanding how the graphs are built and instantiated, was loads of fun.

What I learned

Google Cloud is amazing: very roomy, welcoming to newcomers, and user friendly. I also picked up some TensorFlow syntax.

What's next for Oracle

Reconstruct the original model using the paper, or at least get better data to train on, and filter out (more thoroughly) the naughty words the model can sometimes spit out.
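A word filter like the one planned above can be sketched with a regex over a blocklist; the word list and masking policy here are assumptions, not the project's actual filter.

```python
import re

# Stand-in blocklist; a real filter would use a much larger list.
BLOCKLIST = {"darn", "heck"}

def censor(text, blocklist=BLOCKLIST):
    """Mask blocklisted words, keeping only their first letter."""
    def mask(match):
        word = match.group(0)
        return word[0] + "*" * (len(word) - 1)

    # Word-boundary match so substrings inside clean words survive.
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, blocklist)) + r")\b",
        re.IGNORECASE,
    )
    return pattern.sub(mask, text)

print(censor("Well, darn it."))  # → "Well, d*** it."
```

Matching on word boundaries avoids the classic pitfall of censoring innocent words that merely contain a blocklisted substring.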

Built With

Python, TensorFlow, Flask, Google Cloud, GPT-2
