App

Thanks for reading our submission! :-)

Try our app here.

Read our blog here (it is also pasted below).

Our code can be found here.

Summary

Inspiration

Modern deep learning language models power many text generators.

While such projects generate free-form text, we wanted to explore text generation under specific constraints. Rhymes are a great use case, as they provide a very visible constraint. Christmas is a great time to build a rhyme generator, because many people write poems for their loved ones.

What it does

Our app provides people who want to write a poem with rhyme inspiration.

The user writes a prompt and then receives multiple full rhyming sentences.

How we built it

We were able to build this project in only a few days by building on top of some great services:

Our generative algorithm works as follows:

  • We have a BERT masked language model
  • We find a word that rhymes with the last word of the user's prompt
  • We create a placeholder second sentence "[MASK] [MASK] ... [MASK] [RHYME-WORD]"
  • Our Markov chain Monte Carlo (MCMC) algorithm then iteratively improves this sentence:
    • Choose a random token of the sentence
    • Replace it with [MASK]
    • Let the BERT masked language model predict how well each token in the vocabulary would fit there
    • Randomly choose a token according to those predicted probabilities
    • Repeat
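The loop above can be sketched as a Gibbs-style sampler. In this minimal, runnable illustration we replace BERT with a toy scorer over a tiny invented vocabulary (the real app would query a Hugging Face BERT masked language model instead); the vocabulary, scorer, and helper names are our own stand-ins:

```python
import random

VOCAB = ["santa", "rides", "his", "bright", "red", "sleigh",
         "all", "night", "through", "snow"]

def toy_mask_scores(tokens, mask_pos):
    # Stand-in for BERT's masked-LM head: score every vocabulary word
    # for the masked position. Here we simply prefer words that are not
    # already in the sentence, so it drifts away from repetition.
    return [0.1 if word in tokens else 1.0 for word in VOCAB]

def mcmc_step(tokens, rng=random):
    # 1. Choose a random token (the rhyme word at the end stays fixed).
    pos = rng.randrange(len(tokens) - 1)
    # 2./3. Mask it and let the "model" score every candidate word.
    masked = tokens[:pos] + ["[MASK]"] + tokens[pos + 1:]
    scores = toy_mask_scores(masked, pos)
    # 4. Sample a replacement proportionally to those scores.
    tokens[pos] = rng.choices(VOCAB, weights=scores, k=1)[0]
    return tokens

# Placeholder second line ending in the fixed rhyme word.
sentence = ["[MASK]"] * 7 + ["day"]
rng = random.Random(0)
for _ in range(50):  # 5. Repeat
    sentence = mcmc_step(sentence, rng=rng)
print(" ".join(sentence))
```

With BERT plugged in as the scorer, each step nudges the line toward something fluent while the rhyme word at the end never moves.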

Challenges we ran into

Our original goal was to generate complete poems. However, we soon found out that:

  1. With only one starting sentence, the quality of the generated poems was too low and the runtime was too slow.
  2. Users are not looking for a fully generated text; they don't want to outsource writing a poem to an algorithm. People want to make personal poems for their friends and family, and what they need for that is inspiration.

Accomplishments that we're proud of

We are most proud of how we turned a weakness into a strength. The weakness was the generative algorithm's slow speed; the strength is that every intermediate step is fun to watch and can inspire the writer with new ideas.

Second, we are proud of completing this project in such a short time: in only three days we built it from start to finish. This was possible thanks to the great tools available to us: Streamlit, Hugging Face Transformers, Google Cloud App Engine, and the Datamuse rhyme API.

What we learned

  • Constraints are opportunities to be turned into strengths.
  • BERT's masked language modeling task is great fun for text generation with constraints.
  • A bunch of small tweaks can really improve the quality of generated text.

What's next for Rhyme With AI

Narrow view:

We believe our service could be useful for revising draft poems. Think: a sort of hemingwayapp.com for poetry and lyrics. We would only need to add a component that identifies which sentence most needs revision.

Broader view:

While poetry is a fun use case with a very explicit constraint (rhyme), we believe there are real-world use cases that can leverage our way of working. Many domain-specific texts follow a clear script. An editor that helps revise such sentences, or suggests next sentences, could be very helpful. Potential applications include job advertisements, news articles, and academic papers.

Blog

The season's upon us, it's that time of year; Creating Christmas rhymes instills in us great fear! Luckily, pre-trained neural networks are easy to apply. With great pride we introduce our new assistant: Rhyme with AI.

NLP's ImageNet moment may have arrived in 2018, but the ecosystem around NLP models really matured in 2019. Many of these models (BERT, GPT-2, Transformer-XL, DistilBERT, etc.) are easy to apply to your own use case. Our service uses BERT to help us with our (Christmas) rhymes.

Language modeling helps state-of-the-art models understand language before solving tasks like sentiment analysis or translation. Masking, where the model tries to predict a word that is hidden from a sentence, is one of BERT's innovations. We can use it to help us rhyme by rephrasing rhyming as a task of predicting missing words.

Figure: predicting masked tokens is one of BERT's language modelling techniques. Source: "The Illustrated BERT, ELMo, and co."

Our problem involves multiple masks: we know the first sentence and the last word of the second sentence. For instance:

Santa delivers gifts by sleigh

  • ... [MASK] [MASK] [MASK] [MASK] [MASK] [MASK] [MASK] [MASK] day
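Building that masked placeholder is a one-liner: a row of [MASK] slots followed by the fixed rhyme word. A small sketch (the helper name and the default of eight masks, taken from the example above, are our own choices):

```python
def masked_template(rhyme_word, n_masks=8):
    # Placeholder second line: n_masks [MASK] slots, then the fixed
    # rhyme word that must close the line.
    return " ".join(["[MASK]"] * n_masks + [rhyme_word])

print(masked_template("day", n_masks=4))
# -> [MASK] [MASK] [MASK] [MASK] day
```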

Inspired by BERT has a Mouth, and It Must Speak, we first let BERT fill in the [MASK]s and then repeatedly sample new tokens. Some example rhymes from our model:

Santa delivers gifts by sleigh

  • ... and drinks and celebrates his wedding day
  • ... dressed as preacher and nurse they say
  • ... or bicycle if he has to pay

This already looks pretty good, but we need a solution that people can use!

Rhyme-with-AI

Luckily, creating an end-to-end machine learning solution is fairly simple. The Datamuse API returns rhyming words, BERT is available via Hugging Face, and creating an app is no sweat with Streamlit. Put it all together in a Docker container, and a hosted solution is one command away with Google App Engine.
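The Datamuse call is a single GET request. A minimal sketch: the `rel_rhy` query parameter and the JSON response shape (a score-sorted list of `{"word": ..., "score": ...}` objects) come from the Datamuse API; the helper names and the sample response are our own:

```python
from urllib.parse import urlencode

DATAMUSE = "https://api.datamuse.com/words"

def rhyme_query(word, max_results=10):
    # Build the Datamuse query URL; rel_rhy asks for perfect rhymes.
    return f"{DATAMUSE}?{urlencode({'rel_rhy': word, 'max': max_results})}"

def pick_rhymes(response_json, n=3):
    # Datamuse returns results sorted by score; keep the top n words.
    return [entry["word"] for entry in response_json[:n]]

# Fetching is one call, e.g.:
#   with urllib.request.urlopen(rhyme_query("sleigh")) as r:
#       rhymes = pick_rhymes(json.load(r))

sample = [{"word": "day", "score": 3000}, {"word": "way", "score": 2500},
          {"word": "stay", "score": 2000}, {"word": "play", "score": 1500}]
print(pick_rhymes(sample))  # -> ['day', 'way', 'stay']
```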

A few days well spent for us will hopefully save you a lot of pain. Check out our code on GitHub or try our solution at Rhyme with AI!

Built With

  • appengine
  • huggingface
  • python
  • streamlit
  • tensorflow