I had heard of, but never personally explored, the artistic side of artificial intelligence. Inspired by the poetry I write, the incredible AI art I have seen, and the drive to use machine learning to give both myself and the agent I programmed a voice, I made this project!
What I worked on 👀
I made Sh.a.i.kspear, a combination of several programs, algorithms, and machine learning models that generates poetry, lyrics and prose from a user's prompt. My focus was less on building a product with a user interface for others to test en masse than on exploring how well current technology can create art that people will genuinely appreciate. I compiled Sh.a.i.kspear's poetry into a book titled The Poetry Of Programming (which contains visuals generated through an AI method known as style transfer), as well as a pop song written using machine learning.
How I built it 🦾
Using the Generative Pre-trained Transformer 2 (GPT-2) model and Hidden Markov Models, I managed to generate coherent, complex and creative text! The poetry book was created by taking the best generated text and formatting it into a presentable product. The pop song lyrics, on the other hand, were run through IBM's state-of-the-art text-to-speech generator to produce a high-quality vocal performance.
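GPT-2 itself is too heavy to sketch here, but the Markov-model half of a pipeline like this can be illustrated in a few lines of standard-library Python. This is a minimal word-level Markov chain of my own, not Sh.a.i.kspear's actual code; the training corpus and function names are illustrative.

```python
import random
from collections import defaultdict

def train_markov(text, order=1):
    """Map each word-tuple of length `order` to the words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        chain[state].append(words[i + order])
    return chain

def generate(chain, length=10, seed=None):
    """Walk the chain, sampling a successor word at each step."""
    rng = random.Random(seed)
    order = len(next(iter(chain)))
    out = list(rng.choice(list(chain.keys())))
    for _ in range(length - order):
        followers = chain.get(tuple(out[-order:]))
        if not followers:  # dead end: no observed successor
            break
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "shall I compare thee to a summer day thou art more lovely and more temperate"
chain = train_markov(corpus)
print(generate(chain, length=10, seed=42))
```

A higher `order` makes the output more coherent but closer to memorizing the corpus, which is why pairing the chain with a large corpus (or a stronger model like GPT-2) matters.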
Challenges I ran into 🧠
I initially tried building an LSTM-based text generator and also experimented with simple Markov models, but neither achieved the level of complexity and detail I was looking for. Implementing them did, however, give me a better understanding of how both approaches work. Moving past this hurdle to GPT-2 and Hidden Markov Models deepened my understanding of how Natural Language Processing (NLP) can be achieved and how it has developed over the years.
Accomplishments that I'm proud of 🔥
I have been to many hackathons in the past, but this was my first solo hackathon experience. I feel accomplished to have worked on a project by myself from start to finish!
What I learned ✨
GPT-2 is great at NLP! It will be interesting to keep monitoring and using the GPT series in the future. I also had a rich discussion with another participant during the hackathon about the implications of better language models. After all, a large-scale unsupervised language model that generates coherent paragraphs of text is both an incredibly powerful and a dangerous development in the world of machine learning and AI. On another note, I realized that Hidden Markov Models paired with some logic can turn coherent sentences generated by NLP models into stylized outputs (with rhyme and meter) such as the ones I used in the poetry book and song.
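The "model plus logic" idea, keeping only generated lines that fit a target form, can be sketched with a crude syllable filter. The vowel-group heuristic and the candidate lines below are my own illustrative stand-ins, not the project's actual meter logic.

```python
import re

def count_syllables(word):
    """Rough heuristic: count groups of consecutive vowels (incl. 'y')."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    # Treat a trailing silent 'e' as non-syllabic ("verse" vs "the").
    if word.lower().endswith("e") and len(groups) > 1:
        return len(groups) - 1
    return max(1, len(groups))

def line_syllables(line):
    """Sum syllable estimates over the alphabetic words in a line."""
    return sum(count_syllables(w) for w in re.findall(r"[A-Za-z]+", line))

def keep_if_meter(candidates, target=10, tolerance=1):
    """Keep candidate lines whose syllable count is near the target."""
    return [c for c in candidates if abs(line_syllables(c) - target) <= tolerance]

candidates = [
    "the code compiles a quiet verse at dawn",
    "bits",
    "a model dreams in weights and words tonight",
]
print(keep_if_meter(candidates, target=10))
```

A real system would use a pronunciation dictionary for syllables and stress, and a rhyme check on line endings, but the overall shape is the same: generate many candidates, then filter for form.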
What's next for Sh.a.i.kspear 🚀
It would be interesting to use GPT-3 and test whether the Hidden Markov Models are even necessary with this new state-of-the-art model. Additionally, turning the program I used to create the two outputs into an accessible app or website, so that others can generate their own poetry and songs, would be a good next step!