About

Machine Wisdom is built on OpenAI's GPT-2 model, fine-tuned on tens of thousands of quotes from famous authors, artists, and philosophers.

GPT-2 is a large language model that can generate spans of coherent text. This website uses the small version of GPT-2 (117 million parameters), fine-tuned on around 60 thousand quotes for roughly 1,700 steps with a batch size of 32. Decoding uses a simple sample-and-rank strategy, similar to the one described in a recent chatbot paper from Google: several candidates are sampled independently, then the highest-scoring one is kept, as sketched below.
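For illustration, here is a minimal sketch of sample-and-rank using the Hugging Face transformers library and the public 117M GPT-2 checkpoint. The number of candidates, the sampling settings, and the scoring rule are assumptions for the sketch, not the site's exact configuration:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# "gpt2" is the public 117M checkpoint; the site's fine-tuned weights are
# assumed here, so we stand in with the base model.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def sample_and_rank(prompt: str, n_samples: int = 8, max_length: int = 40) -> str:
    """Sample n_samples candidates, then return the one with the highest
    length-normalized log-likelihood under the model (sample-and-rank)."""
    input_ids = tokenizer.encode(prompt, return_tensors="pt")
    candidates = model.generate(
        input_ids,
        do_sample=True,
        top_k=40,                      # sampling setting is an assumption
        max_length=max_length,
        num_return_sequences=n_samples,
        pad_token_id=tokenizer.eos_token_id,
    )
    best_text, best_score = "", float("-inf")
    for seq in candidates:
        with torch.no_grad():
            logits = model(seq.unsqueeze(0)).logits[0]
        # Mean log-probability of each token given its prefix; padding after
        # the end-of-text token slightly skews this, which is fine for a sketch.
        log_probs = torch.log_softmax(logits[:-1], dim=-1)
        token_lp = log_probs[torch.arange(seq.size(0) - 1), seq[1:]]
        score = token_lp.mean().item()
        if score > best_score:
            best_score = score
            best_text = tokenizer.decode(seq, skip_special_tokens=True)
    return best_text

print(sample_and_rank("Life is"))
```

Ranking by average token log-probability favors fluent candidates over lucky-but-incoherent samples, which is the point of sample-and-rank over taking a single draw.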

Generated quotes are filtered against the training data, so only quotes that do not already appear there are kept. However, since the original GPT-2 model was trained on a large amount of text from the Internet, a small number of the generated quotes may still be found elsewhere online.
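The page doesn't spell out the matching rule for this filter; a simple version, assumed here, is an exact match on normalized text:

```python
import re

def normalize(quote: str) -> str:
    """Lowercase and drop punctuation so near-identical strings compare equal."""
    return re.sub(r"[^a-z0-9 ]", "", quote.lower()).strip()

def filter_novel(candidates, training_quotes):
    """Keep only generated quotes whose normalized form does not
    already appear in the training data."""
    seen = {normalize(q) for q in training_quotes}
    return [q for q in candidates if normalize(q) not in seen]

# Example: "Be yourself!" is dropped because it matches a training quote.
print(filter_novel(["Be yourself!", "Dream in code."], ["Be yourself."]))
```

A fuzzier rule (e.g. n-gram overlap) would catch light paraphrases, at the cost of occasionally discarding genuinely new quotes.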

To learn more about this model or explore some of my other projects, visit my personal website!

To explore similar AI projects that generate other kinds of synthetic content, you can also check out this curated list on GitHub.