Ranked Keywords to Story Generation

This project attempts the following task: given a set of ranked keywords, construct a coherent short story. The goal is for the model to use all the keywords while remaining grammatically and logically correct. One could imagine this task being used to inspire writers with creative story ideas. For example, given the words `josh, streets, living, adopted, happy`, the model could output: "Josh is a black dog. He was living on the streets. A nice man stopped when he saw Josh. He became attached to Josh. So the man adopted Josh, and Josh is very happy with his new family."

To solve this task, we use both the traditional method of finetuning large pretrained language models and the recently introduced Plug and Play Language Model (PPLM) strategy, which leverages the power of pretrained language models without finetuning them. For the Plug and Play strategy, we introduced custom attribute models to guide language models to generate stories containing the desired keywords, especially those with higher rank. Unlike the original PPLM paper, which focuses on perturbing the generation of zero-shot unconditioned language models, we experimented with zero-shot, low-resource, and finetuned language model choices, and compared the relative improvement in the PPLM generations.

We found that finetuned language models perform much better than the default PPLM approach, but our custom combination of a finetuned language model plus an attribute model performed best overall. Finally, we performed error analysis on all our approaches and found that, despite introducing more grammar mistakes, PPLM improved keyword usage, reduced the number of contradictory sentences in the stories, and generated stories with better endings.
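To illustrate the attribute-model idea, here is a minimal sketch of a rank-weighted bag-of-words loss in the PPLM style: the loss is low when the language model places probability mass on the desired keywords, and in a full PPLM setup its gradient with respect to the model's hidden states would be used to nudge generation toward those keywords. The function and parameter names (`keyword_bow_loss`, `keyword_ids`, `rank_weights`) are hypothetical illustrations, not the exact implementation used in this project.

```python
import numpy as np

def keyword_bow_loss(next_token_probs, keyword_ids, rank_weights):
    """Rank-weighted bag-of-words attribute loss (PPLM-style sketch).

    next_token_probs: probability distribution over the vocabulary
                      for the next token (1-D array summing to 1).
    keyword_ids:      vocabulary indices of the desired keywords
                      (hypothetical ids for illustration).
    rank_weights:     one weight per keyword; higher-ranked keywords
                      get larger weights so they pull harder.

    Returns -log of the weighted probability mass on the keywords,
    so the loss shrinks as keywords become more likely.
    """
    mass = sum(w * next_token_probs[i]
               for i, w in zip(keyword_ids, rank_weights))
    return -np.log(mass + 1e-12)  # epsilon guards against log(0)

# Toy usage: boosting a keyword's probability lowers the loss.
probs = np.full(8, 0.125)            # uniform over a tiny 8-token vocab
boosted = probs.copy()
boosted[2] = 0.5                     # keyword token 2 becomes likely
boosted /= boosted.sum()             # renormalize to a distribution
base_loss = keyword_bow_loss(probs, [2, 5], [2.0, 1.0])
boosted_loss = keyword_bow_loss(boosted, [2, 5], [2.0, 1.0])
```

In actual PPLM decoding this loss would be backpropagated through the language model's key-value cache at each step; the sketch above only shows the attribute signal itself.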