Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01qf85nf26k
Title: Programming A Poet: Poetry Text Generation Using LSTM Models
Authors: Hong, Katherine
Advisors: Narasimhan, Karthik
Department: Electrical Engineering
Class Year: 2020
Abstract: I programmed Long Short-Term Memory (LSTM) models to generate poems using Walt Whitman’s poetry collection. I programmed two different models: a character-level model, which generates text character by character, and a word-level model, which generates text word by word. Within each model, I also experimented with different parameters. I wrote a baseline model; a wide model, which has double the number of cells in each LSTM layer compared to the baseline; a deep model, which adds one more LSTM layer; and a wide-and-deep model, which combines both features. I used perplexity to measure the predictive ability of the generative models. By evaluating the generated poems and their perplexities, I conclude that the word-level model is far superior to the character-level model. Within the word-level model, the wide-and-deep model produces the highest-quality poems, although its perplexity is sometimes slightly higher. After sufficient training, the poems generated by the word-level model are meaningful, expressive, and thematic.
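The abstract's evaluation metric, perplexity, is the exponential of the average negative log-likelihood the model assigns to each true next token. As a minimal illustrative sketch (not the thesis's actual code; the per-token probabilities here are hypothetical placeholders), it can be computed like this:

```python
import math

def perplexity(probs):
    """Perplexity of a sequence, given the probability the model
    assigned to each true next token. Defined as the exponential
    of the mean negative log-likelihood; lower is better."""
    n = len(probs)
    nll = -sum(math.log(p) for p in probs) / n
    return math.exp(nll)

# A model that is certain of every token scores 1.0, the minimum;
# uniform guessing over a vocabulary of size V scores V.
print(perplexity([1.0, 1.0, 1.0]))
print(perplexity([0.25] * 8))  # uniform over 4 choices → 4.0
```

This is one reason character-level and word-level perplexities are not directly comparable: the character model predicts over a vocabulary of dozens of symbols, while the word model predicts over thousands, so the two metrics live on different scales.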
URI: http://arks.princeton.edu/ark:/88435/dsp01qf85nf26k
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Electrical Engineering, 1932-2020

Files in This Item:
File: HONG-KATHERINE-THESIS.pdf
Size: 268.96 kB
Format: Adobe PDF (request a copy)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.