Title: Deep Prose: Neural Style Transfer of Text
Authors: Han, Sheon
Advisors: Fellbaum, Christiane
Department: Computer Science
Class Year: 2018
Abstract: Neural Style Transfer is the task of generating a new output that combines the content of one input with the style of another. Since Gatys et al. used Convolutional Neural Networks (CNNs) to separate content and style in images and recreate images in different artistic styles, Neural Style Transfer has been actively researched in Computer Vision. There have been fewer attempts in Natural Language Processing, owing partly to the fact that natural language is discrete, which renders the CNN-based approach ineffective. In this paper, we employ recent techniques based on deep neural network architectures such as Recurrent Neural Networks (RNNs) and Variational Autoencoders (VAEs) to perform Neural Style Transfer on text. We use three datasets (The New Yorker, The New York Times, and the Blog Authorship Corpus) to train language generation models that take any text as input and generate new text of arbitrary length in different styles. We evaluate our results with BLEU, classifiers, and human readers. Our results demonstrate that some language generation models are powerful enough to learn subtle grammatical rules (e.g., enclosing a quotation in double quotation marks), but sample outputs show that the models remain oblivious to the semantic content of the input and to the finer rules of good prose.
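To make the evaluation concrete, here is a minimal sketch of sentence-level BLEU, the n-gram overlap metric named in the abstract. It is a simplification for illustration, not the thesis's implementation: it assumes a single reference, uniform n-gram weights, and no smoothing (toolkits such as NLTK provide smoothed, multi-reference versions).

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list, as tuples."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU: geometric mean of clipped n-gram
    precisions (n = 1..max_n) times a brevity penalty.
    Single reference, no smoothing: any zero precision yields 0.0."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # Clipped overlap: each candidate n-gram counts at most as
        # often as it appears in the reference.
        overlap = sum((cand_counts & ref_counts).values())
        total = max(sum(cand_counts.values()), 1)
        if overlap == 0:
            return 0.0
        precisions.append(overlap / total)
    # Brevity penalty discourages candidates shorter than the reference.
    if len(candidate) > len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

A candidate identical to its reference scores 1.0; with `max_n=4`, a single substituted word in a short sentence can already zero out the 4-gram precision and thus the whole score, which is one reason smoothed variants are common for sentence-level use.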
URI: http://arks.princeton.edu/ark:/88435/dsp014f16c555s
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Computer Science, 1988-2020

Files in This Item:
HAN-SHEON-THESIS.pdf (1.69 MB, Adobe PDF)
