Title: Learning Sentence Representations with Recurrent Neural Networks
Authors: Kim, Young Sun (Lisa)
Advisors: Botvinick, Matthew
Department: Computer Science
Class Year: 2015
Abstract: Recognizing the limits of existing semantic vector compositional models, we propose a novel recurrent neural network (RNN) language model that learns vector representations for sentences. Our method uses a small-scale sigmoid RNN model that takes word vector sequences as input and outputs a serial recall of the word vectors and the sigmoid hidden unit representations for the next sequences. We provide empirical evidence to suggest that the RNN sequential recall and next sequence prediction model is sensitive to the similarity of the input vector sequences and produces meaningful sigmoid hidden unit representations that contain information regarding the words and the sentiment of each sequence. The results suggest several interesting possibilities for future work in extending the model to a full-scale recurrent neural language model that learns compositional vector representations for sentences.
Keywords: language model, recurrent neural networks, compositional semantics
Extent: 35 pages
URI: http://arks.princeton.edu/ark:/88435/dsp019z903221p
Type of Material: Princeton University Senior Theses
Language: en_US
Appears in Collections: Computer Science, 1988-2020
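
A minimal sketch of the kind of model the abstract describes: a sigmoid RNN that reads a sequence of word vectors and exposes its hidden unit activations as a fixed-size sentence representation, with a linear readout standing in for the serial recall of the inputs. All dimensions, weights, the toy input, and the readout are illustrative assumptions made here for the sketch; they are not the thesis's actual architecture, and no training procedure is shown.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
d_word, d_hidden, seq_len = 50, 100, 8  # assumed sizes, for illustration only

W_x = rng.normal(scale=0.1, size=(d_hidden, d_word))    # input-to-hidden weights
W_h = rng.normal(scale=0.1, size=(d_hidden, d_hidden))  # recurrent hidden-to-hidden weights
b = np.zeros(d_hidden)                                  # hidden bias
W_o = rng.normal(scale=0.1, size=(d_word, d_hidden))    # readout "recalling" word vectors

words = rng.normal(size=(seq_len, d_word))  # a toy sentence: a sequence of word vectors

h = np.zeros(d_hidden)
recalled = []
for x_t in words:
    h = sigmoid(W_x @ x_t + W_h @ h + b)  # sigmoid hidden unit activations
    recalled.append(W_o @ h)              # serial recall of the current word vector

sentence_vector = h  # final hidden state as a fixed-size sentence representation
print(sentence_vector.shape)  # (100,)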

Files in This Item:
File: PUTheses2015-Kim_Young_Sun_Lisa.pdf
Size: 1.58 MB
Format: Adobe PDF


Items in DataSpace are protected by copyright, with all rights reserved, unless otherwise indicated.