Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp019z903221p
Full metadata record

dc.contributor.advisor: Botvinick, Matthew
dc.contributor.author: Kim, Young Sun (Lisa)
dc.date.accessioned: 2015-07-29T13:46:28Z
dc.date.available: 2015-07-29T13:46:28Z
dc.date.created: 2015-04-30
dc.date.issued: 2015-07-29
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/dsp019z903221p
dc.description.abstract: Recognizing the limits of the existing semantic vector compositional models, we propose a novel recurrent neural network (RNN) language model that learns vector representations for sentences. Our method uses a small-scale sigmoid RNN model that takes word vector sequences as input and outputs a serial recall of the word vectors and the sigmoid hidden unit representations for the next sequences. We provide empirical evidence to suggest that the RNN sequential recall and next sequence prediction model is sensitive to the similarity of the input vector sequences and produces meaningful sigmoid hidden unit representations that contain information regarding the words and the sentiment of each sequence. The results suggest several interesting possibilities for future work in extending the model to a full-scale recurrent neural language model that learns compositional vector representations for sentences. Keywords: language model, recurrent neural networks, compositional semantics
dc.format.extent: 35 pages
dc.language.iso: en_US
dc.title: Learning Sentence Representations with Recurrent Neural Networks
dc.type: Princeton University Senior Theses
pu.date.classyear: 2015
pu.department: Computer Science
pu.pdf.coverpage: SeniorThesisCoverPage
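The abstract describes a small-scale sigmoid RNN that reads a sequence of word vectors and produces sigmoid hidden unit representations plus a serial recall of its inputs. As a rough illustration of that architecture, here is a minimal sketch of an Elman-style sigmoid RNN with a linear recall readout; all dimensions, weight scales, and names (`SigmoidRNN`, `encode`, `recall`) are illustrative assumptions, not the thesis's actual model or hyperparameters:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SigmoidRNN:
    """Toy Elman-style RNN with sigmoid hidden units (illustrative only).

    Encodes a sequence of word vectors into per-step hidden states;
    a linear readout reconstructs the input vectors (serial recall).
    """
    def __init__(self, d_word, d_hidden, seed=0):
        rng = np.random.default_rng(seed)
        scale = 0.1
        self.W_in = rng.normal(0, scale, (d_hidden, d_word))    # input -> hidden
        self.W_rec = rng.normal(0, scale, (d_hidden, d_hidden)) # hidden -> hidden
        self.W_out = rng.normal(0, scale, (d_word, d_hidden))   # hidden -> recall
        self.b = np.zeros(d_hidden)

    def encode(self, words):
        """words: (T, d_word) array; returns (T, d_hidden) sigmoid states."""
        h = np.zeros(self.W_rec.shape[0])
        states = []
        for x in words:
            h = sigmoid(self.W_in @ x + self.W_rec @ h + self.b)
            states.append(h)
        return np.array(states)

    def recall(self, states):
        """Linear readout of word vectors from the hidden states."""
        return states @ self.W_out.T

# A toy "sentence": 5 words as random 8-dimensional vectors.
rng = np.random.default_rng(1)
sent = rng.normal(size=(5, 8))
rnn = SigmoidRNN(d_word=8, d_hidden=16)
H = rnn.encode(sent)        # (5, 16) hidden representations
recalled = rnn.recall(H)    # (5, 8) reconstructed word vectors
```

In the thesis's framing, the hidden states `H` are the learned sentence representations; training (not sketched here) would fit the weights so that `recalled` matches the input sequence and the hidden states predict the next sequence.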
Appears in Collections:Computer Science, 1988-2020

Files in This Item:
PUTheses2015-Kim_Young_Sun_Lisa.pdf (1.58 MB, Adobe PDF) - Request a copy


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.