Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01v692t8641
Full metadata record

dc.contributor.advisor: Fellbaum, Christiane
dc.contributor.author: Madge, Saahil
dc.date.accessioned: 2016-06-29T14:22:39Z
dc.date.available: 2016-06-29T14:22:39Z
dc.date.created: 2016-04-29
dc.date.issued: 2016-06-29
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/dsp01v692t8641
dc.description.abstract: We present a general approach to machine comprehension tasks by converting the text to a knowledge graph and the questions to queries on that graph. We extend [19] and use the Stanford NLP Toolkit’s Dependency Parser [17, 9] to transform each sentence into a set of entity-relation triples. We use word2vec [18] to convert the questions into queries on the graph. We present a tensor decomposition approach to answering queries by adding Semantically Smooth Embedding [11] to RESCAL [20]. We also generalize the Memory Networks [28, 25] architecture to take any knowledge graph as input. We evaluate these models on three full SAT reading comprehension tests. The models presented here outperform their respective baselines, and both demonstrate the ability to capture the semantic and structural information in the text and to answer questions using that information.
dc.format.extent: 57
dc.language.iso: en_US
dc.title: Tensor Decomposition and Memory Networks for SAT Reading Comprehension
dc.type: Princeton University Senior Theses
pu.date.classyear: 2016
pu.department: Computer Science
pu.pdf.coverpage: SeniorThesisCoverPage
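The abstract's query-answering model builds on RESCAL, which represents a knowledge graph of n entities and m relation types as an m x n x n tensor X and approximates each relation slice as X[k] ≈ A R[k] Aᵀ, with A holding shared entity embeddings and R[k] a relation-specific mixing matrix. The following is a minimal illustrative sketch, not the thesis's implementation: it fits the factors by plain gradient descent on a synthetic tensor (RESCAL proper uses alternating least squares, and the thesis additionally adds Semantically Smooth Embedding), and all names and sizes here are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 8, 3, 4  # entities, relation types, embedding rank (arbitrary toy sizes)

# Build a synthetic knowledge-graph tensor from hidden ground-truth factors.
A_true = rng.normal(size=(n, r))
R_true = rng.normal(size=(m, r, r))
X = np.stack([A_true @ R_true[k] @ A_true.T for k in range(m)])

# Small random initialization of the factors to be learned.
A = rng.normal(scale=0.1, size=(n, r))
R = rng.normal(scale=0.1, size=(m, r, r))

def loss(A, R):
    """Squared reconstruction error summed over all relation slices."""
    return sum(np.sum((X[k] - A @ R[k] @ A.T) ** 2) for k in range(m))

lr = 1e-3
first = loss(A, R)
for _ in range(2000):
    for k in range(m):
        E = A @ R[k] @ A.T - X[k]                    # residual for slice k
        gA = 2 * (E @ A @ R[k].T + E.T @ A @ R[k])   # d(loss_k)/dA
        gR = 2 * (A.T @ E @ A)                       # d(loss_k)/dR[k]
        A -= lr * gA
        R[k] -= lr * gR
final = loss(A, R)
print(first, final)  # reconstruction error shrinks as the factors fit X
```

Once trained, a query such as "does relation k hold between entities i and j?" is scored by the entry (A @ R[k] @ A.T)[i, j], which is what makes the factorization usable for answering questions posed against the graph.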
Appears in Collections: Computer Science, 1988-2020

Files in This Item:
Madge_Saahil_2016_thesis.pdf (905.23 kB, Adobe PDF)

