Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01rv042x03m
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | Cohen, Jonathan | -
dc.contributor.author | Sinha, Ishan | -
dc.date.accessioned | 2020-08-12T13:53:54Z | -
dc.date.available | 2020-08-12T13:53:54Z | -
dc.date.created | 2020-05-12 | -
dc.date.issued | 2020-08-12 | -
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/dsp01rv042x03m | -
dc.description.abstract | The ability to extrapolate knowledge from familiar to novel domains is a defining feature of human intelligence. Contemporary neural network techniques, however, are primarily limited to interpolation among data in their training experience. In this work, we focus on neural networks’ capacity for arbitrary role-filler binding, the ability to associate abstract “roles” with context-specific “fillers,” a capacity that many have argued is an important mechanism underlying the ability to extrapolate. Using a simplified version of Raven’s Progressive Matrices, a hallmark test of human intelligence, we introduce a sequential formulation of a visual problem-solving task that requires this form of binding. Further, we introduce the Arbitrary Binding Network, a recurrent neural network model augmented with an external memory, and empirically demonstrate that it successfully learns the underlying abstract rule structure of our task and perfectly generalizes this rule structure to novel fillers. | en_US
dc.format.mimetype | application/pdf | -
dc.language.iso | en | en_US
dc.title | A Memory-Augmented Neural Network Model of Abstract Rule Learning | en_US
dc.type | Princeton University Senior Theses | -
pu.date.classyear | 2020 | en_US
pu.department | Computer Science | en_US
pu.pdf.coverpage | SeniorThesisCoverPage | -
pu.contributor.authorid | 961190377 | -
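
Note: the abstract above describes the Arbitrary Binding Network only as "a recurrent neural network model augmented with an external memory." The sketch below is a minimal, generic illustration of that model class (a recurrent controller paired with a content-addressable key-value memory it can write fillers to and read them back from), written in PyTorch. All layer names, sizes, and the read/write scheme here are assumptions made for illustration; the thesis's actual architecture is specified only in the PDF.

# Illustrative sketch of a memory-augmented recurrent network (assumed design,
# not the thesis's Arbitrary Binding Network).
import torch
import torch.nn as nn

class MemoryAugmentedRNN(nn.Module):
    def __init__(self, input_dim, hidden_dim, mem_slots, mem_dim, output_dim):
        super().__init__()
        # Recurrent controller reads the current input plus the last memory read.
        self.controller = nn.LSTMCell(input_dim + mem_dim, hidden_dim)
        self.key_proj = nn.Linear(hidden_dim, mem_dim)    # read/write keys
        self.value_proj = nn.Linear(hidden_dim, mem_dim)  # values to store
        self.out_proj = nn.Linear(hidden_dim + mem_dim, output_dim)
        self.mem_slots, self.mem_dim = mem_slots, mem_dim

    def forward(self, x_seq):
        batch, seq_len, _ = x_seq.shape
        h = x_seq.new_zeros(batch, self.controller.hidden_size)
        c = torch.zeros_like(h)
        keys = x_seq.new_zeros(batch, self.mem_slots, self.mem_dim)
        values = x_seq.new_zeros(batch, self.mem_slots, self.mem_dim)
        read = x_seq.new_zeros(batch, self.mem_dim)
        outputs = []
        for t in range(seq_len):
            h, c = self.controller(torch.cat([x_seq[:, t], read], dim=-1), (h, c))
            # Content-based read: attend over stored keys with a query key.
            query = self.key_proj(h)
            attn = torch.softmax(torch.einsum("bd,bnd->bn", query, keys), dim=-1)
            read = torch.einsum("bn,bnd->bd", attn, values)
            # Simple write rule: overwrite the slot indexed by the time step.
            slot = t % self.mem_slots
            keys, values = keys.clone(), values.clone()
            keys[:, slot] = query
            values[:, slot] = self.value_proj(h)
            outputs.append(self.out_proj(torch.cat([h, read], dim=-1)))
        return torch.stack(outputs, dim=1)

For example, a sequence of panel encodings could be processed with model = MemoryAugmentedRNN(64, 128, 8, 32, 10) and y = model(torch.randn(4, 9, 64)); the separation of controller state from explicitly stored key-value pairs is what lets this class of model re-bind roles to novel fillers at test time.
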
Appears in Collections: Computer Science, 1988-2020

Files in This Item:
File | Description | Size | Format
SINHA-ISHAN-THESIS.pdf | | 5.41 MB | Adobe PDF (Request a copy)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.