Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01zw12z792j
Full metadata record
DC Field | Value | Language
dc.contributor.advisor | Ramadge, Peter J. | -
dc.contributor.author | Ho, Katy | -
dc.date.accessioned | 2017-07-24T14:24:14Z | -
dc.date.available | 2017-07-24T14:24:14Z | -
dc.date.created | 2017-05-08 | -
dc.date.issued | 2017-5-8 | -
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/dsp01zw12z792j | -
dc.description.abstract | This paper explores how different parameters of the memory of a recurrent neural network affect learning performance. The size of memory and the effects of long-term dependencies are studied in depth for simple yet interesting problems in which memory plays a central role. Different types of recurrent neural network architectures and structured recurrent neural networks are also explored. | en_US
dc.language.iso | en_US | en_US
dc.title | Learning With Memory in Neural Networks | en_US
dc.type | Princeton University Senior Theses | -
pu.date.classyear | 2017 | en_US
pu.department | Electrical Engineering | en_US
pu.pdf.coverpage | SeniorThesisCoverPage | -
pu.contributor.authorid | 960864782 | -
pu.contributor.advisorid | 010002518 | -
pu.certificate | Applications of Computing Program | en_US
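Note (illustrative aside, not drawn from the thesis): the abstract only names the setting, so the sketch below shows one hypothetical example of the kind of memory-centered toy problem it refers to, a recall task in which a vanilla RNN must reproduce a token seen many steps earlier, so performance depends directly on how much the hidden state can retain. The task, hyperparameters, and the role of hidden_dim as the "memory size" are assumptions made for illustration only.

    # Illustrative sketch only -- not code from the thesis.
    # A toy "recall" task: the network sees a marker symbol at the first
    # time step, then only filler tokens, and must output that symbol at
    # the end of the sequence. Success depends on how long the recurrent
    # hidden state can retain the early information; hidden_dim is the
    # assumed stand-in for the "memory size" parameter.
    import torch
    import torch.nn as nn

    def make_batch(batch_size=64, seq_len=20, n_symbols=8):
        # First position holds a random symbol; the rest is a fixed filler token.
        x = torch.full((batch_size, seq_len), n_symbols, dtype=torch.long)
        target = torch.randint(0, n_symbols, (batch_size,))
        x[:, 0] = target
        return x, target

    class RecallRNN(nn.Module):
        def __init__(self, n_symbols=8, hidden_dim=32):
            super().__init__()
            self.embed = nn.Embedding(n_symbols + 1, 16)  # +1 for the filler token
            self.rnn = nn.RNN(16, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, n_symbols)

        def forward(self, x):
            h, _ = self.rnn(self.embed(x))
            return self.out(h[:, -1])  # predict the marker from the final hidden state

    model = RecallRNN()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for step in range(500):
        x, y = make_batch(seq_len=20)  # a longer seq_len makes the memory demand harder
        loss = loss_fn(model(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()

Increasing seq_len lengthens the gap between seeing the marker and using it, which is one simple way to probe the long-term-dependency effects the abstract mentions; swapping nn.RNN for nn.LSTM would illustrate the architecture comparison.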
Appears in Collections: Electrical Engineering, 1932-2020

Files in This Item:
File | Size | Format
Ho_Katy.pdf | 1.58 MB | Adobe PDF (Request a copy)


Items in DataSpace are protected by copyright, with all rights reserved, unless otherwise indicated.