Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp01hm50tv16g
Title: Dynamics of Recurrent Neural Network Models of Working Memory
Authors: Wawrzonek, Christian
Advisors: Buschman, Timothy
Department: Computer Science
Class Year: 2016
Abstract: How do populations of neurons encode information over very short timescales? Given extensive training, neurons can change the weighted connections between them in order to encode information. However, timescales of only a few seconds are far too short for neural weights to change. Still, humans and higher-functioning animals possess the ability to encode and maintain small amounts of information presented over very short timescales. This is the problem of working memory: the transient holding, processing, and manipulation of information used in higher cognitive functioning. Previous computational models of working memory have typically been constructed with a strict, hand-tuned architecture. Here, we attempted to train a relatively simple, unconstrained neural network on complex working memory tasks and analyzed the natural solution space found by the network. A range of analyses shows that even a simple, single-layer recurrent network is capable of dynamic, generalized solutions without being presented with deliberate solution paths [8]. (An illustrative sketch of such a network follows the file listing below.)
Extent: 41 pages
URI: http://arks.princeton.edu/ark:/88435/dsp01hm50tv16g
Type of Material: Princeton University Senior Theses
Language: en_US
Appears in Collections: Computer Science, 1988-2020
Files in This Item:
File | Size | Format | Access
---|---|---|---
Wawrzonek_Christian_2016_Thesis.pdf | 3.1 MB | Adobe PDF | Request a copy
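The abstract above describes training a single-layer recurrent network on working memory tasks. Since the thesis text itself is request-only, here is a minimal Python sketch of what such a network's dynamics look like on a delayed-recall trial. Everything here is an illustrative assumption, not the thesis's actual setup: the layer sizes, the tanh dynamics, the trial timing, and the helper names (`make_trial`, `run`) are all hypothetical, and the weights are left random and untrained, whereas the thesis trains them.

```python
import numpy as np

# Minimal sketch of a single-layer vanilla recurrent network on a
# delayed-recall working-memory task. All sizes, the tanh nonlinearity,
# and the task structure are illustrative assumptions.

rng = np.random.default_rng(0)

N_IN, N_REC, N_OUT = 4, 100, 4        # assumed dimensions
T_STIM, T_DELAY, T_PROBE = 5, 20, 5   # assumed trial timing (time steps)

# Random (untrained) weights; training, e.g. backpropagation through
# time, would adjust these so the probe-period readout reports the cue.
W_in = rng.normal(0.0, 1.0 / np.sqrt(N_IN), (N_REC, N_IN))
W_rec = rng.normal(0.0, 1.0 / np.sqrt(N_REC), (N_REC, N_REC))
W_out = rng.normal(0.0, 1.0 / np.sqrt(N_REC), (N_OUT, N_REC))

def make_trial(cue):
    """One-hot cue shown briefly, then silent input through the delay and probe."""
    T = T_STIM + T_DELAY + T_PROBE
    x = np.zeros((T, N_IN))
    x[:T_STIM, cue] = 1.0
    return x

def run(x):
    """Iterate the recurrent dynamics h_t = tanh(W_rec h_{t-1} + W_in x_t)."""
    h = np.zeros(N_REC)
    outputs = []
    for x_t in x:
        h = np.tanh(W_rec @ h + W_in @ x_t)
        outputs.append(W_out @ h)  # linear readout at every step
    return np.array(outputs)

x = make_trial(cue=2)
y = run(x)
print("mean readout during probe:", y[-T_PROBE:].mean(axis=0))
```

The point of the sketch is that the cue can only reach the probe-period readout through the recurrent state `h`, since the input is silent during the delay; a trained network must therefore maintain the cue in its ongoing dynamics, which is the working-memory behavior the abstract says emerges without a hand-tuned architecture.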