Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01v692t914h
Title: Pseudo-Random Weight Initialization in Deep Neural Networks
Authors: Draper, John
Advisors: Adams, Ryan P
Sly, Allan
Department: Mathematics
Class Year: 2020
Abstract: Deep neural networks are highly complex learning models whose performance depends strongly on the random initialization of their connecting weights. Training two neural networks of the same architecture from different weight initializations will almost inevitably produce two completely different networks, each learning to perform its task in a unique way. This thesis investigates the use of an elliptical sampling technique to produce pseudo-random initializations for a rudimentary deep neural network. By using this technique to gradually change the weight initialization of our network, we hope to gain a better understanding of the complicated mechanics underlying its learning process. Additionally, we present the method of iterative weight refinement, which takes advantage of elliptical sampling techniques to optimize a given performance metric by repeatedly improving the choice of weight initialization. While this technique has certain limitations, it offers a clear and systematic method for manipulating weight initializations to improve the performance of a neural network.
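The abstract's notion of "gradually changing" an initialization via elliptical sampling is plausibly related to the rotation used in elliptical slice sampling, in which any point on the ellipse through two independent Gaussian draws is itself a draw from the same Gaussian. The sketch below illustrates that idea only; the function name, layer shape, and interpolation schedule are illustrative assumptions, not the thesis's actual method.

```python
import numpy as np

def elliptical_interpolate(w0, w1, theta):
    """Point on the ellipse defined by two independent N(0, sigma^2) draws.

    For any angle theta, cos(theta)*w0 + sin(theta)*w1 has the same
    distribution as w0, so the result is itself a valid initialization.
    """
    return np.cos(theta) * w0 + np.sin(theta) * w1

rng = np.random.default_rng(0)
sigma = 0.1
shape = (784, 128)                      # hypothetical single layer of a small MLP
w0 = rng.normal(0.0, sigma, shape)      # original initialization
w1 = rng.normal(0.0, sigma, shape)      # auxiliary draw defining the ellipse

# Sweeping theta from 0 to pi/2 morphs w0 gradually into w1 while every
# intermediate point remains a legitimate N(0, sigma^2) initialization.
for theta in np.linspace(0.0, np.pi / 2, 5):
    w = elliptical_interpolate(w0, w1, theta)
    print(f"theta={theta:.2f}  std={w.std():.4f}")  # std stays near sigma
```

The distributional invariance follows from cos²θ + sin²θ = 1: the variance of the combination is cos²θ·σ² + sin²θ·σ² = σ² at every angle, which is what makes a smooth sweep over θ a family of comparable initializations rather than a degenerate blend.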
URI: http://arks.princeton.edu/ark:/88435/dsp01v692t914h
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Mathematics, 1934-2020

Files in This Item:
File: DRAPER-JACK-THESIS.pdf (1.4 MB, Adobe PDF) — Request a copy


Items in DataSpace are protected by copyright, with all rights reserved, unless otherwise indicated.