Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01k0698b331
Title: Sparse and Efficient Transfer Learning via Winning Lottery Tickets
Authors: Mehta, Rahul
Advisors: Arora, Sanjeev
Department: Computer Science
Certificate Program: Center for Statistics and Machine Learning
Class Year: 2019
Abstract: In this thesis, we extend the Lottery Ticket Hypothesis of Frankle & Carbin (ICLR '19) to a variety of transfer learning problems. We identify sparse, trainable sub-networks that can be found on a source dataset and transferred to a range of downstream tasks. Our results show that sparse sub-networks with approximately 85-95% of weights removed exceed the accuracy of the original network when transferred to other tasks. We experimentally show that a sparse representation learned by a deep convolutional network trained on CIFAR-10 can be transferred to SmallNORB and FashionMNIST in a number of realistic settings. In addition, we show the existence of the first sparse, trainable sub-networks for natural language tasks; in particular, we show that BERT with up to 81.5% of parameters removed can reach the original test accuracy on the CoNLL-2003 Named Entity Recognition task.
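The abstract describes finding winning tickets on a source task and transferring them downstream. For readers unfamiliar with the underlying procedure, the following is a minimal PyTorch sketch of iterative magnitude pruning followed by ticket transfer; it is an illustration of the general technique, not the thesis's actual code. The helpers `train_fn` and `finetune_fn`, and the parameters `prune_fraction` and `rounds`, are hypothetical placeholders supplied by the caller.

```python
import copy
import torch

def find_winning_ticket(model, train_fn, prune_fraction=0.2, rounds=5):
    """Iterative magnitude pruning in the style of Frankle & Carbin.

    `train_fn(model, masks)` is a hypothetical caller-supplied routine
    that trains the model on the source task while keeping masked
    weights at zero (e.g. by multiplying each gradient by its mask).
    Returns the binary masks and the original initialization, which
    together define the sparse "winning ticket".
    """
    init_state = copy.deepcopy(model.state_dict())  # theta_0, kept for rewinding
    masks = {name: torch.ones_like(p)
             for name, p in model.named_parameters() if p.dim() > 1}

    for _ in range(rounds):
        train_fn(model, masks)  # train the surviving weights on the source task
        with torch.no_grad():
            for name, p in model.named_parameters():
                if name not in masks:
                    continue
                # rank surviving weights by magnitude, drop the smallest fraction
                alive = p[masks[name].bool()].abs()
                k = int(prune_fraction * alive.numel())
                if k == 0:
                    continue
                threshold = alive.kthvalue(k).values
                masks[name] *= (p.abs() > threshold).float()
            # rewind the surviving weights to their original initialization
            model.load_state_dict(init_state)
            for name, p in model.named_parameters():
                if name in masks:
                    p *= masks[name]

    return masks, init_state

def transfer_ticket(model, masks, init_state, finetune_fn):
    """Reset the model to the ticket, then train on a downstream task."""
    model.load_state_dict(init_state)
    with torch.no_grad():
        for name, p in model.named_parameters():
            if name in masks:
                p *= masks[name]
    finetune_fn(model, masks)  # e.g. train on FashionMNIST with masks fixed
```

The key design choice in this family of methods is rewinding surviving weights to their original initialization rather than re-initializing randomly; that pairing of mask and initialization is what makes the sub-network a "winning ticket" that remains trainable when moved to a new task.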
URI: http://arks.princeton.edu/ark:/88435/dsp01k0698b331
Type of Material: Princeton University Senior Theses
Language: en
Appears in Collections: Computer Science, 1988-2020

Files in This Item:
File: MEHTA-RAHUL-THESIS.pdf
Size: 1.1 MB
Format: Adobe PDF
Access: Request a copy

