Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp01j67316077
Title: Dictionary Learning and Anti-Concentration
Authors: Simchowitz, Max
Advisors: Arora, Sanjeev
Contributors: Singer, Amit
Department: Mathematics
Class Year: 2015
Abstract: Concentration of measure is central to statistical learning theory; this thesis aims to motivate anti-concentration as a promising and under-utilized toolkit for the design and analysis of statistical learning algorithms. The thesis focuses on learning incoherent dictionaries A∗ from observations y = A∗x, where x is a sparse coefficient vector drawn from a generative model. We impose an exceedingly simple anti-concentration property on the entries of x, which we call (C, ρ)-smoothness. Leveraging this assumption, we present the first computationally efficient, provably correct algorithms to approximately recover A∗ even in the setting where neither the non-zero coordinates of x are guaranteed to be Ω(1) in magnitude, nor are the supports of x chosen in a uniform fashion. As an application of our analytical framework, we present an algorithm, with run-time and sample complexity polynomial in the dimensions of A∗ and logarithmic in the desired precision, that learns a class of randomly generated Non-Negative Matrix Factorization instances up to arbitrary inverse polynomial error, with high probability.
Extent: 99 pages
URI: http://arks.princeton.edu/ark:/88435/dsp01j67316077
Type of Material: Princeton University Senior Theses
Language: en_US
Appears in Collections: Mathematics, 1934-2020
Files in This Item:

| File | Size | Format | Access |
|---|---|---|---|
| PUTheses2015-Simchowitz_Max.pdf | 772.88 kB | Adobe PDF | Request a copy |
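To make the generative model described in the Abstract concrete, the following is a minimal Python sketch of drawing observations y = A∗x from a random incoherent dictionary with sparse coefficients whose non-zero entries follow a smooth (anti-concentrated) distribution. The dimensions, sparsity level, and Gaussian choices are illustrative assumptions, not the thesis's actual construction or recovery algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: signal dimension, dictionary size, sparsity level.
n, m, k = 64, 128, 5

# Random Gaussian dictionary with unit-norm columns; such dictionaries are
# incoherent with high probability.
A_star = rng.standard_normal((n, m))
A_star /= np.linalg.norm(A_star, axis=0, keepdims=True)

def sample_observation():
    """Draw one sample y = A* x with a k-sparse, smoothly distributed x."""
    x = np.zeros(m)
    # The thesis does not require the support to be chosen uniformly;
    # a uniform choice is used here only for simplicity.
    support = rng.choice(m, size=k, replace=False)
    # Non-zero entries drawn from a continuous density, so they are
    # anti-concentrated but may be arbitrarily small in magnitude,
    # unlike the usual Omega(1) lower-bound assumption.
    x[support] = rng.standard_normal(k)
    return A_star @ x, x

y, x = sample_observation()
print(y.shape, np.count_nonzero(x))
```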
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.