Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01b2773z61t
Title: Statistics Meets Nonconvex Optimization: Computational Efficiency and Uncertainty Quantification
Authors: Ma, Cong
Advisors: Chen, Yuxin
Fan, Jianqing
Contributors: Operations Research and Financial Engineering Department
Subjects: Statistics
Operations research
Issue Date: 2020
Publisher: Princeton, NJ : Princeton University
Abstract: In recent years, there has been an explosion of interest in designing fast nonconvex optimization algorithms to solve statistical estimation and learning problems. However, in contrast to convex optimization, which has become a pillar of modern engineering, the theoretical foundations of nonconvex optimization remain far from satisfactory, especially in terms of its computational and inferential properties. The main focus of this thesis is therefore to develop a fundamental understanding of nonconvex optimization in statistical estimation and inference. Starting with statistical estimation, we have shown that
• vanilla gradient descent, in conjunction with spectral initialization, converges linearly for a family of nonconvex statistical learning problems, including phase retrieval, matrix completion, and blind deconvolution;
• gradient descent, when initialized randomly, implicitly avoids all the saddle points, thus enabling optimal convergence rates and statistical accuracy at once for the phase retrieval problem.
In addition to demonstrating the computational advantages of nonconvex optimization, we also uncover an interesting and inherent connection between nonconvex optimization and convex relaxation: they are essentially equivalent for solving the noisy matrix completion problem. This connection allows us to establish the minimax optimality of convex relaxation vis-à-vis random noise. Going beyond estimation, we further develop a simple procedure to compensate for the bias of the widely used convex and nonconvex estimators for noisy matrix completion. The resulting de-biased estimators admit nearly precise non-asymptotic distributional characterizations, which in turn enable optimal construction of confidence intervals/regions for, say, the missing entries and the low-rank factors. All of this is achieved via an integrated view of statistics and optimization, which we highlight throughout the thesis.
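To make the first bullet concrete, the following is a minimal, illustrative sketch (not taken from the thesis) of the Wirtinger-flow-style recipe for phase retrieval it refers to: a spectral initialization followed by vanilla gradient descent on the least-squares loss f(x) = (1/4m) Σᵢ ((aᵢᵀx)² − yᵢ)². The problem sizes, step size, and iteration count here are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem setup: recover x_star from phaseless measurements y_i = (a_i^T x_star)^2.
n, m = 20, 400                       # signal dimension and number of measurements
x_star = rng.standard_normal(n)
A = rng.standard_normal((m, n))      # Gaussian sensing vectors a_i as rows
y = (A @ x_star) ** 2

# Spectral initialization: leading eigenvector of (1/m) * sum_i y_i a_i a_i^T,
# rescaled so its norm matches the scale suggested by the mean of y.
Y = (A.T * y) @ A / m
eigvals, eigvecs = np.linalg.eigh(Y)
x = np.sqrt(y.mean()) * eigvecs[:, -1]

# Vanilla gradient descent on f(x) = (1/4m) * sum_i ((a_i^T x)^2 - y_i)^2;
# its gradient is (1/m) * sum_i ((a_i^T x)^2 - y_i) (a_i^T x) a_i.
eta = 0.1 / y.mean()                 # heuristic step size scaling
for _ in range(500):
    Ax = A @ x
    grad = A.T @ ((Ax ** 2 - y) * Ax) / m
    x -= eta * grad

# The sign of x_star is unidentifiable from y, so measure error up to global sign.
rel_err = min(np.linalg.norm(x - x_star),
              np.linalg.norm(x + x_star)) / np.linalg.norm(x_star)
```

With enough Gaussian measurements (here m/n = 20), the spectral estimate lands close enough to the truth that gradient descent converges linearly to x_star up to the global sign, mirroring the linear-convergence claim in the abstract.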
URI: http://arks.princeton.edu/ark:/88435/dsp01b2773z61t
Alternate format: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: catalog.princeton.edu
Type of Material: Academic dissertations (Ph.D.)
Language: en
Appears in Collections: Operations Research and Financial Engineering

Files in This Item:
File: Ma_princeton_0181D_13339.pdf (8.04 MB, Adobe PDF)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.