Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp01s7526g34f
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Mittal, Prateek | - |
dc.contributor.author | Wagh, Sameer | - |
dc.contributor.other | Electrical Engineering Department | - |
dc.date.accessioned | 2020-07-13T03:32:27Z | - |
dc.date.available | 2020-07-13T03:32:27Z | - |
dc.date.issued | 2020 | - |
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/dsp01s7526g34f | - |
dc.description.abstract | Applications of machine learning have become increasingly common in recent years. For instance, navigation systems like Google Maps use machine learning to better predict traffic patterns, while Facebook, LinkedIn, and other social media platforms use it to customize users' news feeds. Central to all of these systems is user data. However, the sensitive nature of the collected data has also raised a number of privacy concerns. Privacy-preserving machine learning enables systems that perform such computation over sensitive data while protecting its privacy. In this dissertation, we focus on developing efficient protocols for machine learning as a target analytics application. To incorporate privacy, we use a multi-party computation-based approach: a number of non-colluding entities jointly perform computation over the data, and privacy stems from the fact that no single party holds any information about the data being computed on (a minimal secret-sharing sketch illustrating this idea appears after this record). At the heart of this dissertation are three frameworks -- SecureNN, FALCON, and Ponytail -- each of which pushes the frontiers of privacy-preserving machine learning and proposes novel approaches to protocol design. SecureNN and FALCON introduce, for the first time, highly efficient protocols for computing non-linear functions (such as the rectified linear unit, maxpool, and batch normalization) using purely modular arithmetic. Ponytail demonstrates the use of homomorphic encryption to significantly improve over prior art in private matrix multiplication. Each framework provides significant asymptotic as well as concrete efficiency gains over prior work, improving both computation and communication performance by an order of magnitude. These building blocks -- matrix multiplication, rectified linear unit, maxpool, batch normalization -- are central to machine learning, and improving them yields substantial gains over prior art in private machine learning. Furthermore, each of these systems is implemented and benchmarked to lower the barrier to deployment. Uniquely positioned at the intersection of theory and practice, these frameworks bridge the gap between plaintext and privacy-preserving computation while contributing new research directions to the community. | - |
dc.language.iso | en | - |
dc.publisher | Princeton, NJ : Princeton University | - |
dc.relation.isformatof | The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: catalog.princeton.edu | - |
dc.subject | Applied Cryptography | - |
dc.subject | Homomorphic Encryption | - |
dc.subject | Multi-Party Computation | - |
dc.subject | Privacy Enhancing Technologies | - |
dc.subject | Privacy-Preserving Machine Learning | - |
dc.subject.classification | Computer science | - |
dc.subject.classification | Computer engineering | - |
dc.title | New Directions in Efficient Privacy-Preserving Machine Learning | - |
dc.type | Academic dissertations (Ph.D.) | - |
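
The abstract's multi-party computation approach rests on additive secret sharing over a ring such as Z_{2^64}, the same purely modular arithmetic that SecureNN and FALCON build their non-linear protocols on. The sketch below is illustrative only, assuming a 3-party split and hypothetical `share`/`beaver_mul` helpers for exposition; it is not the dissertation's actual protocols.

```python
# Minimal sketch: additive secret sharing over Z_{2^64} (illustrative,
# not SecureNN/FALCON's actual protocols).
import secrets

MOD = 2 ** 64  # the ring Z_{2^64}

def share(x, n=3):
    """Split x into n additive shares summing to x mod 2^64.
    Any n-1 shares are uniformly random, so they reveal nothing about x."""
    shares = [secrets.randbelow(MOD) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % MOD)
    return shares

def reconstruct(shares):
    return sum(shares) % MOD

def beaver_mul(x_sh, y_sh, triple):
    """Multiply two shared values using a preprocessed Beaver triple
    (a, b, c) with c = a*b. Parties open d = x - a and e = y - b, then
    x*y = d*e + d*b + e*a + c, which they compute share-wise."""
    a_sh, b_sh, c_sh = triple
    d = reconstruct([(x - a) % MOD for x, a in zip(x_sh, a_sh)])
    e = reconstruct([(y - b) % MOD for y, b in zip(y_sh, b_sh)])
    z_sh = [(d * b + e * a + c) % MOD for a, b, c in zip(a_sh, b_sh, c_sh)]
    z_sh[0] = (z_sh[0] + d * e) % MOD  # public d*e term added by one party
    return z_sh

# Linear operations are local: parties just add their own shares.
x_sh, y_sh = share(42), share(100)
sum_sh = [(a + b) % MOD for a, b in zip(x_sh, y_sh)]
assert reconstruct(sum_sh) == 142

# Multiplication needs one round of interaction plus a triple.
a, b = secrets.randbelow(MOD), secrets.randbelow(MOD)
triple = (share(a), share(b), share(a * b % MOD))
prod_sh = beaver_mul(share(6), share(7), triple)
assert reconstruct(prod_sh) == 42
```

Comparisons, and hence ReLU and maxpool, cannot be evaluated this locally over additive shares, which is why the dedicated modular-arithmetic protocols the abstract credits to SecureNN and FALCON matter.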
Appears in Collections: Electrical Engineering
Files in This Item:
File | Description | Size | Format
---|---|---|---
Wagh_princeton_0181D_13320.pdf | | 1.87 MB | Adobe PDF