Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp0179408079v
Title: On the Two-Sample Statistic Approach to Generative Adversarial Networks
Authors: Liu, Lydia
Advisors: Liu, Han
Department: Operations Research and Financial Engineering
Certificate Program: Applications of Computing Program
Class Year: 2017
Abstract: We investigate the deeper use of maximum mean discrepancy (MMD), a statistical measure of the distance between distributions, in training generative adversarial networks (GANs), a framework for generative modeling with deep neural networks. The algorithm that uses MMD as the criterion for training generative models parametrized by deep neural networks is called generative moment matching networks (GMMN). One goal of this work is to understand when MMD is a more effective loss function for training neural samplers than the GAN objective. In experiments with simulated data, we found that the original GAN actually performs worse than GMMN when the data lacks low-dimensional structure. We also explore extensions of MMD, in particular data-adaptive ones, as the loss criterion in GMMN. Our results suggest that GMMN could achieve state-of-the-art results with more sophisticated variants of MMD. Finally, we show that MMD can serve as a regularizer that improves the stability of GAN training.
URI: http://arks.princeton.edu/ark:/88435/dsp0179408079v
Type of Material: Princeton University Senior Theses
Language: en_US
Appears in Collections: Operations Research and Financial Engineering, 2000-2019
Files in This Item:
File | Size | Format
---|---|---
thesis.pdf | 2.28 MB | Adobe PDF
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.
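The MMD statistic at the center of the abstract has a simple closed-form estimator: the squared distance between two samples is the sum of average within-sample kernel similarities minus twice the average cross-sample similarity. The sketch below (not taken from the thesis; function names and the Gaussian-kernel bandwidth are assumptions for illustration) computes the biased MMD² estimator that GMMN-style training minimizes as a loss.

```python
# Illustrative sketch of the squared maximum mean discrepancy (MMD^2)
# under a Gaussian RBF kernel -- the two-sample statistic GMMN uses as
# a training loss. Names and bandwidth are assumptions, not thesis code.
import numpy as np

def rbf_kernel(a, b, bandwidth=1.0):
    # k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2)), computed pairwise
    sq_dists = (np.sum(a**2, axis=1)[:, None]
                + np.sum(b**2, axis=1)[None, :]
                - 2.0 * a @ b.T)
    return np.exp(-sq_dists / (2.0 * bandwidth**2))

def mmd_squared(x, y, bandwidth=1.0):
    # Biased estimator: E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)]
    return (rbf_kernel(x, x, bandwidth).mean()
            + rbf_kernel(y, y, bandwidth).mean()
            - 2.0 * rbf_kernel(x, y, bandwidth).mean())

rng = np.random.default_rng(0)
same = mmd_squared(rng.normal(size=(200, 2)), rng.normal(size=(200, 2)))
diff = mmd_squared(rng.normal(size=(200, 2)),
                   rng.normal(loc=3.0, size=(200, 2)))
print(same < diff)  # samples from the same distribution score lower
```

In a GMMN, `y` would be samples drawn by passing noise through a generator network, and this loss would be backpropagated through the generator; the kernel bandwidth (fixed here) is the kind of quantity the data-adaptive MMD variants in the abstract would tune.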