Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp01pz50h0025
Title: | Bayesian latent structure discovery for large-scale neural recordings |
Authors: | Wu, Anqi |
Advisors: | Pillow, Jonathan W.; Norman, Kenneth Andrew |
Contributors: | Neuroscience Department |
Keywords: | Bayesian probabilistic modeling; brain analysis; Gaussian process; latent variable model; neural recording; statistical model |
Subjects: | Neurosciences; Computer science; Artificial intelligence |
Issue Date: | 2019 |
Publisher: | Princeton, NJ : Princeton University |
Abstract: | Many studies in neuroscience posit that large-scale neural activity reflects noisy, high-dimensional observations of underlying low-dimensional signals of interest. One approach to identifying such signals is to develop latent variable models that formalize the relationship between low-dimensional signals and high-dimensional measurements of neural activity. The low-dimensional structures we extract can shed light on how information is encoded at the population level and provide scientific insight into the brain and human behavior. Recent advances in recording techniques have produced large amounts of neural data to analyze; with these data, we can develop inferential and statistical techniques to understand and interpret the latent structures underlying high-dimensional neural activity. In this thesis, we develop Bayesian latent variable models for five different neural tasks. In particular, we focus on Gaussian process latent variable models, which provide a flexible and interpretable way of modeling both the latent structures and the mapping from the latents to the observed neural data. Gaussian processes are powerful for imposing smoothness assumptions over functions, which we employ in different scenarios. However, because Gaussian-process-based models are computationally demanding, we also develop efficient and scalable inference methods for fitting them to data: a decoupled Laplace approximation and block coordinate descent for Gaussian process latent variable models, and a moment-based efficient estimator for quadratic convolutional subunit models. The discovered latent structures provide scientific insight into neural behavior and thereby facilitate a greater understanding of the brain. |
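For readers unfamiliar with the Gaussian process latent variable model (GPLVM) framework named in the abstract, the sketch below is a minimal, generic illustration of the idea (it is not the thesis implementation, and makes no use of the decoupled Laplace approximation or block coordinate descent described there): low-dimensional latents X are optimized so that an assumed RBF-kernel GP explains high-dimensional observations Y. All variable names and the toy data are hypothetical.

```python
# Minimal GPLVM sketch: optimize latent points X (N x Q) so that each column
# of the observations Y (N x D) is well explained by a GP with an RBF kernel.
# Illustrative only; not the inference methods developed in the thesis.
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix for latent points X (N x Q)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def neg_log_marglik(x_flat, Y, Q, noise=0.1):
    """Negative log marginal likelihood of Y, GP prior shared across columns."""
    N, D = Y.shape
    X = x_flat.reshape(N, Q)
    K = rbf_kernel(X) + noise**2 * np.eye(N)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))   # K^{-1} Y
    logdet = 2.0 * np.sum(np.log(np.diag(L)))              # log |K|
    return 0.5 * (D * logdet + np.sum(Y * alpha) + N * D * np.log(2.0 * np.pi))

# Toy data: 100 time points, 20 "neurons", 2 latent dimensions (all hypothetical).
rng = np.random.default_rng(0)
N, D, Q = 100, 20, 2
Y = rng.standard_normal((N, D))
X0 = 0.1 * rng.standard_normal(N * Q)                      # latent initialization

res = minimize(neg_log_marglik, X0, args=(Y, Q), method="L-BFGS-B")
X_hat = res.x.reshape(N, Q)                                # recovered latent trajectory
```

In practice the kernel hyperparameters would be optimized jointly with X, and scalable approximations (such as those developed in the thesis) replace the exact O(N^3) marginal likelihood used in this toy sketch.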
URI: | http://arks.princeton.edu/ark:/88435/dsp01pz50h0025 |
Alternate format: | The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: catalog.princeton.edu |
Type of Material: | Academic dissertations (Ph.D.) |
Language: | en |
Appears in Collections: | Neuroscience |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
wu_princeton_0181D_13116.pdf | | 27.68 MB | Adobe PDF |