Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01x633f1146
Title: Scalable Inference with Approximate Variational Inference on a Latent Source Model of the Brain
Authors: Saparov, Abulhair
Advisors: Norman, Ken
Contributors: Blei, David
Department: Computer Science
Class Year: 2013
Abstract: Bayesian probabilistic modeling is becoming an increasingly promising tool in neuroscience. Many brain imaging methods, such as electroencephalography (EEG) or functional magnetic resonance imaging (fMRI), do not have very high spatial resolution, so brain dynamics at small scales must be studied using other methods. Probabilistic models enable the principled exploration of phenomena that are difficult to measure, such as the processes governing the storage and retrieval of words. Topographic latent source analysis (TLSA) is a novel, fully Bayesian probabilistic model that describes brain activity as a covariate-dependent linear sum of latent sources of activity. Gibbs sampling was originally used to fit TLSA to brain imaging data, but as a sampling algorithm, its performance limited its scalability. Variational inference is a newer approach to Bayesian inference that transforms the problem of inference into one of optimization, providing greatly improved performance. As a generalization of expectation maximization (EM), variational inference is guaranteed to converge. Unfortunately, mean-field variational inference can only be applied to the class of models that satisfy conditional conjugacy, and TLSA does not fall into this class. More recent work successfully extended variational inference to a much broader class of models by using Laplace approximations: the variational distributions of the non-conjugate hidden variables are assumed to be normal. In my work, I apply Laplace variational inference to TLSA, deriving the variational updates. I discuss the issue of convergence in detail, along with approaches for avoiding convergence problems. Finally, I derive a stochastic variational inference version of the algorithm, which dramatically reduces its running time.
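As context for the Laplace-approximation idea the abstract describes, the following is a minimal sketch, not the thesis's actual TLSA updates: when a hidden variable is non-conjugate, its variational factor is taken to be Gaussian, centered at the mode of the (unnormalized) log posterior, with covariance given by the inverse of the negative Hessian at that mode. The log_joint here is a hypothetical toy target chosen only to make the sketch runnable.

    # A minimal sketch of one Laplace step: approximate a non-conjugate
    # factor by a Gaussian fit at the mode of the log density.
    # The target below is a toy stand-in, not the TLSA model.
    import numpy as np
    from scipy.optimize import minimize

    def log_joint(theta):
        # Toy non-conjugate target: logistic term plus a Gaussian prior.
        return -np.log1p(np.exp(-theta[0])) - 0.5 * theta[0] ** 2

    def numerical_hessian(f, x, eps=1e-5):
        # Central-difference Hessian; adequate for this 1-D illustration.
        n = x.size
        H = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                e_i = np.zeros(n); e_i[i] = eps
                e_j = np.zeros(n); e_j[j] = eps
                H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                           - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * eps ** 2)
        return H

    # Find the mode, then form the Gaussian factor q(theta) = N(mode, -H^{-1}).
    result = minimize(lambda t: -log_joint(t), x0=np.zeros(1))
    mode = result.x
    cov = np.linalg.inv(-numerical_hessian(log_joint, mode))
    print("q(theta) ~ Normal(mean=%.4f, var=%.4f)" % (mode[0], cov[0, 0]))

In Laplace variational inference this Gaussian fit replaces the closed-form coordinate update that mean-field methods would use for a conjugate factor, which is what lets the approach cover non-conjugate models such as TLSA.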
Extent: 50 pages
URI: http://arks.princeton.edu/ark:/88435/dsp01x633f1146
Access Restrictions: Walk-in Access. This thesis can only be viewed on computer terminals at the Mudd Manuscript Library.
Type of Material: Princeton University Senior Theses
Language: en_US
Appears in Collections: Computer Science, 1988-2020

Files in This Item:
File: Abulhair Saparov.pdf
Size: 1.21 MB
Format: Adobe PDF

