Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/99999/fk4bz7nf10
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Pillow, Jonathan W | |
dc.contributor.author | Morais, Michael J | |
dc.contributor.other | Neuroscience Department | |
dc.date.accessioned | 2021-06-10T17:14:58Z | - |
dc.date.available | 2021-06-10T17:14:58Z | - |
dc.date.issued | 2021 | |
dc.identifier.uri | http://arks.princeton.edu/ark:/99999/fk4bz7nf10 | - |
dc.description.abstract | One fundamental goal of theoretical neuroscience is to understand the normative principles governing the functional organization of neural circuits, and, in turn, to what extent they can be considered optimal. Calling neural representations of information in the brain "optimal" implies a multifarious equilibrium that balances robustness against flexibility, completeness against relevance, and so on, but it need only imply a solution to some optimization program. The exact forms of these programs vary with the modeling goals, neural circuits, tasks, or even animals under investigation. With this dissertation, we explore how we can define neural codes as optimal when they generate optimal behavior, an easy principle to state but a hard one to implement. Such a principle would bridge the gap between the two classical hypotheses of optimal neural coding, efficient coding and the Bayesian brain, with a common unified theory. In the first study, we analyzed neural population activity in V1 while monkeys performed a visual detection task, and found that a majority of the total choice-related variability is already present in V1 population activity. Such a prominent contribution of non-stimulus activity in classically sensory regions cannot be incorporated into existing models of neural coding, and demands models that can jointly optimize coding and decision-making within a single neural population. In the second study, we derived power-law efficient codes, a natural generalization of classical efficient codes, and showed that they are sufficient to replicate and explain a diverse set of psychophysical results. This broader family can maximize mutual information or minimize the error of perceptual decisions, suggesting that psychophysical phenomena used to validate normative models could be more general features of perceptual systems than previously appreciated. In the third study, we translated the problem of joint model learning and decision-making into Bayesian machine learning, and extended a family of methods for decision-aware approximate inference with a novel algorithm that we call loss-calibrated expectation propagation. Understanding how this problem can be solved by a non-biophysical system could serve as a constructive reference point for future studies of joint coding and decision-making, and of the normative principles that drive decision-related variability in optimal sensory neural codes. | |
dc.language.iso | en | |
dc.publisher | Princeton, NJ : Princeton University | |
dc.relation.isformatof | The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: <a href="http://catalog.princeton.edu">catalog.princeton.edu</a> | |
dc.subject | approximate inference | |
dc.subject | Bayesian statistics | |
dc.subject | decision-making | |
dc.subject | efficient coding | |
dc.subject | neural coding | |
dc.subject | perception | |
dc.subject.classification | Neurosciences | |
dc.subject.classification | Statistics | |
dc.title | Approximate Bayesian methods for optimal neural coding and decision-making | |
dc.type | Academic dissertations (Ph.D.) | |
Appears in Collections: | Neuroscience |
Files in This Item:
File | Size | Format | |
---|---|---|---|
Morais_princeton_0181D_13681.pdf | 9.17 MB | Adobe PDF | View/Download |
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.