State-Space Models and Latent Processes in the Statistical Analysis of Neural Data
- Vidne, Michael
- Thesis Advisor(s):
- Wiggins, Chris H.
- Ph.D., Columbia University
- Applied Physics and Applied Mathematics
- Persistent URL: https://doi.org/10.7916/D88058JW
- This thesis develops and applies statistical methods for the analysis of neural data. In the second chapter we incorporate a latent process into the generalized linear model (GLM) framework. We develop and apply this framework to estimate the linear filters of an entire population of retinal ganglion cells while accounting for the common noise the cells may share. We capture both the encoding of the visual stimulus into the neural code and its decoding; our formalism yields insight into the underlying architecture of the neural system, and we are able to estimate the common noise the cells receive. In the third chapter we discuss methods for optimally inferring the synaptic inputs to an electrotonically compact neuron, given intracellular voltage-clamp or current-clamp recordings from the postsynaptic cell. These methods are based on sequential Monte Carlo techniques ("particle filtering"). We demonstrate, on model data, that these methods can accurately recover the time course of excitatory and inhibitory synaptic inputs on a single trial. In the fourth chapter we develop a more general approach to the state-space filtering problem. Our method solves the same recursive set of Markovian filter equations as the particle filter, but replaces all importance-sampling steps with a more general Markov chain Monte Carlo (MCMC) step. The algorithm is especially well suited to problems where the model parameters may be misspecified.
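To illustrate the sequential Monte Carlo idea underlying the third and fourth chapters, the following is a minimal bootstrap particle filter for a toy linear-Gaussian state-space model. This is a generic sketch, not the thesis's algorithm: the model parameters (`a`, `q`, `r`) and dimensions are hypothetical, and the thesis applies these techniques to synaptic-input inference rather than this scalar example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian state-space model (all parameters hypothetical):
#   x_t = a * x_{t-1} + process noise,   y_t = x_t + observation noise
a, q, r = 0.95, 0.1, 0.5   # transition coeff., process var., observation var.
T, N = 100, 500            # time steps, number of particles

# Simulate a latent trajectory and noisy observations
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0.0, np.sqrt(q))
y = x + rng.normal(0.0, np.sqrt(r), size=T)

# Bootstrap particle filter: propagate, weight by likelihood, resample
particles = rng.normal(0.0, 1.0, size=N)
means = np.zeros(T)
for t in range(T):
    if t > 0:
        # Propagate particles through the state transition ("prior proposal")
        particles = a * particles + rng.normal(0.0, np.sqrt(q), size=N)
    # Importance weights from the Gaussian observation likelihood
    w = np.exp(-0.5 * (y[t] - particles) ** 2 / r)
    w /= w.sum()
    means[t] = np.dot(w, particles)  # filtered posterior mean at time t
    # Multinomial resampling to combat weight degeneracy
    particles = particles[rng.choice(N, size=N, p=w)]

rmse = np.sqrt(np.mean((means - x) ** 2))  # filter error vs. true latent path
```

The importance-weighting step here is exactly what the fourth chapter proposes replacing: instead of weighting and resampling prior-proposed particles, each filtering update draws particles with an MCMC step targeting the same recursive posterior, which is more robust when the proposal (i.e., the model) is misspecified.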
- Suggested Citation:
- Michael Vidne, 2011, State-Space Models and Latent Processes in the Statistical Analysis of Neural Data, Columbia University Academic Commons, https://doi.org/10.7916/D88058JW.