Theses Doctoral

State-Space Models and Latent Processes in the Statistical Analysis of Neural Data

Vidne, Michael

This thesis develops and applies statistical methods for the analysis of neural data. In the second chapter we incorporate a latent process into the Generalized Linear Model (GLM) framework. We develop and apply this framework to estimate the linear filters of an entire population of retinal ganglion cells while accounting for the common noise the cells may share. This allows us to capture the encoding of the visual stimulus into the neural code and its decoding, gives us insight into the underlying architecture of the neural system, and lets us estimate the common noise that the cells receive. In the third chapter we discuss methods for optimally inferring the synaptic inputs to an electrotonically compact neuron, given intracellular voltage-clamp or current-clamp recordings from the postsynaptic cell. These methods are based on sequential Monte Carlo techniques ("particle filtering"). We demonstrate, on model data, that these methods can accurately recover the time course of excitatory and inhibitory synaptic inputs on a single trial. In the fourth chapter we develop a more general approach to the state-space filtering problem. Our method solves the same recursive set of Markovian filter equations as the particle filter, but replaces all importance-sampling steps with a more general Markov chain Monte Carlo (MCMC) step. The algorithm is especially well suited to problems in which the model parameters may be misspecified.
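To make the sequential Monte Carlo ("particle filtering") idea mentioned above concrete, the sketch below implements a generic bootstrap particle filter for a one-dimensional state-space model. The Gaussian random-walk dynamics, the Gaussian observation model, and all parameter values are illustrative assumptions for this sketch only; they are not the neural models developed in the thesis.

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=1000,
                              sigma_state=0.1, sigma_obs=0.5, seed=0):
    """Generic bootstrap particle filter for a 1-D latent state.

    Assumed (illustrative) model:
        x_t = x_{t-1} + N(0, sigma_state^2)   # latent dynamics
        y_t = x_t     + N(0, sigma_obs^2)     # observation
    Returns the posterior mean E[x_t | y_{1:t}] at each time step.
    """
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, size=n_particles)  # draw from a broad prior
    means = []
    for y in observations:
        # Propagate particles through the (assumed) random-walk dynamics.
        particles = particles + rng.normal(0.0, sigma_state, size=n_particles)
        # Importance weights from the (assumed) Gaussian likelihood.
        log_w = -0.5 * ((y - particles) / sigma_obs) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        means.append(np.dot(w, particles))
        # Multinomial resampling to avoid weight degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
    return np.array(means)

# Minimal usage example on synthetic data.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = np.cumsum(rng.normal(0.0, 0.1, size=200))   # true latent path
    y = x + rng.normal(0.0, 0.5, size=200)           # noisy observations
    x_hat = bootstrap_particle_filter(y)
    print("RMSE:", np.sqrt(np.mean((x_hat - x) ** 2)))
```

In the more general filtering approach described for the fourth chapter, the importance-sampling and resampling step above would be replaced by an MCMC step targeting the same recursive filtering distributions.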

Files

  • Vidne_columbia_0054D_10478.pdf (application/pdf, 14.9 MB)

More About This Work

Academic Units
Applied Physics and Applied Mathematics
Thesis Advisors
Wiggins, Chris H.
Degree
Ph.D., Columbia University
Published Here
December 20, 2011