Doctoral Thesis

Theories of structure, dynamics, and plasticity in neural circuits

Clark, David Goubeaux

Neural circuits generate cognition, sensation, and behavior through the coordinated activity of many interconnected units. Understanding how these functions emerge dynamically and what connectivity structures support this emergence is a central challenge in neuroscience. This challenge is compounded by neural circuits' essential features: large numbers of components (neurons), nonlinear dynamics, complex recurrent interactions, and plastic connectivity. This thesis develops theoretical approaches to tackle this complexity, using tools from physics, particularly dynamical mean-field theory (DMFT), to analyze how connectivity structure shapes collective neuronal dynamics and computational functions in nonlinear recurrent neural networks.

The chapters of this thesis are loosely organized around three themes. First, I investigate how connectivity structure shapes collective neuronal activity, focusing particularly on activity dimensionality (roughly, the number of high-variance modes). In Chapter 2, I develop a two-site cavity DMFT to calculate cross-covariances in random neural networks, revealing that networks with independent and identically distributed (i.i.d.) couplings exhibit extensive but fractionally low activity dimensionality and long population-level timescales. Chapter 3 extends this analysis using complementary path-integral fluctuation methods to handle the case of structured (non-i.i.d.) connectivity. Specifically, I introduce the random-mode model, which parameterizes coupling matrices using random input and output modes and enables control over the spectrum. Features of this spectrum manifest in the collective-activity properties that I compute, yet can be undetectable in analyses of single-neuron activity alone. I derive a simple relation between the effective rank of connectivity and activity dimensionality, and show how structured overlaps between input and output modes—a feature of biological circuits, as demonstrated using the Drosophila connectome—influence collective dynamics. In Chapter 4, I analyze multiregion neural networks where low-rank connectivity between regions, motivated by experimental studies, enables selective activity routing. Using cross-region currents as order parameters, I show that regions act as both generators and transmitters of activity—roles that are often in tension—and that effective signal routing can be achieved by exciting different high-dimensional activity patterns through connectivity structure and nonlinear dynamics.
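The relation between connectivity rank and activity dimensionality lends itself to a quick numerical check. The sketch below is illustrative only, not the DMFT calculation of Chapters 2–3: it builds a coupling matrix from random input and output modes in the spirit of the random-mode model (with a flat singular spectrum), simulates the rate dynamics, and estimates dimensionality via the participation ratio of the activity covariance. All parameter values (N, R, gain) are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, dt, T = 400, 0.05, 6000

# Couplings built from R random input/output mode pairs, J = gain * U V^T,
# in the spirit of the random-mode model (flat singular spectrum; the
# parameter values here are illustrative, not taken from the thesis)
R, gain = 200, 4.0
U = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, R))  # output modes
V = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, R))  # input modes
J = (U * gain) @ V.T

# Rate dynamics dx/dt = -x + J tanh(x), Euler-integrated
x = rng.normal(size=N)
samples = []
for t in range(T):
    x += dt * (-x + J @ np.tanh(x))
    if t >= T // 2:                       # discard the transient
        samples.append(np.tanh(x).copy())

# Activity dimensionality via the participation ratio of the covariance
# spectrum: D = (sum_i lambda_i)^2 / sum_i lambda_i^2
lam = np.linalg.eigvalsh(np.cov(np.array(samples).T))
D = lam.sum() ** 2 / (lam ** 2).sum()
print(f"activity dimensionality D = {D:.1f} (connectivity rank R = {R})")
```

Because the recurrent drive lies in the span of the output modes, the measured dimensionality cannot exceed R—a finite-size illustration of the rank–dimensionality relation.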

Second, I examine attractor networks that represent continuous variables or discrete patterns through collective dynamics. Chapter 5 addresses the challenge of reconciling idealized theoretical models (namely, continuous attractors) with heterogeneous experimental data in the context of the rodent head-direction system. I use an optimization principle to construct recurrent networks that match actual mouse head-direction cell responses while exhibiting quasi-continuous-attractor dynamics. Developing and validating a statistical generative process for these responses allows for large-N analysis of such data-derived networks. The connectivity matrix exhibits doublet degeneracy in its spectrum at large N, reflecting an underlying circular geometry embedded in a disorderly manner within neuronal space. Analysis through DMFT reveals that the system becomes equivalent to a classical ring-attractor model as N→∞, defined by circularly symmetric Mexican-hat interactions. This approach extends to higher-dimensional symmetries, including grid cells in medial entorhinal cortex. Chapter 6 challenges conventional interpretations of associative memory models for discrete patterns by analyzing dynamics beyond equilibrium. I derive DMFT equations for dense associative memory models, a generalization of Hopfield networks, and show that patterns can be transiently retrieved with high accuracy above the traditional capacity limit, where stable attractors have vanished, because slow regions persist near stored patterns as traces of former basins of attraction.
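The ring-attractor limit admits a compact numerical illustration. The sketch below is a generic textbook-style ring network, not the data-derived construction of Chapter 5: a rate network with circularly symmetric Mexican-hat interactions whose uniform state is unstable, so activity settles into a localized bump at an angle set by the initial condition. Parameter values are illustrative.

```python
import numpy as np

N, dt, T = 256, 0.1, 2000
theta = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)

# Circularly symmetric "Mexican-hat" interactions: uniform inhibition (J0 < 0)
# plus cosine-tuned excitation (J1 > 0); values chosen so the mode-1 gain
# J1/2 exceeds 1, destabilizing the uniform state (illustrative parameters)
J0, J1 = -1.0, 4.0
W = (J0 + J1 * np.cos(theta[:, None] - theta[None, :])) / N

# Rate dynamics dr/dt = -r + tanh(W r + h), from a random initial condition
rng = np.random.default_rng(1)
r, h = 0.1 * rng.random(N), 0.2
for _ in range(T):
    r += dt * (-r + np.tanh(W @ r + h))

# Steady state: a bump of activity peaked at some angle on the ring
print(f"bump peak at theta = {theta[np.argmax(r)]:.2f} rad, "
      f"contrast = {np.ptp(r):.2f}")
```

With a continuum of equivalent bump positions, the network represents a circular variable—the idealized structure that the data-derived networks of Chapter 5 recover in the large-N limit.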

Third, I explore plasticity and learning in neural networks. Chapter 7 studies networks where both neuronal units and synaptic couplings are dynamic variables, with couplings subject to Hebbian modification around quenched random strengths. This reveals a rich phase diagram. Hebbian plasticity can slow chaotic activity or induce chaos in quiescent networks, while anti-Hebbian plasticity quickens activity and produces an oscillatory component. Strong Hebbian plasticity segregates network timescales into two bands with a slow, synapse-dominated band driving the dynamics, suggesting a flipped view of the network as synapses connected by neurons. In chaotic states with strong Hebbian plasticity, I identify a phase of "freezable chaos" where stable fixed points of neuronal dynamics are continuously destabilized by synaptic dynamics, allowing any neuronal state to be stored as a stable fixed point by halting plasticity, thus offering a new working memory mechanism. Chapter 8 develops cavity methods for high-dimensional convex learning problems, providing unified analyses of perceptron classification of both points and manifolds, and kernel ridge regression by recognizing their shared bipartite structure. For perceptron-capacity problems, I identify a symmetry that allows derivation of correct capacities through a naïve method. Finally, turning to deep learning, Chapter 9 explores biologically plausible alternatives to backpropagation, presenting "global error-vector broadcasting" and "vectorized nonnegative networks" in which globally broadcast signals enable effective, i.e., gradient-aligned, credit assignment.
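The coupled neuron–synapse setting can be sketched in a few lines. The toy simulation below uses an illustrative form of the dynamics, not the exact equations of Chapter 7: neuronal activities evolve under quenched random couplings plus a plastic component that relaxes, on a slower timescale, toward a Hebbian outer product of the activities. All parameters (eps, tau_s, etc.) are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(2)
N, g, eps, tau_s, dt, T = 200, 1.8, 0.5, 10.0, 0.05, 4000

A = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # quenched random couplings
P = np.zeros((N, N))                              # plastic (Hebbian) component
x = rng.normal(size=N)

for _ in range(T):
    phi = np.tanh(x)
    # neuronal dynamics driven by quenched-plus-plastic couplings
    x += dt * (-x + (A + P) @ phi)
    # Hebbian plasticity relaxing toward the activity outer product on the
    # slower synaptic timescale tau_s (illustrative form of the update)
    P += (dt / tau_s) * (-P + (eps / np.sqrt(N)) * np.outer(phi, phi))

# "Freezing": halting plasticity leaves A + P fixed, so the network can be
# probed as a static system at the current synaptic state
J_frozen = A + P
print(f"activity std = {np.std(x):.2f}, "
      f"plastic coupling norm = {np.linalg.norm(P):.2f}")
```

Halting the plasticity update while continuing the neuronal dynamics is the operation underlying the "freezable chaos" mechanism: in the relevant regime, the instantaneous neuronal state becomes a stable fixed point of the frozen system.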

Overall, this thesis uses DMFT and other analytical and numerical tools—including random-matrix theory, iterative solution methods, and large-N simulations—as well as some data analysis, to make progress on various questions surrounding the structure-function relationship in large, nonlinear recurrent neural circuits. In the Introduction, I outline various open questions, particularly the challenge of understanding how neural circuits implement inherently high-dimensional computations through recurrent dynamics. In the Conclusion, I speculate on where the future could take us.

Files

  • Clark_columbia_0054D_19459.pdf (application/pdf, 8.23 MB)

More About This Work

Academic Units
Neurobiology and Behavior
Thesis Advisors
Abbott, Larry
Degree
Ph.D., Columbia University
Published Here
September 17, 2025