2021 Doctoral Thesis
A Mathematical Study of Learning Dynamics
Data-driven discovery of dynamics, in which observational data are used to learn unknown governing equations, is witnessing a resurgence of interest as data and computational tools become widespread and increasingly accessible. Advances in machine learning, data science, and neural networks are fueling new data-driven studies and rapidly changing the landscape in almost every field. Meanwhile, classical numerical analysis remains a steadfast tool for analyzing these new problems.
This thesis situates emerging work coupling machine learning, neural networks, and data-driven discovery of dynamics within classical numerical theory. We begin by formulating a universal learning framework grounded in optimization theory. We discuss how three paradigms of machine learning -- supervised, unsupervised, and reinforcement learning -- are encapsulated by this framework and yield a general learning problem for the discovery of dynamics.
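As a rough illustration of learning cast as optimization (a minimal sketch, not the thesis's formulation; the linear model, squared loss, and data here are purely illustrative), supervised learning reduces to minimizing a loss over model parameters, here by gradient descent:

```python
# Illustrative only: supervised learning as the optimization problem
#   min_theta  L(theta),
# solved by gradient descent on a mean-squared-error loss.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))      # observed inputs
w_true = np.array([2.0, -1.0])     # unknown "dynamics" to recover
y = X @ w_true                     # noiseless targets

w = np.zeros(2)                    # parameters theta
lr = 0.1
for _ in range(200):
    # gradient of (1/n) * ||X w - y||^2 with respect to w
    grad = 2 * X.T @ (X @ w - y) / len(y)
    w -= lr * grad

print(np.max(np.abs(w - w_true)))  # recovery error after optimization
```

Unsupervised and reinforcement learning fit the same mold with different loss functionals, which is the sense in which a single optimization framework can host all three paradigms.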
Using this formulation, we distill data-driven discovery of dynamics via the classical technique of linear multistep methods combined with neural networks to its most basic form for numerical analysis. We establish for the first time a rigorous mathematical theory for using linear multistep methods in the discovery of dynamics under the assumption of exact data. We present refined notions of consistency, stability, and convergence for discovery and show convergence results for the popular Adams-Bashforth, Adams-Moulton, and Backward Differentiation Formula schemes. Extending the study to noisy data, we propose and analyze the recovery of a smooth approximation to the state using splines and prove new results on discrete differentiation error estimates.
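To make the discovery setting concrete (a minimal sketch under simplifying assumptions, not the thesis's scheme or error analysis): given exact trajectory data of dx/dt = f(x), the simplest linear multistep discretization can be solved for f at the data points. The test ODE dx/dt = -x and the step size below are illustrative:

```python
# Illustrative only: recovering the right-hand side f of dx/dt = f(x)
# from exact trajectory data via the one-step Adams-Bashforth (forward
# Euler) relation x_{n+1} = x_n + h * f(x_n), solved for f(x_n).
import numpy as np

h = 1e-3
t = np.arange(0, 1, h)
x = np.exp(-t)                        # exact data for dx/dt = -x, x(0) = 1

f_recovered = (x[1:] - x[:-1]) / h    # discrete estimate of f at x_n
f_true = -x[:-1]                      # true right-hand side at the data

err = np.max(np.abs(f_recovered - f_true))
print(err)                            # O(h): first-order discovery error
```

With noisy data the divided difference above amplifies the noise by a factor of 1/h, which is one motivation for first recovering a smooth spline approximation of the state and differentiating that instead.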
Files
- Keller_columbia_0054D_16311.pdf (application/pdf, 1.22 MB)
More About This Work
- Academic Units: Applied Physics and Applied Mathematics
- Thesis Advisors: Du, Qiang
- Degree: Ph.D., Columbia University
- Published Here: December 28, 2020