Doctoral Thesis, 2025
Bounded subgradient trajectories in semialgebraic optimization
Solving modern data science problems relies on optimization algorithms that must succeed in high-dimensional, large-scale settings. First-order methods in particular are widely used for their low per-iteration cost and scalability. Yet despite their empirical success, the theoretical understanding of their behavior in such settings remains limited.
Classical optimization theory is typically built on assumptions about the objective function, such as convexity, smoothness, and coercivity, which are rarely satisfied in practice. In addition, common assumptions on algorithms, such as the boundedness of iterates or the existence of limit points, are often difficult to verify. To address these challenges, we develop a framework that replaces these assumptions with easily checkable conditions, using tools from dynamical systems, semialgebraic geometry, and variational analysis.
A key result of this thesis is that a broad class of optimization problems, including phase retrieval, matrix sensing, and neural network training, have bounded gradient flows. This property is central to multiple aspects of optimization. For landscape analysis, we develop practical tools for certifying the absence of bad local minima at infinity. For first-order algorithms, we analyze momentum methods and the proximal random reshuffling algorithm, proving global convergence of iterates and establishing improved convergence rates.
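To make the boundedness claim concrete, here is a minimal numerical sketch (not the thesis's own code): an explicit Euler discretization of gradient flow on a smooth phase-retrieval objective f(x) = (1/m) Σᵢ ((aᵢᵀx)² − bᵢ)². The problem sizes, step size, and random data are illustrative choices only.

```python
import numpy as np

# Synthetic phase-retrieval instance: recover x from b_i = (a_i . x)^2.
rng = np.random.default_rng(0)
m, n = 20, 5
A = rng.standard_normal((m, n))
x_star = rng.standard_normal(n)
b = (A @ x_star) ** 2                      # noiseless measurements

def grad(x):
    """Gradient of f(x) = (1/m) * sum_i ((a_i . x)^2 - b_i)^2."""
    r = (A @ x) ** 2 - b                   # residuals
    return (4.0 / m) * (A.T @ (r * (A @ x)))

# Explicit Euler discretization of the gradient flow x'(t) = -grad f(x(t)).
x = rng.standard_normal(n)
norms = []
for _ in range(2000):
    x = x - 1e-3 * grad(x)
    norms.append(float(np.linalg.norm(x)))

# Empirically, the discretized trajectory remains bounded from this
# initialization, consistent with the boundedness property discussed above.
print(max(norms))
```

This is only an empirical illustration on one instance; the thesis's contribution is a certificate that such trajectories are bounded for whole semialgebraic problem classes, not a per-instance check.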
Files
Li_columbia_0054D_19197.pdf
More About This Work
- Academic Units: Industrial Engineering and Operations Research
- Thesis Advisors: Josz, Cédric
- Degree: Ph.D., Columbia University
- Published Here: May 21, 2025