Department of Mathematics,
University of California San Diego

****************************

Math Colloquium

Tianhao Wang

Yale University

Algorithm Dynamics in Modern Statistical Learning: Universality and Implicit Regularization

Abstract:

Modern statistical learning is characterized by high-dimensional data and over-parameterized models. In this regime, analyzing the dynamics of the algorithms used is challenging but crucial for understanding the performance of the learned models. This talk will present recent results on the dynamics of two pivotal algorithms: Approximate Message Passing (AMP) and Stochastic Gradient Descent (SGD). First, AMP refers to a class of iterative algorithms for solving large-scale statistical problems whose dynamics asymptotically admit a simple yet exact description known as state evolution. We will demonstrate the universality of AMP's state evolution over large classes of random matrices and give illustrative applications of our universality results. Second, for SGD, a workhorse for training deep neural networks, we will introduce a novel mathematical framework for analyzing its implicit regularization, which is essential to SGD's ability to find solutions with strong generalization performance, particularly in the over-parameterized setting. Our framework offers a general method for characterizing the implicit regularization induced by gradient noise. Finally, in the context of underdetermined linear regression, we will show that both AMP and SGD can provably achieve sparse recovery, yet they do so from markedly different perspectives.
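For context, the following is a minimal sketch of the standard AMP recursion for a symmetric random matrix and its state evolution, in the textbook form found in the literature; the precise setting, denoisers, and assumptions of the talk may differ.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Sketch: AMP for a symmetric random matrix A in R^{n x n} (e.g., a normalized
% GOE matrix), with Lipschitz denoisers f_t applied coordinatewise. This is the
% standard form from the literature, not necessarily the talk's exact setting.
\[
x^{t+1} = A\, f_t(x^t) - b_t\, f_{t-1}(x^{t-1}),
\qquad
b_t = \frac{1}{n}\sum_{i=1}^{n} f_t'\big(x_i^t\big).
\]
% The Onsager correction term b_t f_{t-1}(x^{t-1}) makes the coordinates of the
% iterates asymptotically Gaussian as n grows; their variance is tracked exactly
% by the scalar state evolution recursion:
\[
\sigma_{t+1}^2 = \mathbb{E}\big[f_t(\sigma_t Z)^2\big],
\qquad Z \sim \mathcal{N}(0,1).
\]
% Universality results of the kind described in the abstract show that this
% Gaussian description remains valid when A is drawn from much broader classes
% of random matrices than the Gaussian ensemble.
\end{document}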

Host: Ery Arias-Castro

December 7, 2023

3:00 PM

APM 6402

****************************