Department of Mathematics,
University of California San Diego
****************************
Murray and Adylin Rosenblatt Endowed Lectures in Applied Mathematics
Professor Peter Bartlett
UC Berkeley
Modern machine learning methods: large step-size optimization, implicit bias, and benign overfitting
Abstract:
The impressive performance of modern machine learning methods seems to arise through mechanisms different from those of classical statistical learning theory, mathematical statistics, and optimization theory. Simple gradient methods find excellent solutions to non-convex optimization problems, and without any explicit effort to control model complexity they exhibit excellent prediction performance in practice. This talk will describe recent progress in statistical learning theory and optimization theory that demonstrates the optimization benefits of step sizes too large for gradient methods to be viewed as an accurate time discretization of a gradient-flow differential equation, characterizes the solutions favored by gradient optimization methods, and illustrates when those solutions can overfit the training data yet still provide good predictive accuracy.
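A minimal numerical sketch of the large-step-size phenomenon the abstract mentions, on the simplest possible objective. This is an illustration for the announcement, not material from the lecture itself; the function names (`gd_trajectory`, `flow`) and all parameter values are chosen for the example. On f(x) = λx²/2, gradient descent gives x_{k+1} = (1 - ηλ) x_k, while the gradient flow is x(t) = x₀ e^{-λt}; for small η the iterates track the flow, but for large η (still below the stability threshold 2/λ) they oscillate in sign, which no gradient flow can do.

```python
import math

def gd_trajectory(x0, lam, eta, steps):
    """Gradient descent on f(x) = lam * x**2 / 2, whose gradient is lam * x."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] * (1.0 - eta * lam))  # x_{k+1} = x_k - eta * lam * x_k
    return xs

def flow(x0, lam, t):
    """Exact gradient-flow solution x(t) = x0 * exp(-lam * t)."""
    return x0 * math.exp(-lam * t)

lam, x0, steps = 4.0, 1.0, 50

# Small step size: iterates stay close to the gradient flow at matching times.
eta_small = 0.01
small = gd_trajectory(x0, lam, eta_small, steps)
err_small = max(abs(small[k] - flow(x0, lam, k * eta_small))
                for k in range(steps + 1))

# Large step size, still below the stability threshold 2 / lam = 0.5:
# the iterates alternate in sign, so they cannot be a discretized flow.
eta_large = 0.45
large = gd_trajectory(x0, lam, eta_large, steps)

print(f"max deviation from flow at eta={eta_small}: {err_small:.2e}")
print("first iterates at eta=0.45:", [round(x, 3) for x in large[:5]])
```

The large-step trajectory still converges here (|1 - ηλ| < 1), but its sign-flipping path is qualitatively unlike any gradient flow, which is the elementary version of the discretization gap the talk addresses.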
May 11, 2026
3:00 PM
Kavli Auditorium, Tata Hall, UCSD
****************************

