Department of Mathematics,
University of California San Diego

****************************

Dmitriy Drusvyatskiy

Department of Mathematics, University of Washington

Optimization for large-scale learning: beyond smoothness and convexity

Abstract:

Estimation and learning algorithms are dramatically improving our capacity to extract information from massive datasets, with impressive consequences for technology and society at large. Although these algorithms have had widespread empirical success, we have yet to find a coherent mathematical foundation that can explain why they succeed on such a wide array of problems. The challenge is that the two assumptions that underpin classical optimization theory, smoothness and convexity, rarely hold in contemporary settings. Nonetheless, simple optimization algorithms often do succeed, and over the last few years, I have studied when and why this happens. In this talk, I will survey some recent work in this area, covering optimization theory, algorithms, and applications in signal processing and machine learning. Along the way, we will encounter a surprisingly rich array of mathematical tools spanning nonsmooth analysis, semi-algebraic geometry, and high-dimensional probability and statistics.


January 11, 2024

3:00 PM

APM 6402

****************************