Department of Mathematics,
University of California San Diego

****************************

Math 278C - Optimization and Data Science

Dmitriy Drusvyatskiy

University of Washington

Optimization algorithms beyond smoothness and convexity

Abstract:

Stochastic iterative methods lie at the core of large-scale optimization and its modern applications to data science. Though such algorithms are routinely and successfully used in practice on highly irregular problems (e.g., deep neural networks), few performance guarantees are available outside of smooth or convex settings. In this talk, I will describe a framework for designing and analyzing stochastic gradient-type methods on a large class of nonsmooth and nonconvex problems. The problem class subsumes such important tasks as matrix completion, robust PCA, and minimization of risk measures, while the methods include stochastic subgradient, Gauss-Newton, and proximal point iterations. I will present a number of results, including finite-time efficiency estimates, avoidance of extraneous saddle points, and asymptotic normality of averaged iterates.

Host: Jiawang Nie

April 6, 2022

3:00 PM

https://ucsd.zoom.us/j/93696624146

Meeting ID: 936 9662 4146
Password: OPT2022SP

****************************