Department of Mathematics,
University of California San Diego

****************************

Center for Computational Mathematics Seminar

Dmitriy Drusvyatskiy

University of Washington

Optimization Algorithms Beyond Smoothness and Convexity

Abstract:

Stochastic iterative methods lie at the core of large-scale optimization and its modern applications to data science. Though such algorithms are routinely and successfully used in practice on highly irregular problems (e.g., deep learning), few performance guarantees are available outside of smooth or convex settings. In this talk, I will describe a framework for designing and analyzing stochastic gradient-type methods on a large class of nonsmooth and nonconvex problems. The problem class subsumes such important tasks as matrix completion, robust PCA, and minimization of risk measures, while the methods include stochastic subgradient, Gauss-Newton, and proximal point iterations. I will describe a number of results, including finite-time efficiency estimates, avoidance of extraneous saddle points, and asymptotic normality of averaged iterates.
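To make the flavor of these methods concrete, here is a minimal sketch of a stochastic subgradient iteration on one nonsmooth, nonconvex-adjacent example: least absolute deviations regression, min_x (1/n) Σ_i |a_i·x − b_i|. This is purely illustrative and not taken from the talk; the problem instance, step-size schedule, and all variable names are assumptions. The final averaging step echoes the averaged iterates whose asymptotic normality the abstract mentions.

import numpy as np

# Illustrative sketch only: stochastic subgradient method on the
# nonsmooth objective (1/n) * sum_i |a_i . x - b_i|.
rng = np.random.default_rng(0)
n, d = 1000, 10
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

x = np.zeros(d)
iterates = []
for k in range(5000):
    i = rng.integers(n)                 # sample one data point
    r = A[i] @ x - b[i]                 # residual of sampled term
    g = np.sign(r) * A[i]               # a subgradient of |a_i.x - b_i|
    x -= g / np.sqrt(k + 1)             # diminishing step size (assumed schedule)
    iterates.append(x.copy())

# Averaged iterate, the object of the asymptotic-normality results
# referenced in the abstract.
x_bar = np.mean(iterates, axis=0)
print(np.linalg.norm(x_bar - x_true))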

May 10, 2022

11:00 AM

Zoom ID 954 6624 3503

****************************