Department of Mathematics,
University of California San Diego

****************************

Math 278C: Optimization and Data Science

Prof. Long Chen

UC Irvine

Accelerated Gradient Methods through Variable and Operator Splitting

Abstract:

In this talk, we present a unified framework for accelerated gradient methods through variable and operator splitting. Operator splitting decouples the optimization process into simpler subproblems, and, more importantly, variable splitting leads to acceleration. The key contributions include the development of strong Lyapunov functions to analyze stability and convergence rates, as well as advanced discretization techniques such as Accelerated Over-Relaxation (AOR) and Extrapolation by Predictor-Corrector (EPC). The framework effectively handles a wide range of optimization problems, including convex problems, composite convex optimization, and saddle point systems with bilinear coupling. A dynamic updating parameter, which serves as a rescaling of time, is introduced to handle the weakly convex case.
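(For orientation, and not necessarily the formulation used in the talk: a standard example of acceleration obtained through variable splitting is the Su-Boyd-Candes ODE for Nesterov's method, rewritten as a first-order system in a position variable x and an auxiliary variable v, whose discretization recovers accelerated gradient methods for a smooth convex objective f.)

% Illustrative only: the classical second-order accelerated gradient flow
%     x'' + (3/t) x' + \nabla f(x) = 0
% split into a first-order system via the auxiliary variable v = x + (t/2) x'.
\[
\ddot{x} + \frac{3}{t}\,\dot{x} + \nabla f(x) = 0
\quad\Longleftrightarrow\quad
\begin{cases}
\dot{x} = \dfrac{2}{t}\,(v - x),\\[4pt]
\dot{v} = -\dfrac{t}{2}\,\nabla f(x).
\end{cases}
\]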

Host: Jiawang Nie

February 5, 2025

4:00 PM

APM 7218 & Zoom - Meeting ID: 941 4642 0185, Password: 278C2025

Research Areas

Optimization

****************************