Department of Mathematics,
University of California San Diego

****************************

Center for Computational Mathematics Seminar

Shu Liu

UCLA

Natural Primal-Dual Gradient Method for Adversarial Neural Network Training on Solving Partial Differential Equations

Abstract:

We propose a scalable preconditioned primal-dual gradient algorithm for solving partial differential equations (PDEs). We multiply the PDE by a dual (test) function to obtain an inf-sup problem whose loss function involves only lower-order differential operators. The primal-dual hybrid gradient (PDHG) algorithm is leveraged for this saddle-point problem. By introducing suitable preconditioning operators into the proximal steps of the PDHG algorithm, we obtain an alternative natural gradient ascent-descent optimization scheme for updating the primal and adversarial neural network parameters. We apply a Krylov subspace method to evaluate the natural gradients efficiently. An a posteriori convergence analysis is established for the time-continuous version of the proposed method.
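As a small illustration (not taken from the talk itself), consider the Poisson equation $-\Delta u = f$ on a domain $\Omega$ with homogeneous Dirichlet boundary conditions. Multiplying the equation by a dual function $\varphi$, integrating by parts, and adding a quadratic regularization in $\varphi$ gives an inf-sup problem whose loss involves only first-order derivatives of $u$:

$$\inf_{u}\,\sup_{\varphi}\; \int_\Omega \big(\nabla u \cdot \nabla \varphi - f\,\varphi\big)\,dx \;-\; \tfrac{1}{2}\int_\Omega \varphi^2\,dx.$$

In a scheme of the kind described above, $u$ and $\varphi$ would be represented by the primal and adversarial neural networks, and the preconditioned PDHG iteration would alternate a natural gradient ascent step in the dual parameters with a natural gradient descent step in the primal parameters.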

The algorithm is tested on various types of linear and nonlinear PDEs with dimensions ranging from 1 to 50. We compare the performance of the proposed method with several commonly used deep learning algorithms, such as physics-informed neural networks (PINNs), the Deep Ritz method, and weak adversarial networks (WANs), using the Adam or L-BFGS optimizers. The numerical results suggest that the proposed method performs efficiently and robustly, and converges more stably than these baselines.

November 12, 2024

11:00 AM

APM 2402 & Zoom ID 921 2618 5194

****************************