Department of Mathematics,
University of California San Diego

****************************

Center for Computational Mathematics Seminar

Jeb Runnoe

UCSD

Minimum-norm Perturbations and Regularization in Modified Newton Methods for Unconstrained Optimization

Abstract:

Modified Newton methods are designed to extend the desirable properties of the classical Newton method to a wider class of optimization problems. If the Hessian of the objective function is singular at the solution, these methods tend to behave like gradient descent, and rapid local convergence is lost. An adaptive regularization technique is described that yields a modified Newton method retaining superlinear local convergence on non-convex problems without a nonsingularity assumption at the solution. The minimum-norm perturbation and symmetric indefinite factorization used to construct a sufficiently positive definite approximate Hessian are discussed, and numerical results comparing regularized and standard modified Newton methods are presented. Lastly, a well-behaved pathological example is used to illustrate an assumption required for superlinear convergence.
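To give a flavor of the kind of method discussed, here is a minimal sketch of a modified Newton iteration. It replaces eigenvalues of the Hessian that fall below a floor `delta`, which is the minimum-norm (spectral-norm) symmetric perturbation making the matrix sufficiently positive definite; the talk's method instead derives the perturbation from a symmetric indefinite (LDL^T) factorization, which is cheaper than an eigendecomposition. The test function, the floor value, and the Armijo line-search constants are illustrative choices, not taken from the talk.

```python
import numpy as np

def f(x):
    # Non-convex test function whose Hessian is indefinite at the origin
    return x[0]**4 + x[0]*x[1] + (1.0 + x[1])**2

def grad_f(x):
    return np.array([4*x[0]**3 + x[1], x[0] + 2*(1.0 + x[1])])

def hess_f(x):
    return np.array([[12*x[0]**2, 1.0], [1.0, 2.0]])

def modified_newton_step(g, H, delta=1e-4):
    """Raise H's eigenvalues to at least delta, then solve with the
    perturbed matrix. The eigenvalue floor is the minimum-norm symmetric
    perturbation making H sufficiently positive definite, so the
    returned direction is guaranteed to be a descent direction."""
    lam, Q = np.linalg.eigh(H)
    lam_mod = np.maximum(lam, delta)
    return -(Q @ ((Q.T @ g) / lam_mod))

# Modified Newton iteration with Armijo backtracking
x = np.zeros(2)                      # start where the Hessian is indefinite
for _ in range(100):
    g = grad_f(x)
    if np.linalg.norm(g) < 1e-10:
        break                        # gradient is negligible: stop
    p = modified_newton_step(g, hess_f(x))
    alpha = 1.0
    while f(x + alpha*p) > f(x) + 1e-4 * alpha * (g @ p):
        alpha *= 0.5                 # backtrack until sufficient decrease
    x = x + alpha*p
```

Near the minimizer the Hessian is positive definite with eigenvalues above the floor, so no perturbation is applied and the iteration reduces to the pure Newton method, recovering fast local convergence.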

March 14, 2023

11:00 AM

APM 2402 and Zoom ID 994 0149 1091

****************************