##### Department of Mathematics

University of California San Diego

****************************

### Math 278C - Optimization and Data Science Seminar

## Shahrouz R. Alimo

#### UC San Diego

## Delta-DOGS: efficient new data-driven global optimization approaches

##### Abstract:

Alongside derivative-based methods, which scale better to higher-dimensional problems, derivative-free methods play an essential role in the optimization of many practical engineering systems, especially those in which function evaluations are determined by statistical averaging, and those for which the function of interest is nonconvex in the adjustable parameters. This talk focuses on the development of a new family of surrogate-based derivative-free optimization schemes, the Delta-DOGS schemes. The idea unifying this efficient and (under the appropriate assumptions) provably globally convergent family of schemes is the minimization of a search function that linearly combines a computationally inexpensive "surrogate" (an interpolation, or in some cases a regression, of recent function evaluations; we generally favor some variant of polyharmonic splines for this purpose), which summarizes the trends evident in the data available thus far, with a synthetic piecewise-quadratic "uncertainty function" (built on the framework of a Delaunay triangulation of the existing datapoints), which characterizes the reliability of the surrogate by quantifying the distance of any given point in parameter space from the nearest function evaluations.

This talk introduces a handful of new schemes in the Delta-DOGS family:

- (a) Delta-DOGS(Omega) handles nonconvex (even disconnected) feasible domains defined by computable constraint functions within a bounded search domain.
- (b) Delta-DOGS(Lambda) accelerates the convergence of the Delta-DOGS family by restricting function evaluations at each iteration to lie on a dense lattice (derived from an n-dimensional sphere packing) within a linearly constrained search domain; the lattice is successively refined as convergence is approached.
- (c) Gradient-based acceleration of Delta-DOGS combines derivative-free global exploration with derivative-based local refinement.
- (d) alpha-DOGSX simultaneously increases the sampling time and refines the numerical approximation as convergence is approached.

This talk also introduces a method that rescales the parameter domain under consideration based on the variation of the data seen so far in the optimization process, thereby obtaining a significantly smoother surrogate. This method is called the Multivariate Adaptive Polyharmonic Splines (MAPS) surrogate model. The judicious use of MAPS to identify the variation of the objective function over the parameter space in some of the iterations allows the less significant parameters to be neglected, thereby speeding up the convergence rate. These algorithms have been compared with existing state-of-the-art algorithms, particularly the Surrogate Management Framework (SMF) using the Kriging model and Mesh Adaptive Direct Search (MADS), on both standard synthetic test problems and computer-aided shape design problems, such as the design of airfoils and hydrofoils. We showed that in most cases the new Delta-DOGS algorithms outperform the existing ones.
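The core search-function idea described above can be sketched in a few lines. The following is a minimal illustration, not the speaker's implementation: it uses a polyharmonic (cubic) spline surrogate via SciPy's `RBFInterpolator`, the standard Delaunay-based piecewise-quadratic uncertainty (zero at datapoints, equal to R^2 - |x - z|^2 inside each simplex with circumcenter z and circumradius R), and a crude grid minimization in place of the scheme's actual global search; the objective `f`, the tradeoff constant `K`, and all sample sizes are arbitrary choices for illustration.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import RBFInterpolator

# Illustrative objective (not from the talk): a simple quadratic bowl.
def f(x):
    return np.sum((x - 0.3) ** 2, axis=-1)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(12, 2))   # points evaluated so far
y = f(X)

# Surrogate: a polyharmonic-spline interpolant of the evaluations
# ('cubic' kernel, phi(r) = r^3).
surrogate = RBFInterpolator(X, y, kernel="cubic")

# Delaunay triangulation of the existing datapoints.
tri = Delaunay(X)

def circumsphere(pts):
    """Circumcenter z and squared circumradius R^2 of a simplex."""
    A = 2.0 * (pts[1:] - pts[0])
    b = np.sum(pts[1:] ** 2 - pts[0] ** 2, axis=1)
    z = np.linalg.solve(A, b)
    return z, np.sum((z - pts[0]) ** 2)

def uncertainty(x):
    """Piecewise-quadratic uncertainty function: zero at the datapoints,
    positive inside each Delaunay simplex (R^2 - |x - z|^2 there)."""
    s = int(tri.find_simplex(x[None, :])[0])
    if s < 0:                       # outside the convex hull of the data
        return 0.0
    z, R2 = circumsphere(X[tri.simplices[s]])
    return R2 - np.sum((x - z) ** 2)

def search(x, K=2.0):
    """Search function: surrogate minus K times the uncertainty; the next
    evaluation point is its minimizer over the feasible domain."""
    return float(surrogate(x[None, :])[0]) - K * uncertainty(x)

# Crude stand-in for the global search: minimize over a candidate grid.
grid = np.stack(np.meshgrid(np.linspace(0, 1, 41),
                            np.linspace(0, 1, 41)), axis=-1).reshape(-1, 2)
x_next = grid[np.argmin([search(g) for g in grid])]
```

Larger `K` weights the uncertainty term more heavily and so favors exploration of sparsely sampled regions; smaller `K` favors exploitation of the surrogate's current minimum.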

Host: Jiawang Nie

### June 7, 2017

### 4:00 PM

### AP&M 5402

****************************