Department of Mathematics,
University of California San Diego

****************************

2009 Southern California Optimization Day

Tom Bewley

UCSD

Incorporating Regular Lattices and Accounting for Approximate Function Evaluations in Derivative-Free Optimization

Abstract:

Systems characterized by expensive, nonconvex, noisy functions in moderate dimensions (n = 2 to 24) necessitate the development of maximally efficient derivative-free optimization algorithms. Starting from the well-known Surrogate Management Framework (SMF), our lab has developed a new, highly efficient derivative-free optimization algorithm, which we dub LAttice-Based Derivative-free Optimization via Global Surrogates (LABDOGS). This algorithm combines a highly efficient, globally convergent surrogate-based Search with an efficient Poll that incorporates a minimum number of new function evaluations chosen from nearest-neighbor points. All function evaluations are coordinated with highly uniform non-Cartesian lattices derived from n-dimensional sphere-packing theory. Representative numerical tests confirm significant improvements in the convergence of the lattice-based strategies as compared with otherwise identical codes coordinated using Cartesian grids.

The second topic of the talk focuses on incorporating approximate function evaluations into a surrogate-based optimization scheme. Assuming the statistical error of each function evaluation diminishes toward zero in proportion to the reciprocal of the square root of the sampling time, we have developed an algorithm for sampling the function only as accurately as warranted. This algorithm, dubbed $\alpha$-DOGS, maintains the globally convergent behavior of the LABDOGS Search while focusing the bulk of the computation time on regions of parameter space where the existing approximate function evaluations indicate that the true function minimum might lie.
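The error-scaling assumption underlying $\alpha$-DOGS is the standard Monte Carlo averaging rate: the standard error of a time-averaged (or sample-averaged) statistic shrinks like the reciprocal of the square root of the number of independent samples. As a minimal illustration (not code from the talk; the noisy function and sample counts here are arbitrary placeholders), the following sketch averages noisy evaluations of a test function and checks that quadrupling the sample count roughly halves the scatter of the resulting estimates:

```python
import random
import statistics

def noisy_eval(x, n_samples, rng):
    # True function f(x) = x^2, observed through zero-mean unit-variance
    # Gaussian noise; averaging n_samples independent draws reduces the
    # standard error of the estimate like 1/sqrt(n_samples).
    true_value = x * x
    total = sum(true_value + rng.gauss(0.0, 1.0) for _ in range(n_samples))
    return total / n_samples

def empirical_error(n_samples, trials=2000, seed=0):
    # Scatter (standard deviation) of repeated averaged evaluations,
    # measured over many independent trials.
    rng = random.Random(seed)
    estimates = [noisy_eval(1.0, n_samples, rng) for _ in range(trials)]
    return statistics.pstdev(estimates)

err_25 = empirical_error(25)    # ~ 1/sqrt(25)  = 0.2
err_100 = empirical_error(100)  # ~ 1/sqrt(100) = 0.1
```

In an optimization setting this scaling is what lets a scheme like $\alpha$-DOGS spend only a short sampling time on unpromising points and reserve long, accurate evaluations for regions where the minimum might lie.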

March 19, 2009

10:00 AM

AP&M 6402

****************************