Printable PDF
Department of Mathematics,
University of California San Diego

****************************

Math 278B - Mathematics of Information, Data, and Signals Seminar

Roberto Imbuzeiro Oliveira

IMPA, Rio de Janeiro

Sample average approximation with heavier tails

Abstract:

Consider an "ideal" optimization problem whose constraints and objective function are defined in terms of expectations over some distribution P. The sample average approximation (SAA) -- a fundamental idea in stochastic optimization -- consists of replacing the expectations by averages over a sample from P. A key question is how much the solutions of the SAA differ from those of the original problem. Results by Shapiro from many years ago consider what happens asymptotically when the sample size diverges, especially when the solution of the ideal problem lies on the boundary of the feasible set. In joint work with Philip Thompson (Purdue), we consider what happens with finite samples. As we will see, our results improve upon the nonasymptotic state of the art in various ways: we allow for heavier tails and unbounded feasible sets, and we obtain bounds that (in favorable cases) depend only on the geometry of the feasible set in a small neighborhood of the optimal solution. Our results combine "localization" and "fixed-point" type arguments inspired by the work of Mendelson with chaining-type inequalities. One of our contributions is showing what can be said when the SAA constraints are random.
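To make the SAA idea concrete, here is a minimal toy sketch (not from the talk): for the objective f(x, z) = (x - z)^2, the ideal problem min E[f(x, Z)] is solved by x* = E[Z], while the SAA minimizes the sample average and is solved by the sample mean. The heavy-tailed Student-t sample below merely echoes the "heavier tails" setting; the choice of distribution and sample size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def saa_minimizer(samples):
    # SAA objective: (1/n) * sum_i (x - z_i)^2, minimized at the
    # sample mean (a closed form available for this toy objective).
    return samples.mean()

# Draw a sample from P; here a Student-t with 3 degrees of freedom,
# a heavy-tailed distribution with mean 0 (so the ideal solution is 0).
z = rng.standard_t(df=3, size=10_000)

x_saa = saa_minimizer(z)
x_ideal = 0.0  # E[Z] = 0 for the symmetric t distribution
print(abs(x_saa - x_ideal))  # gap between SAA and ideal solutions
```

The question the talk addresses is nonasymptotic: how fast does this gap shrink with the sample size, and under what tail and geometry assumptions.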

Host: Rayan Saab

March 18, 2021

11:30 AM

Zoom link: https://msu.zoom.us/j/96421373881 (passcode: first prime number > 100)

****************************