Department of Mathematics,
University of California San Diego

****************************

Math 278C: Optimization and Data Science

Prof. Alexander Strang

UC Berkeley

Solution Continuation Methods for Bayesian Estimation and Sampling

Abstract:

Bayesian estimation and uncertainty quantification depend on prior assumptions. These assumptions are often chosen to promote specific features in the recovered estimate, such as sparsity. The form of the chosen prior determines the shape of the posterior distribution, and thus the behavior of the estimator and the complexity of the associated optimization and sampling problems. Here, we consider a family of Gaussian hierarchical models with generalized gamma hyperpriors designed to promote sparsity in linear inverse problems. By varying the hyperparameters, we can move continuously between priors that act as smoothed ℓp penalties with flexible p, smoothing, and scale. We introduce methods for efficiently tracking MAP solutions along paths through hyperparameter space. Path following allows a user to explore the space of possible estimators under varying assumptions and to test the robustness and sensitivity of solutions to changes in the prior assumptions. By tracing paths from a convex region to a non-convex region, the user can find local minimizers in non-convex, strongly sparsity-promoting regimes that are consistent with a convex relaxation drawn from the same family of posteriors. We show experimentally that these solutions are less error-prone than those obtained by direct optimization of the non-convex problem. The same relaxation approach allows sampling from highly non-convex, multi-modal posteriors in high dimension via a variational Bayesian formulation. We demonstrate predictor-corrector methods for both estimator and sample continuation.
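For context, a hedged sketch of the model class the abstract refers to, following a parameterization common in the hierarchical Bayesian sparsity literature (the talk's exact conventions may differ): each coefficient is conditionally Gaussian with a generalized gamma hyperprior on its variance,

\[
b \mid x \sim \mathcal{N}(Ax,\sigma^2 I),
\qquad
x_j \mid \theta_j \sim \mathcal{N}(0,\theta_j),
\qquad
\pi(\theta_j) \propto \theta_j^{\,r\beta-1}\exp\!\left[-\left(\theta_j/\vartheta_j\right)^{r}\right],
\]

so the joint MAP estimate minimizes the negative log-posterior

\[
\mathcal{G}(x,\theta)
= \frac{1}{2\sigma^2}\,\|Ax-b\|^2
+ \sum_j \left[\frac{x_j^2}{2\theta_j}
+ \left(\frac{\theta_j}{\vartheta_j}\right)^{r}
- \left(r\beta - \tfrac{3}{2}\right)\log\theta_j\right].
\]

Profiling out θ induces an effective penalty on x that behaves like a smoothed ℓp norm, with the hyperparameters (r, β, ϑ) controlling the exponent p, the degree of smoothing, and the scale.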
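The continuation idea can be illustrated with a minimal sketch. The code below is a hypothetical example, not the speaker's algorithm: it uses a smoothed ℓp penalty (x_j² + ε²)^(p/2) as a stand-in for the effective hyperprior-induced penalty, and traces MAP solutions as p moves from a convex regime (p = 1) into a non-convex, strongly sparsity-promoting one (p = 1/2). The predictor is a secant extrapolation of the two most recent solutions; the corrector re-solves the problem warm-started at the prediction. All problem sizes and parameter values are arbitrary choices for illustration.

```python
# Hypothetical predictor-corrector continuation for a MAP path.
# The smoothed ell_p penalty stands in for the effective penalty
# induced by the generalized gamma hyperprior discussed in the talk.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy sparse linear inverse problem: b = A x_true + noise.
n, m, k = 40, 120, 6
A = rng.standard_normal((n, m))
x_true = np.zeros(m)
x_true[rng.choice(m, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true + 0.01 * rng.standard_normal(n)

lam, eps = 0.1, 1e-3  # penalty weight and smoothing parameter

def objective(x, p):
    """Least squares plus smoothed ell_p penalty (x^2 + eps^2)^(p/2)."""
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum((x**2 + eps**2) ** (p / 2))

def gradient(x, p):
    """Gradient of the objective in x for a fixed exponent p."""
    return A.T @ (A @ x - b) + lam * p * x * (x**2 + eps**2) ** (p / 2 - 1)

# Continuation path: start in the convex regime (p = 1, a convex relaxation)
# and move into the non-convex, strongly sparsity-promoting regime (p -> 1/2).
ps = np.linspace(1.0, 0.5, 11)
x_prev = np.zeros(m)
x_curr = minimize(objective, x_prev, args=(ps[0],),
                  jac=gradient, method="L-BFGS-B").x

for p in ps[1:]:
    # Predictor: secant extrapolation along the solution path.
    x_pred = 2 * x_curr - x_prev
    # Corrector: re-solve the (now non-convex) problem, warm-started
    # at the prediction so we stay on the branch traced from the convex end.
    x_prev, x_curr = x_curr, minimize(objective, x_pred, args=(p,),
                                      jac=gradient, method="L-BFGS-B").x

print("relative error after continuation:",
      np.linalg.norm(x_curr - x_true) / np.linalg.norm(x_true))
```

The warm start is the point of the exercise: each corrector solve begins near the minimizer traced from the convex end of the path, which is how continuation can land on local minimizers of the non-convex problem that remain consistent with the convex relaxation, rather than whatever stationary point a cold-started solver happens to find.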

Host: Jiawang Nie

January 24, 2024

3:00 PM

APM 7321

****************************