##### Department of Mathematics, University of California San Diego

****************************

### Math 278C - Optimization Seminar and Data Science

## Wenxin Zhou

#### UCSD

## Robust Estimation and Inference via Multiplier Bootstrap

##### Abstract:

Massive data are often contaminated by outliers and heavy-tailed errors. In the presence of heavy-tailed data, the finite-sample properties of least squares-based methods, typified by the sample mean, are suboptimal both theoretically and empirically. To address this challenge, we propose adaptive Huber regression for robust estimation and inference. The key observation is that the robustification parameter should adapt to the sample size, dimension, and moments for an optimal tradeoff between bias and robustness. For heavy-tailed data with a bounded $(1+\delta)$-th moment for some $\delta>0$, we establish a sharp phase transition for robust estimation of regression parameters in both the finite-dimensional and high-dimensional settings: when $\delta \geq 1$, the estimator achieves a sub-Gaussian rate of convergence without sub-Gaussian assumptions, while only a slower rate is available in the regime $0<\delta<1$, and the transition is smooth and optimal. In addition, a non-asymptotic Bahadur representation and Wilks' expansion for finite-sample inference are derived when higher moments exist. Based on these results, we take a further step toward developing uncertainty quantification methodologies, including the construction of confidence sets and large-scale simultaneous hypothesis testing. We demonstrate that adaptive Huber regression, combined with the multiplier bootstrap procedure, provides a useful robust alternative to the method of least squares. Together, the theoretical and empirical results reveal the effectiveness of the proposed method and highlight the importance of having statistical methods that are robust to violations of the assumptions underlying their use.
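To illustrate the adaptivity principle in the simplest case (location estimation, the regression analogue of the sample mean), here is a minimal sketch of a Huber M-estimator whose robustification parameter grows with the sample size. The default $\tau = \hat\sigma\sqrt{n/\log n}$ is an illustrative choice in the spirit of the abstract's bias–robustness tradeoff, not the talk's exact tuning rule; the function name `huber_mean` and the iteratively reweighted scheme are likewise assumptions for this sketch.

```python
import numpy as np

def huber_mean(x, tau=None, n_iter=100, tol=1e-10):
    """Huber M-estimator of the mean via iteratively reweighted averaging.

    tau is the robustification parameter: observations with residuals
    larger than tau are downweighted. If tau is None, we use the
    illustrative adaptive choice tau = sigma_hat * sqrt(n / log n),
    which grows with n so the bias from truncation vanishes.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    if tau is None:
        tau = np.std(x) * np.sqrt(n / np.log(n))  # assumed adaptive rule
    mu = np.median(x)  # robust initializer
    for _ in range(n_iter):
        r = np.abs(x - mu)
        # Huber weights: 1 inside [-tau, tau], tau/|r| outside
        w = np.where(r <= tau, 1.0, tau / np.maximum(r, 1e-300))
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu
```

With a fixed small `tau`, a single gross outlier barely moves the estimate, whereas it shifts the sample mean substantially; letting `tau` grow with `n` recovers (near-)unbiasedness for clean data, which is the tradeoff the abstract describes.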

Host: Jiawang Nie

### October 18, 2017

### 4:00 PM

### AP&M 2402

****************************