Department of Mathematics,
University of California San Diego


Final Defense

Yunyi Zhang

Regression with complex data: regularization, prediction and bootstrap


Analyzing a linear model is a fundamental topic in statistical inference and has been well studied. However, the complex nature of modern data brings new challenges to statisticians: existing theories and methods may fail to provide consistent results. Focusing on a high-dimensional linear model with i.i.d. errors or with heteroskedastic and dependent errors, this talk introduces a new ridge regression method, called the 'debiased and thresholded ridge regression', that fits the linear model. It then introduces new bootstrap algorithms that generate consistent simultaneous confidence intervals and perform hypothesis tests for the linear model. The talk also applies a bootstrap algorithm to construct simultaneous prediction intervals for future observations.
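The flavor of a residual-based bootstrap for simultaneous confidence intervals can be sketched in a few lines. This is only an illustrative toy, not the talk's method: it uses a plain ridge fit rather than the debiased and thresholded ridge estimator, and the names `lam`, `B`, and `alpha` are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, lam, B, alpha = 200, 5, 1.0, 500, 0.05

# Simulated linear model y = X beta + error with i.i.d. errors.
X = rng.standard_normal((n, p))
beta = np.array([2.0, -1.0, 0.0, 0.5, 0.0])
y = X @ beta + rng.standard_normal(n)

def ridge(X, y, lam):
    """Plain ridge estimator (a stand-in for the talk's debiased version)."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

beta_hat = ridge(X, y, lam)
resid = y - X @ beta_hat
resid = resid - resid.mean()          # center the residuals

# Bootstrap the max-|deviation| statistic to calibrate bands that
# cover all p coefficients simultaneously.
max_stats = np.empty(B)
for b in range(B):
    y_star = X @ beta_hat + rng.choice(resid, size=n, replace=True)
    beta_star = ridge(X, y_star, lam)
    max_stats[b] = np.max(np.abs(beta_star - beta_hat))

q = np.quantile(max_stats, 1 - alpha)
lower, upper = beta_hat - q, beta_hat + q   # simultaneous intervals
```

Using the $1-\alpha$ quantile of the maximal deviation, rather than per-coordinate quantiles, is what makes the resulting intervals simultaneous over all coefficients.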

Another topic of this talk is the properties of a residual-based bootstrap prediction interval. It derives the asymptotic distribution of the difference between the conditional coverage probability of a nominal prediction interval and that of a prediction interval obtained via a residual-based bootstrap. This result shows that the residual-based bootstrap prediction interval has roughly a $50\%$ possibility of yielding conditional under-coverage. Moreover, it introduces a new bootstrap prediction interval that attains both the desired asymptotic conditional coverage probability and the desired possibility of conditional under-coverage.
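For readers unfamiliar with residual-based bootstrap prediction intervals, a minimal sketch of the standard construction (the baseline whose conditional coverage the talk analyzes, not the talk's improved interval) may help. The variable names `x_f`, `B`, and `alpha` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, B, alpha = 300, 3, 1000, 0.05

# Simulated training data from a linear model with i.i.d. errors.
X = rng.standard_normal((n, p))
beta = np.array([1.0, -0.5, 0.25])
y = X @ beta + rng.standard_normal(n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat
resid = resid - resid.mean()          # center the residuals

x_f = np.array([0.5, -1.0, 2.0])      # regressor of the future observation

# Bootstrap the "predictive root" y_f* - x_f' beta_hat*: regenerate a
# sample from the fitted model, refit, and draw a fresh error for the
# future point.
roots = np.empty(B)
for b in range(B):
    y_star = X @ beta_hat + rng.choice(resid, size=n, replace=True)
    beta_star = np.linalg.lstsq(X, y_star, rcond=None)[0]
    roots[b] = x_f @ beta_hat + rng.choice(resid) - x_f @ beta_star

lo, hi = np.quantile(roots, [alpha / 2, 1 - alpha / 2])
pred_interval = (x_f @ beta_hat + lo, x_f @ beta_hat + hi)
```

The interval centers on the point prediction $x_f'\hat\beta$ and widens it by bootstrap quantiles of the predictive root, so it accounts for both the future error and the estimation error in $\hat\beta$.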

Advisor: Dimitris N. Politis

May 27, 2022

5:00 PM

Zoom ID: 657 026 0290
