##### Department of Mathematics, University of California San Diego

****************************

## On asymptotic proximity of probability distributions and the non-classical invariance principle

##### Abstract:

Usually, a limit theorem of probability theory is a theorem that concerns the convergence of a sequence of distributions $P_n$ to a distribution $P$. However, there are a number of works where the traditional setup is modified: the object of study is two sequences of distributions, $P_n$ and $Q_n$, and the goal is to establish conditions implying the convergence $P_n - Q_n \to 0. \quad (1)$ In particular problems, $P_n$ and $Q_n$ are, as a rule, the distributions of the random variables $f(X_1,\dots,X_n)$ and $f(Y_1,\dots,Y_n)$, where $f(\cdot)$ is a function, and $X_1, X_2, \dots$ and $Y_1, Y_2, \dots$ are two sequences of random variables. The aim here is to show that different random arguments $X_1,\dots,X_n$ may generate close distributions of $f(X_1,\dots,X_n)$, rather than to prove that the distribution of $f(X_1,\dots,X_n)$ is close to some fixed distribution (which, moreover, may not be true). Clearly, such a framework is more general than the traditional one. First, as was mentioned, the distributions $P_n$ and $Q_n$ themselves do not have to converge. Secondly, the sequences $P_n$ and $Q_n$ are not assumed to be tight, and the convergence in $(1)$ covers situations where part of the probability mass, or even the whole distributions, "moves away to infinity" while the distributions $P_n$ and $Q_n$ approach each other. We present a theory on this point, including the very definition of the convergence $(1)$, and a particular example of the invariance principle in the general non-classical setup.
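The phenomenon described in the abstract, two sequences of distributions approaching each other while both escape to infinity, can be illustrated with a toy example (not taken from the talk): let $P_n = N(n, 1)$ and $Q_n = N(n + 1/n, 1)$. Neither sequence is tight and neither converges, yet the Kolmogorov distance $\sup_x |F_{P_n}(x) - F_{Q_n}(x)|$ tends to $0$. A minimal sketch, assuming equal unit variances (for which the supremum is attained at the midpoint of the two means):

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def kolmogorov_distance(mu1, mu2, sigma=1.0):
    """sup_x |F1(x) - F2(x)| for N(mu1, sigma^2) vs N(mu2, sigma^2).

    For equal variances the supremum is attained at the midpoint
    of the two means, so no numerical search is needed.
    """
    mid = 0.5 * (mu1 + mu2)
    return abs(phi((mid - mu1) / sigma) - phi((mid - mu2) / sigma))

# Both N(n, 1) and N(n + 1/n, 1) drift off to infinity,
# but the distance between them shrinks as n grows.
for n in (1, 10, 100, 1000):
    print(n, kolmogorov_distance(n, n + 1.0 / n))
```

The printed distances decrease monotonically in $n$, even though both distributions place essentially all their mass near $n$; this is exactly the kind of situation covered by the convergence $(1)$ but not by the classical tight-sequence framework.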

Host: Ruth Williams

### AP&M 6402

****************************