The stochastic $o_P$ and $O_P$ symbols are the basic symbols of Asymptotic Statistics, or Large Sample Theory.

**(i)** $X_n = o_P(a_n)$: if $X_n / a_n \xrightarrow{P} 0$, i.e. $P(|X_n / a_n| > \varepsilon) \to 0$ for every $\varepsilon > 0$.

The sequence of random variables $X_n$ is of smaller order in probability than the sequence $a_n$.

In particular, $X_n = o_P(1)$, “small oh-P-one”, if and only if $X_n \xrightarrow{P} 0$; so $X_n = o_P(a_n)$ means $X_n / a_n = o_P(1)$ and $X_n / a_n \xrightarrow{P} 0$.

*Example:* $X_n = o_P(1)$ means $X_n \xrightarrow{P} 0$, and $X_n = o_P(n^{-1/2})$ means $\sqrt{n}\,X_n \xrightarrow{P} 0$, or $X_n$ goes to $0$ faster than $n^{-1/2}$ in probability (such as $X_n = 1/n$).
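
A quick numerical sketch of $o_P(1)$ (my illustration, not part of the original post): taking $X_n$ to be the sample mean of $n$ i.i.d. $N(0,1)$ draws, the probability that $|X_n|$ exceeds a fixed $\varepsilon$ shrinks toward $0$ as $n$ grows.

```python
import numpy as np

# X_n = sample mean of n standard normals; X_n ->_P 0, so X_n = o_P(1).
# For each n we estimate P(|X_n| > eps) from 500 Monte Carlo replicates.
rng = np.random.default_rng(0)
eps = 0.1
tail_probs = []
for n in [10, 100, 10_000]:
    means = rng.standard_normal((500, n)).mean(axis=1)
    tail_probs.append(np.mean(np.abs(means) > eps))
print(tail_probs)  # decreasing toward 0
```

The exceedance probability is large for small $n$ and essentially zero by $n = 10{,}000$, which is exactly the content of $X_n \xrightarrow{P} 0$.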

**(ii)** $X_n = O_P(a_n)$: if for any given $\varepsilon > 0$, there exist a constant $M = M(\varepsilon) > 0$ and an integer $N = N(\varepsilon)$ such that $P(|X_n / a_n| \le M) \ge 1 - \varepsilon$ for all $n \ge N$.

The sequence $X_n$ is said to be of order less than or equal to that of $a_n$ in probability.

In particular, $X_n = O_P(1)$, “big oh-P-one”, if for any $\varepsilon > 0$ there exist a constant $M$ and an integer $N$ such that $P(|X_n| \le M) \ge 1 - \varepsilon$ for all $n \ge N$; such an $X_n$ is said to be bounded in probability (or tight). So $X_n = O_P(a_n)$ means $X_n / a_n = O_P(1)$, i.e. $X_n / a_n$ is bounded in probability.

It’s easy to see from the definition that $c = O_P(1)$ for any constant $c$.

**(iii)** $X_n \asymp_P a_n$: if for any given $\varepsilon > 0$, there exist constants $0 < m < M$ and an integer $N$ such that $P(m \le |X_n / a_n| \le M) \ge 1 - \varepsilon$ for all $n \ge N$.

The sequence $X_n$ is said to be of the same order as $a_n$ in probability.

**Some facts:**

$o_P(1) + o_P(1) = o_P(1)$: If $X_n \xrightarrow{P} 0$ and $Y_n \xrightarrow{P} 0$, then $X_n + Y_n \xrightarrow{P} 0$. (An example of the continuous-mapping theorem, applied to $(x, y) \mapsto x + y$.)

$O_P(1)\,o_P(1) = o_P(1)$: If the sequence $X_n$ is bounded in probability and $Y_n$ is a sequence of random variables tending to $0$ in probability, then $X_n Y_n \xrightarrow{P} 0$.
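
The second fact can be seen numerically (a sketch of mine, not from the original text): take $Y_n$ with a fixed $N(0,1)$ law for every $n$, hence $O_P(1)$, and $Z_n \sim N(0, 1/n)$, hence $o_P(1)$; the product then tends to $0$ in probability.

```python
import numpy as np

# Estimate P(|Y_n * Z_n| > 0.1) as n grows: Y_n = O_P(1), Z_n = o_P(1),
# so the product is o_P(1) and the probability should vanish.
rng = np.random.default_rng(1)
probs = []
for n in [10, 100, 10_000]:
    Y = rng.standard_normal(5000)               # bounded in probability
    Z = rng.standard_normal(5000) / np.sqrt(n)  # tends to 0 in probability
    probs.append(np.mean(np.abs(Y * Z) > 0.1))
print(probs)  # decreasing toward 0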

**Lemma:** Let $R(\cdot)$ be a function defined on a domain in $\mathbb{R}^k$ such that $R(0) = 0$. Let $X_n$ be a sequence of random vectors with values in the domain of $R$ that converges in probability to zero. Then, for every $p > 0$,

(i) if $R(h) = o(\|h\|^p)$ as $h \to 0$, then $R(X_n) = o_P(\|X_n\|^p)$;

(ii) if $R(h) = O(\|h\|^p)$ as $h \to 0$, then $R(X_n) = O_P(\|X_n\|^p)$.
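
A concrete case of part (i) with $p = 2$ (my example, not in the original): the Taylor remainder $R(h) = \log(1+h) - h + h^2/2$ satisfies $R(0) = 0$ and $R(h) = o(h^2)$ as $h \to 0$, so $R(X_n) = o_P(X_n^2)$ whenever $X_n \xrightarrow{P} 0$. The deterministic $o(h^2)$ condition is easy to check numerically:

```python
import math

# Remainder of the second-order Taylor expansion of log(1+h) around 0.
def R(h):
    return math.log1p(h) - h + h * h / 2.0

# R(h)/h^2 behaves like h/3, so it heads to 0 as h -> 0: R(h) = o(h^2).
ratios = [abs(R(h)) / h**2 for h in [0.1, 0.01, 0.001]]
print(ratios)
```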

**Result:** For a random variable $X$ with mean $\mu$ and finite variance $\sigma^2 > 0$, $\dfrac{X - \mu}{\sigma} = O_P(1)$.

Proof:

We only need to prove that for any $\varepsilon > 0$, there exists a constant $M$ such that $P\!\left(\left|\frac{X - \mu}{\sigma}\right| > M\right) \le \varepsilon$ (the bound below holds uniformly, so the integer $N$ in the definition is not needed here).

Let $M = 1/\sqrt{\varepsilon}$.

According to Markov’s inequality (applied to $(X - \mu)^2$), $P\!\left(\left|\frac{X - \mu}{\sigma}\right| > M\right) \le \frac{E(X - \mu)^2}{\sigma^2 M^2} = \frac{1}{M^2} = \varepsilon$, as desired.

From the proof above we know that for any standardized random variable $Z = (X - \mu)/\sigma$, we have $Z = O_P(1)$, or $Z$ is bounded in probability. The reason is natural: if a random variable is not bounded in probability, either its mean is too large ($|\mu| = \infty$) or it varies too much ($\sigma^2 = \infty$), and standardization eliminates those two possibilities. On the other hand, for a sequence of random variables $X_n$, if $E X_n = \mu$ and $\operatorname{Var}(X_n) = \sigma_n^2 \to 0$, then $X_n - \mu = O_P(\sigma_n)$; especially, when $\sigma_n^2 = O(1/n)$, $X_n = \mu + O_P(n^{-1/2})$.
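
The Chebyshev step in the proof can be checked empirically (a sketch of mine, using an Exponential(1) law as an arbitrary example of a skewed distribution with $\mu = \sigma = 1$): with $M = 1/\sqrt{\varepsilon}$, the tail probability of the standardized variable indeed stays below $\varepsilon$.

```python
import numpy as np

# Chebyshev bound: P(|(X - mu)/sigma| > M) <= 1/M^2 = eps when M = 1/sqrt(eps).
rng = np.random.default_rng(2)
eps = 0.05
M = 1.0 / np.sqrt(eps)
# X ~ Exponential(1): mu = 1, sigma = 1, so Z = X - 1 is the standardized variable.
Z = rng.exponential(1.0, size=100_000) - 1.0
p_hat = np.mean(np.abs(Z) > M)
print(p_hat, "<=", eps)
```

For well-behaved distributions the actual tail is usually far below the Chebyshev bound; the bound’s value is that it holds with no distributional assumptions beyond a finite variance.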

*Example:* From the central limit theorem we know that $\sqrt{n}(\bar{X}_n - \mu) \xrightarrow{d} N(0, \sigma^2)$, and a sequence that converges in distribution is bounded in probability: $P\!\left(\left|\sqrt{n}(\bar{X}_n - \mu)\right| > M\right)$ can be made smaller than any given $\varepsilon$ as long as $M$ is large enough. So $\sqrt{n}(\bar{X}_n - \mu) = O_P(1)$, or $\bar{X}_n - \mu = O_P(n^{-1/2})$.

The weak law of large numbers states that $\bar{X}_n \xrightarrow{P} \mu$, so we have

$\bar{X}_n - \mu = o_P(1)$.
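
Both statements can be seen in one simulation (my sketch, with $X_i \sim \mathrm{Uniform}(0,1)$, so $\mu = 1/2$ and $\sigma^2 = 1/12$): the spread of $\sqrt{n}(\bar{X}_n - \mu)$ stabilizes near $\sigma$ as $n$ grows ($O_P(1)$), while $\bar{X}_n - \mu$ itself collapses to $0$ ($o_P(1)$).

```python
import numpy as np

rng = np.random.default_rng(3)
mu = 0.5                      # mean of Uniform(0,1); sd is sqrt(1/12) ~ 0.289
spreads, tails = [], []
for n in [100, 10_000]:
    xbar = rng.random((500, n)).mean(axis=1)        # 500 replicates of Xbar_n
    spreads.append(np.std(np.sqrt(n) * (xbar - mu)))  # stays near sigma: O_P(1)
    tails.append(np.mean(np.abs(xbar - mu) > 0.01))   # shrinks to 0: o_P(1)
print(spreads, tails)
```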

(Update: 2012/Feb/17) Similarly, let $X_n$ be a sequence of random vectors. Using Markov’s inequality, $P(\|X_n\| > M) \le E\|X_n\|^r / M^r$, we have

1. If there is a number $r > 0$ such that $E\|X_n\|^r$ is bounded, then $X_n = O_P(1)$;

similarly, if $E\|X_n\|^r \le c\, a_n^r$, where $c$ is a constant and $(a_n)$ is a sequence of positive numbers,

then $X_n = O_P(a_n)$.

2. If there is a number $r > 0$ such that $E\|X_n\|^r \to 0$ as $n \to \infty$ (so $E\|X_n\|^r$ can be $o(1)$), then $X_n = o_P(1)$;

similarly, if $E\|X_n\|^r \le c\, a_n^r$, where $c$ is a constant and $(a_n)$ is a sequence of positive numbers,

then $X_n = o_P(b_n)$ for any sequence $(b_n)$ such that $a_n / b_n \to 0$.

3. If there are sequences of vectors $b_n$ and nonsingular matrices $A_n$ such that $A_n^{-1}(X_n - b_n)$ converges in distribution, then $X_n = b_n + O_P(\|A_n\|)$.
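
The Markov inequality driving items 1 and 2 is itself easy to verify by simulation (my sketch, again using an Exponential(1) variable, for which $E X^2 = 2$, with $r = 2$):

```python
import numpy as np

# Markov bound: P(X > M) <= E[X^r] / M^r.  For X ~ Exp(1) and r = 2, E[X^2] = 2.
rng = np.random.default_rng(4)
X = rng.exponential(1.0, size=200_000)
r, moment = 2, 2.0
# (empirical tail probability, Markov bound) for several thresholds M
checks = [(np.mean(X > M), moment / M**r) for M in [2.0, 5.0, 10.0]]
print(checks)
```

The empirical tails sit well under the bounds; the bound is crude, but it is all that is needed to turn a moment condition $E\|X_n\|^r = O(a_n^r)$ into the stochastic order statement $X_n = O_P(a_n)$.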

**References:**

Elements of Large-Sample Theory, E.L. Lehmann, 1998

Asymptotic Statistics, A. W. van der Vaart, 2000

Linear and Generalized Linear Mixed Models and Their Applications, Jiming Jiang, 2006