Convergence in probability

askazy

New Member
#1
If \( Z_n=\frac{1}{n}\sum_{i=1}^nX_i^2\) and \(W_n=\frac{1}{n}\sum_{i=1}^n(X_i-\bar{X})^2\), where the \(X_i\) are i.i.d. with mean \(\mu\) and variance \(\sigma^2\),

Show that \(V_n=Z_n - W_n \rightarrow^p \mu^2\)

What I think:
\(Z_n-W_n=\frac{1}{n}\sum_{i=1}^n\left[X_i^2-(X_i-\bar{X})^2\right]\), where \((X_i-\bar{X})^2=X_i^2-2X_i\bar{X}+\bar{X}^2\)

now

\(Z_n-W_n=\frac{1}{n}\sum_{i=1}^n\left(2X_i\bar{X}-\bar{X}^2\right)\). Setting \(S_n=\sum_{i=1}^n\left(2X_i\bar{X}-\bar{X}^2\right)\), we get \(Z_n-W_n=\frac{S_n}{n}\).
\(E[S_n]=E\left[\sum_{i=1}^n\left(2X_i\bar{X}-\bar{X}^2\right)\right]\), assuming \(X_i\) and \(\bar{X}\) are independent.
\(E[S_n]=2n\cdot E[X_1]E[\bar{X}]-n\cdot E[\bar{X}]^2=2n\cdot\mu\cdot\mu-n\mu^2=n\mu^2\). Now applying Chebyshev's inequality:
\(P(|(Z_n-W_n)-\mu^2|\geq\epsilon)=P\left(\left|\frac{S_n-E[S_n]}{n}\right|\geq\epsilon\right)=P(|S_n-E[S_n]|\geq n\epsilon)\leq\frac{Var(S_n)}{n^2\epsilon^2}\rightarrow0\)
Finally
\(V_n=Z_n - W_n \rightarrow^p \mu^2\)
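
As a quick numerical sanity check (not a proof), here is a short simulation; the normal distribution and the values of \(\mu\), \(\sigma\), and \(\epsilon\) below are illustrative assumptions, not part of the problem:

import numpy as np

# Empirical check that V_n = Z_n - W_n concentrates around mu^2 as n grows.
# The distribution and all parameter values are illustrative assumptions.
rng = np.random.default_rng(0)
mu, sigma, eps, reps = 2.0, 1.5, 0.5, 2000

for n in (10, 100, 10_000):
    x = rng.normal(mu, sigma, size=(reps, n))   # reps independent samples of size n
    z = (x**2).mean(axis=1)                     # Z_n for each sample
    w = ((x - x.mean(axis=1, keepdims=True))**2).mean(axis=1)  # W_n for each sample
    v = z - w                                   # V_n = Z_n - W_n
    print(n, (np.abs(v - mu**2) >= eps).mean()) # estimate of P(|V_n - mu^2| >= eps)

The printed fractions shrink toward zero as \(n\) grows, which is exactly what convergence in probability predicts.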
 

BGM

TS Contributor
#2
I guess you cannot assume \( X_i \) and \( \bar{X} \) are independent. One crucial thing here which you may have missed is that

\( Z_n - W_n = \bar{X}^2 \)

after some simplifications.
 

askazy

New Member
#3
BGM said:
I guess you cannot assume \( X_i \) and \( \bar{X} \) are independent. One crucial thing here which you may have missed is that

\( Z_n - W_n = \bar{X}^2 \)

after some simplifications.
I cannot see how \( Z_n - W_n = \bar{X}^2 \)
 

Dason

Ambassador to the humans
#4
askazy said:
I cannot see how \( Z_n - W_n = \bar{X}^2 \)
I'll provide you some assurance that the equality in question does in fact hold. Write everything out in summation form, expand it all, cancel the terms that you can, and things will simplify. One thing to keep in mind:

\( \sum_{i=1}^n X_i = n\bar{X} \)

It's just a simple rearrangement of the definition of \(\bar{X}\), but it's useful to keep in mind when dealing with these summation problems.
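
Spelling that out (a worked version of the expansion described above, using only the definitions already in the thread):

\( Z_n - W_n = \frac{1}{n}\sum_{i=1}^n\left[X_i^2-(X_i-\bar{X})^2\right] = \frac{1}{n}\sum_{i=1}^n\left(2X_i\bar{X}-\bar{X}^2\right) = \frac{2\bar{X}}{n}\sum_{i=1}^n X_i - \bar{X}^2 = 2\bar{X}^2 - \bar{X}^2 = \bar{X}^2 \)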
 

askazy

New Member
#5
Dason said:
I'll provide you some assurance that the equality in question does in fact hold. Write everything out in summation form, expand it all, cancel the terms that you can, and things will simplify. One thing to keep in mind:

\( \sum_{i=1}^n X_i = n\bar{X} \)

It's just a simple rearrangement of the definition of \(\bar{X}\), but it's useful to keep in mind when dealing with these summation problems.
Yeah, I can see it now. But I wonder whether the way I applied Chebyshev's inequality is correct.

Man, I'm really confused now: should it be \(E[\bar{X}^2]\) or \(E[\bar{X}]^2\)?
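
(For reference, the two are not the same; they differ by the variance of \(\bar{X}\): \( E[\bar{X}^2] = Var(\bar{X}) + E[\bar{X}]^2 = \mu^2 + \frac{\sigma^2}{n} \), while \( E[\bar{X}]^2 = \mu^2 \).)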
 

askazy

New Member
#6
BGM said:
I guess you cannot assume \( X_i \) and \( \bar{X} \) are independent. One crucial thing here which you may have missed is that

\( Z_n - W_n = \bar{X}^2 \)

after some simplifications.
Anyway, is this right?
\(E[\bar{X}]=\mu\rightarrow E[\bar{X}]^2=\mu^2\)
\(Var(\bar{X})=\frac{\sigma^2}{n}\rightarrow Var(\bar{X})^2=\frac{\sigma^4}{n^2}\)

\(0\leq P(|\bar{X}^2-\mu^2|\geq\epsilon)\leq\frac{Var(\bar{X})^2}{\epsilon^2}\rightarrow0\)
 

BGM

TS Contributor
#7
The Chebyshev inequality is:

\( \Pr\{|X - E[X]| \geq kSD[X]\} \leq \frac {1} {k^2} \)

where \( k > 0 \). In particular, putting \( k = \frac {\epsilon} {SD[X]} \), we have

\( \Pr\{|X - E[X]| \geq \epsilon\} \leq \frac {Var[X]} {\epsilon^2} \)


From your work I think you have applied it correctly. The spirit of this useful bound is that if you want to show a sequence of random variables converges to its common mean, then you just need to show that the variance converges to zero.

The tricky part here is that

1. the calculation of the variance can be quite tedious
2. the mean of \( \bar{X}_n^2 \) is \( \mu^2 + \frac{\sigma^2}{n} \), a quantity that depends on \( n \): it converges to \( \mu^2 \) but is not exactly equal to \( \mu^2 \) (\( \bar{X}_n^2 \) is an asymptotically unbiased, but biased, estimator of \( \mu^2 \)). So you may need to make a little adjustment before applying it, as sketched below.
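
One way to make that adjustment precise (a sketch, assuming \( E[X_1^4] < \infty \) so that \( Var[\bar{X}_n^2] \rightarrow 0 \)): since \( E[\bar{X}_n^2] = \mu^2 + \frac{\sigma^2}{n} \), once \( n \) is large enough that \( \frac{\sigma^2}{n} \leq \frac{\epsilon}{2} \), the triangle inequality gives

\( \Pr\{|\bar{X}_n^2 - \mu^2| \geq \epsilon\} \leq \Pr\left\{|\bar{X}_n^2 - E[\bar{X}_n^2]| \geq \frac{\epsilon}{2}\right\} \leq \frac{4\,Var[\bar{X}_n^2]}{\epsilon^2} \rightarrow 0 \)

so \( \bar{X}_n^2 \rightarrow^p \mu^2 \), which is the claimed convergence of \( V_n \).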