Convergence

askazy

New Member
#1
If [math] Z_n = \frac{1}{n}\sum_{i=1}^n X_i^2 [/math] where the [math]X_i \sim N(\mu,\sigma^2)[/math] are i.i.d.

1) Show that [math]Z_n \rightarrow^p \mu^2+\sigma^2[/math] (convergence in probability)

I have no idea how to show that
[math]\lim_{n \rightarrow \infty} P(|Z_n-(\mu^2+\sigma^2)|>\epsilon)=0[/math]
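As a sanity check before attempting a proof, a quick simulation suggests the claim is plausible (a minimal sketch; the parameter values and seed are arbitrary):

[code]
# Minimal sketch: compute Z_n = (1/n) * sum(X_i^2) for growing n
# and compare it against the claimed limit mu^2 + sigma^2.
import numpy as np

rng = np.random.default_rng(0)   # fixed seed, arbitrary choice
mu, sigma = 2.0, 1.5             # arbitrary parameters
target = mu**2 + sigma**2        # = 6.25 for these values

for n in (10, 100, 10_000, 1_000_000):
    x = rng.normal(mu, sigma, size=n)
    z_n = np.mean(x**2)          # Z_n concentrates near the target as n grows
    print(f"n={n:>9}: Z_n = {z_n:.4f}   target = {target}")
[/code]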
 

askazy

New Member
#2
Look, I know that [math]\frac{1}{n}\sum_{i=1}^nX_i^2[/math] is the second sample moment, which estimates [math]E[X^2]= \mu^2+\sigma^2[/math]. So for rth-mean convergence I would need [math] \lim_{n \rightarrow \infty}E[|Z_n-(\mu^2+\sigma^2)|]=0[/math], and convergence in rth mean implies convergence in probability.
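(For reference, that last implication is standard: by Markov's inequality,

[math]P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)\leq\frac{E[|Z_n-(\mu^2+\sigma^2)|]}{\epsilon}[/math]

for every [math]\epsilon>0[/math], so convergence in mean forces the left side to 0.)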

The other idea is
[math]\lim_{n \rightarrow \infty}P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)=\lim_{n \rightarrow \infty}P(|E[X^2]-(\mu^2+\sigma^2)|\geq\epsilon)=\lim_{n \rightarrow \infty}P(\mu^2+\sigma^2-(\mu^2+\sigma^2)\geq\epsilon)=\lim_{n \rightarrow \infty}P(0\geq\epsilon)\rightarrow0[/math]

Are any of these ideas right?
 

BGM

TS Contributor
#3
This is just a particular example of the Weak Law of Large Numbers, which can typically be proved by applying Chebyshev's inequality. In this case you can apply the inequality because all moments (in particular the 4th moment) of a normal random variable exist.
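In sketch form: writing [math]Y_i = X_i^2[/math], the [math]Y_i[/math] are i.i.d. with mean [math]\mu^2+\sigma^2[/math] and finite variance [math]Var(Y_1)[/math], so Chebyshev's inequality gives

[math]P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)\leq\frac{Var(Z_n)}{\epsilon^2}=\frac{Var(Y_1)}{n\epsilon^2}\rightarrow0.[/math]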
 

askazy

New Member
#4
This is just a particular example of the Weak Law of Large Numbers, which can typically be proved by applying Chebyshev's inequality. In this case you can apply the inequality because all moments (in particular the 4th moment) of a normal random variable exist.
What did I do wrong?
 

BGM

TS Contributor
#5
The first idea is alright - if you already have the result of convergence in mean.
That is, are you sure you can directly use the result

[math] \lim_{n \to +\infty}E[|Z_n - (\mu^2 + \sigma^2)|] = 0 [/math]

without proving it?


For the second idea, the step

[math] \lim_{n\to+\infty} \Pr\{|Z_n - (\mu^2 + \sigma^2)| \geq \epsilon\}
= \lim_{n\to+\infty} \Pr\{|E[Z_n] - (\mu^2 + \sigma^2)| \geq \epsilon\} [/math]

is wrong. As a quick check: the RHS does not depend on [math] n [/math], and as you can see there is no random variable left inside the probability; it is actually equal to zero for every [math] n [/math].
 

askazy

New Member
#6
[math] \lim_{n \rightarrow \infty}P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)\leq\frac{1}{\epsilon}E[|Z_n-(\mu^2+\sigma^2)|]=\frac{1}{\epsilon}(E[Z_n]-E[\sigma^2+\mu^2])[/math]
where [math]E[Z_n]=E[\frac{1}{n}\sum_{i=1}^nX_i^2]=\frac{1}{n}E[\sum_{i=1}^nX_i^2]=\frac{1}{n}\cdot nE[X_1^2]=E[X_1^2]=\mu^2+\sigma^2[/math]

so

[math] \lim_{n \rightarrow \infty}P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)\leq\frac{1}{\epsilon}(E[Z_n]-E[\sigma^2+\mu^2])=\frac{1}{\epsilon}((\mu^2+\sigma^2)-(\mu^2+\sigma^2))=\frac{1}{\epsilon}(0)=0[/math]

Is this right?
Also, is this a particular case of Chebyshev's inequality?
[math]P(|X-a|\geq\epsilon)\leq\frac{1}{\epsilon^p}E[|X-a|^p][/math]
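(For reference: taking [math]p=2[/math] and [math]a=E[X][/math] in this inequality gives exactly Chebyshev's inequality,

[math]P(|X-E[X]|\geq\epsilon)\leq\frac{Var(X)}{\epsilon^2}.[/math])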
 

BGM

TS Contributor
#8
Note that the Markov inequality requires a non-negative random variable, so you cannot apply it to a random variable like [math] X - E[X] [/math].
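For a non-negative random variable [math]Y[/math] and [math]a>0[/math], Markov's inequality reads [math]P(Y\geq a)\leq\frac{E[Y]}{a}[/math]. Applying it to the non-negative variable [math](X-E[X])^2[/math] with [math]a=\epsilon^2[/math] is exactly what yields Chebyshev's inequality; the absolute value (or the square) is what makes the variable non-negative, so it cannot be dropped.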
 

askazy

New Member
#9
My last try

If [math]S_n=\sum_{i=1}^nX_i^2[/math] and [math]E[Z_n]=E[\frac{1}{n}\sum_{i=1}^nX_i^2]=\frac{1}{n}\cdot nE[X_1^2]=\mu^2+\sigma^2[/math], then

[math]P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)=P\left(\frac{|S_n-E[S_n]|}{n}\geq\epsilon\right)=P(|S_n-E[S_n]|\geq n\epsilon)\leq\frac{Var(S_n)}{n^2\epsilon^2}\rightarrow0[/math] when [math]n\rightarrow\infty[/math]

So [math]Z_n \rightarrow^p \mu^2+\sigma^2[/math]

Is that right? If not, please show me the solution, because I do not know what else to do
 

BGM

TS Contributor
#12
Almost done. But you should further show that the term [math] \frac{Var[S_n]}{n^2} [/math] indeed converges to 0, since at this point [math] Var[S_n] [/math] still depends on [math] n [/math].
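(For completeness, one way to finish, sketched: since the [math]X_i[/math] are i.i.d., [math]Var(S_n)=n\,Var(X_1^2)[/math], and writing [math]X_1=\mu+\sigma W[/math] with [math]W\sim N(0,1)[/math] gives [math]Var(X_1^2)=4\mu^2\sigma^2+2\sigma^4<\infty[/math]. Hence

[math]\frac{Var(S_n)}{n^2\epsilon^2}=\frac{4\mu^2\sigma^2+2\sigma^4}{n\epsilon^2}\rightarrow0[/math]

as [math]n\rightarrow\infty[/math], which completes the proof.)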