# Convergence

##### New Member
If $Z_n = \frac{1}{n}\sum_{i=1}^nX_i^2$ where the $X_i \sim N(\mu,\sigma^2)$ are i.i.d.

1) Show that $Z_n \rightarrow^p \mu^2+\sigma^2$ (convergence in probability)

I have no idea how to show that
$\lim_{n \rightarrow \infty} P(|Z_n-(\mu^2+\sigma^2)|>\epsilon)=0$
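Not a proof, but a quick Monte Carlo sketch can make the claim plausible before proving it. The parameter values $\mu = 2$, $\sigma = 1.5$ below are arbitrary illustrative choices, not from the problem:

```python
import random

def simulate_Zn(n, mu=2.0, sigma=1.5, seed=0):
    """One Monte Carlo draw of Z_n = (1/n) * sum of X_i^2 with X_i ~ N(mu, sigma^2)."""
    rng = random.Random(seed)
    return sum(rng.gauss(mu, sigma) ** 2 for _ in range(n)) / n

mu, sigma = 2.0, 1.5
target = mu**2 + sigma**2  # = 6.25, the claimed limit
for n in (100, 10_000, 1_000_000):
    # As n grows, Z_n should land closer and closer to mu^2 + sigma^2.
    print(n, simulate_Zn(n, mu, sigma))
```

The printed values should cluster ever more tightly around 6.25 as $n$ increases, which is exactly what convergence in probability predicts.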

##### New Member
Look, I know that $\frac{1}{n}\sum_{i=1}^nX_i^2$ is the second sample moment, which estimates $E[X^2]= \mu^2+\sigma^2$, so for $r$th-mean convergence I would need $\lim_{n \rightarrow \infty}E[|Z_n-(\mu^2+\sigma^2)|]=0$, and convergence in $r$th mean implies convergence in probability.

The other idea is
$\lim_{n \rightarrow \infty}P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)=\lim_{n \rightarrow \infty}P(|E[X^2]-(\mu^2+\sigma^2)|\geq\epsilon)=\lim_{n \rightarrow \infty}P(\mu^2+\sigma^2-(\mu^2+\sigma^2)\geq\epsilon)=\lim_{n \rightarrow \infty}P(0\geq\epsilon)\rightarrow 0$

Are any of these ideas right?

Last edited:

#### BGM

##### TS Contributor
This is just a particular case of the Weak Law of Large Numbers, which is typically proved by applying Chebyshev's inequality. You can apply that inequality here because all moments (in particular the 4th moment) of a normal random variable exist.
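To make the finiteness explicit: writing $Y_i = X_i^2$, the Chebyshev route only needs $Var(Y_i)<\infty$, which follows from the normal fourth moment (a side computation added for reference, not part of the original post). For $X_i \sim N(\mu,\sigma^2)$,

$E[X_i^4]=\mu^4+6\mu^2\sigma^2+3\sigma^4$ and $E[X_i^2]=\mu^2+\sigma^2$,

so

$Var(X_i^2)=E[X_i^4]-(E[X_i^2])^2=2\sigma^4+4\mu^2\sigma^2<\infty$.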

##### New Member
> This is just a particular case of the Weak Law of Large Numbers, which is typically proved by applying Chebyshev's inequality. You can apply that inequality here because all moments (in particular the 4th moment) of a normal random variable exist.

What did I do wrong?

Last edited:

#### BGM

##### TS Contributor
The first idea is alright, provided you already have the result of convergence in mean.
That is, are you sure you can directly use the result

$\lim_{n \to +\infty}E[|Z_n - (\mu^2 + \sigma^2)|] = 0$

without proving it?

For the second idea, the step

$\lim_{n\to+\infty} \Pr\{|Z_n - (\mu^2 + \sigma^2)| \geq \epsilon\} = \lim_{n\to+\infty} \Pr\{|E[Z_n] - (\mu^2 + \sigma^2)| \geq \epsilon\}$

is wrong. As a quick check: the RHS does not depend on $n$, there is no random variable inside the probability, and it is in fact equal to zero for every $\epsilon > 0$.

##### New Member
$\lim_{n \rightarrow \infty}P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)\leq\frac{1}{\epsilon}E[Z_n-(\mu^2+\sigma^2)]=\frac{1}{\epsilon}(E[Z_n]-E[\sigma^2+\mu^2])$
where $E[Z_n]=E[\frac{1}{n}\sum_{i=1}^nX_i^2]=\frac{1}{n}E[\sum_{i=1}^nX_i^2]=\frac{1}{n}\cdot nE[X_1^2]=E[X_1^2]=\mu^2+\sigma^2$

so

$\lim_{n \rightarrow \infty}P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)\leq\frac{1}{\epsilon}(E[Z_n]-E[\sigma^2+\mu^2])=\frac{1}{\epsilon}((\mu^2+\sigma^2)-(\mu^2+\sigma^2))=\frac{1}{\epsilon}\cdot 0\rightarrow 0$

Is this right?
And is it a particular case of Chebyshev's inequality?
$P(|X-a|\geq\epsilon)\leq\frac{1}{\epsilon^p}E[|X-a|^p]$
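The general inequality above can be sanity-checked numerically, say for $p=2$. This is a Monte Carlo sketch with arbitrary illustrative parameters ($a = \mu$, a normal $X$; none of these choices come from the thread):

```python
import random

# Check P(|X - a| >= eps) <= E[|X - a|^p] / eps^p empirically for p = 2,
# with X ~ N(mu, sigma^2) and a = mu (arbitrary illustrative values).
rng = random.Random(7)
mu, sigma, eps, p = 1.0, 2.0, 1.5, 2
xs = [rng.gauss(mu, sigma) for _ in range(200_000)]

lhs = sum(abs(x - mu) >= eps for x in xs) / len(xs)             # P(|X - a| >= eps)
rhs = (sum(abs(x - mu) ** p for x in xs) / len(xs)) / eps ** p  # E[|X - a|^p] / eps^p

print(lhs, rhs)  # the empirical probability should not exceed the bound
```

Here the bound is loose (the RHS is roughly $\sigma^2/\epsilon^2 \approx 1.78$ while the LHS is well below 1), but it does hold, which is all Chebyshev promises.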

Last edited:

#### BGM

##### TS Contributor
Note that when you apply the Markov inequality, it requires a non-negative random variable. So you cannot apply it to a random variable like $X - E[X]$.
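This point can also be seen numerically: for $X - E[X]$ the expectation is zero, so the "bound" $E[X - E[X]]/\epsilon = 0$ would wrongly force the probability to be zero. A quick sketch (the standard-normal choice is just for illustration, not from the thread):

```python
import random

rng = random.Random(42)
eps = 1.0
samples = [rng.gauss(0.0, 1.0) for _ in range(100_000)]  # X ~ N(0, 1), so E[X] = 0

# Empirical P(|X - E[X]| >= eps): clearly positive (about 0.32 for a standard normal).
p = sum(abs(x) >= eps for x in samples) / len(samples)

# The invalid "bound" E[X - E[X]] / eps is essentially zero, so it cannot
# dominate p: Markov must be applied to |X - E[X]|, not to X - E[X].
bad_bound = (sum(samples) / len(samples)) / eps
print(p, bad_bound)
```

The empirical probability stays near 0.32 while the signed-expectation "bound" hovers near 0, showing exactly why dropping the absolute value breaks the argument.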

##### New Member
My last try

If $S_n=\sum_{i=1}^nX_i^2$ and $E[\frac{1}{n}\sum_{i=1}^nX_i^2]=\frac{1}{n}\cdot nE[X_1^2]=\mu^2+\sigma^2$, so

$P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)=P\left(\frac{|S_n-E[S_n]|}{n}\geq\epsilon\right)=P(|S_n-E[S_n]|\geq n\epsilon)\leq\frac{Var(S_n)}{n^2\epsilon^2}\rightarrow0$ when $n\rightarrow\infty$

So $Z_n \rightarrow^p \mu^2+\sigma^2$

Is that right? If not, please show me the solution, because I do not know what else to do.

Last edited:

##### New Member
> Note that when you apply the Markov inequality, it requires a non-negative random variable. So you cannot apply it to a random variable like $X - E[X]$.

Can you verify that this is correct?

#### BGM

##### TS Contributor
Almost done. But you should further show that the term $\frac{Var[S_n]}{n^2}$ indeed converges to 0, since at this point $Var[S_n]$ still depends on $n$.
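To spell out that remaining step (this computation is an addition for reference, not from the thread): by independence, $Var(S_n)=n\,Var(X_1^2)$, and the normal fourth moment gives $Var(X_1^2)=2\sigma^4+4\mu^2\sigma^2<\infty$, so

$\frac{Var(S_n)}{n^2\epsilon^2}=\frac{n\,Var(X_1^2)}{n^2\epsilon^2}=\frac{2\sigma^4+4\mu^2\sigma^2}{n\epsilon^2}\rightarrow 0$ when $n\rightarrow\infty$,

which completes the proof that $Z_n \rightarrow^p \mu^2+\sigma^2$.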