
Thread: Convergence

  1. #1
    askazy

    Convergence




    If Z_n = \frac{1}{n}\sum_{i=1}^n X_i^2 where X_i \sim N(\mu,\sigma^2),

    1) Show that Z_n \rightarrow^p \mu^2+\sigma^2 (convergence in probability).

    I have no idea how to show that
    \lim_{n \rightarrow \infty} P(|Z_n-(\mu^2+\sigma^2)|>\epsilon)=0

  2. #2
    askazy

    Re: Convergence

    Look, I know that \frac{1}{n}\sum_{i=1}^n X_i^2 = E[X^2] is the second sample moment, and E[X^2] = \mu^2+\sigma^2, so for rth-mean convergence I have \lim_{n \rightarrow \infty}E[|Z_n-(\mu^2+\sigma^2)|]=0, and convergence in rth mean implies convergence in probability.

    The other idea is
    \lim_{n \rightarrow \infty}P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)=\lim_{n \rightarrow \infty}P(|E[X^2]-(\mu^2+\sigma^2)|\geq\epsilon)=\lim_{n \rightarrow \infty}P(\mu^2+\sigma^2-(\mu^2+\sigma^2)\geq\epsilon)=\lim_{n \rightarrow \infty}P(0\geq\epsilon)\rightarrow0

    Are any of these ideas right?

  3. #3
    BGM (TS Contributor)

    Re: Convergence

    This is just a particular case of the Weak Law of Large Numbers, which is typically proved by applying Chebyshev's inequality. In this case you can apply the inequality because all the moments (in particular the 4th moment) of a normal random variable exist.
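
    (For reference, a sketch of the Chebyshev route being suggested, assuming the X_i are independent: Z_n is the sample mean of the variables X_i^2, with

    E[Z_n] = \mu^2+\sigma^2, \qquad Var(Z_n) = \frac{Var(X_1^2)}{n},

    so Chebyshev's inequality gives

    P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon) \leq \frac{Var(Z_n)}{\epsilon^2} = \frac{Var(X_1^2)}{n\epsilon^2} \rightarrow 0,

    where Var(X_1^2) is finite precisely because the 4th moment of a normal random variable exists.)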

  4. #4
    askazy

    Re: Convergence

    Quote Originally Posted by BGM
    This is just a particular case of the Weak Law of Large Numbers, which is typically proved by applying Chebyshev's inequality. In this case you can apply the inequality because all the moments (in particular the 4th moment) of a normal random variable exist.
    What did I do wrong?

  5. #5
    BGM (TS Contributor)

    Re: Convergence

    The first idea is alright - if you already have the result of convergence in mean.
    i.e. Are you sure you can directly use the result

    \lim_{n \to +\infty}E[|Z_n - (\mu^2 + \sigma^2)|] = 0

    without proving it?


    For the second idea, the step

    \lim_{n\to+\infty} \Pr\{|Z_n - (\mu^2 + \sigma^2)| \geq \epsilon\} = \lim_{n\to+\infty} \Pr\{|E[Z_n] - (\mu^2 + \sigma^2)| \geq \epsilon\}

    is wrong. As a quick check: the RHS does not actually depend on n, and there is no random variable left inside the probability, so it is identically equal to zero.
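
    (Spelling that out: since E[Z_n] = \mu^2+\sigma^2 is a constant, \Pr\{|E[Z_n]-(\mu^2+\sigma^2)|\geq\epsilon\} = \Pr\{0\geq\epsilon\} = 0 for every n and every \epsilon>0, which says nothing about the random variable Z_n itself.)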

  6. #6
    askazy

    Re: Convergence

    \lim_{n \rightarrow \infty}P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)\leq\frac{1}{\epsilon}E[Z_n-(\mu^2+\sigma^2)]=\frac{1}{\epsilon}(E[Z_n]-E[\sigma^2+\mu^2])
    where E[Z_n]=E[\frac{1}{n}\sum_{i=1}^n X_i^2]=\frac{1}{n}E[\sum_{i=1}^n X_i^2]=\frac{1}{n}\cdot n E[X_1^2]=E[X_1^2]=\mu^2+\sigma^2

    so

    \lim_{n \rightarrow \infty}P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)\leq\frac{1}{\epsilon}(E[Z_n]-E[\sigma^2+\mu^2])=\frac{1}{\epsilon}((\mu^2+\sigma^2)-(\mu^2+\sigma^2))=\frac{1}{\epsilon}(0)\rightarrow0

    Is this right?
    Is it a particular case of Chebyshev's inequality anyway?
    P(|X-a|\geq\epsilon)\leq\frac{1}{\epsilon^p}E[|X-a|^p]

  7. #7
    askazy

    Re: Convergence

    Help please?

  8. #8
    BGM (TS Contributor)

    Re: Convergence

    Note that Markov's inequality requires a non-negative random variable, so you cannot apply it to a random variable like X - E[X].
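
    (For reference, the standard way around this is to apply Markov's inequality to the non-negative random variable |X-a|^p, which yields exactly the inequality quoted in post #6; with p = 2 and a = E[X] it is Chebyshev's inequality:

    P(|X-a|\geq\epsilon) = P(|X-a|^p\geq\epsilon^p) \leq \frac{E[|X-a|^p]}{\epsilon^p}.)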

  9. #9
    askazy

    Re: Convergence

    My last try

    If S_n=\sum_{i=1}^n X_i^2 and E[\frac{1}{n}\sum_{i=1}^n X_i^2]=\frac{1}{n}\cdot n E[X_1^2]=\mu^2+\sigma^2, then

    P(|Z_n-(\mu^2+\sigma^2)|\geq\epsilon)=P\left(\frac{|S_n-E[S_n]|}{n}\geq\epsilon\right)=P(|S_n-E[S_n]|\geq n\epsilon)\leq\frac{Var(S_n)}{n^2\epsilon^2}\rightarrow0 as n\rightarrow\infty

    So Z_n \rightarrow^p \mu^2+\sigma^2

    Is that right? If not, please solve it for me, because I do not know what to do.

  10. #10
    askazy

    Re: Convergence

    bump for help.

  11. #11
    askazy

    Re: Convergence

    Quote Originally Posted by BGM
    Note that Markov's inequality requires a non-negative random variable, so you cannot apply it to a random variable like X - E[X].
    Can you verify that this is correct?

  12. #12
    BGM (TS Contributor)

    Re: Convergence

    Almost done. But you should further show that the term \frac{Var[S_n]}{n^2} indeed converges to 0, since at this point Var[S_n] still depends on n.
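
    (A sketch of that last step, using the standard formula for the fourth moment of a normal random variable and assuming the X_i are independent:

    Var(S_n) = n\,Var(X_1^2) = n\left(E[X_1^4]-(E[X_1^2])^2\right) = n\left((\mu^4+6\mu^2\sigma^2+3\sigma^4)-(\mu^2+\sigma^2)^2\right) = n(4\mu^2\sigma^2+2\sigma^4),

    so \frac{Var(S_n)}{n^2\epsilon^2} = \frac{4\mu^2\sigma^2+2\sigma^4}{n\epsilon^2} \rightarrow 0 as n\rightarrow\infty.)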

  13. #13
    askazy

    Re: Convergence


    Quote Originally Posted by BGM
    Almost done. But you should further show that the term \frac{Var[S_n]}{n^2} indeed converges to 0, since at this point Var[S_n] still depends on n.
    Thank you, solved.
