
Thread: Convergence in probability

  1. #1
    askazy

    Convergence in probability




    If X_1,\dots,X_n are i.i.d. with mean \mu and variance \sigma^2, Z_n=\frac{1}{n}\sum_{i=1}^n X_i^2 and W_n=\frac{1}{n}\sum_{i=1}^n(X_i-\bar{X})^2,

    show that V_n=Z_n - W_n \rightarrow^p \mu^2.

    What I think:
    Z_n-W_n=\frac{1}{n}\sum_{i=1}^n\left[X_i^2-(X_i-\bar{X})^2\right], and expanding gives (X_i-\bar{X})^2=X_i^2-2X_i\bar{X}+\bar{X}^2,

    so

    Z_n-W_n=\frac{1}{n}\sum_{i=1}^n\left(2X_i\bar{X}-\bar{X}^2\right). Setting S_n=\sum_{i=1}^n\left(2X_i\bar{X}-\bar{X}^2\right) gives Z_n-W_n=\frac{S_n}{n}.
    E[S_n]=E\left[\sum_{i=1}^n 2X_i\bar{X}-\bar{X}^2\right], assuming X_i and \bar{X} are independent:
    E[S_n]=2n\,E[X_1]E[\bar{X}]-n\,E[\bar{X}]^2=2n\mu^2-n\mu^2=n\mu^2. Now applying Chebyshev's inequality,
    P(|(Z_n-W_n)-\mu^2|\geq\epsilon)=P\left(\left|\frac{S_n-E[S_n]}{n}\right|\geq\epsilon\right)=P(|S_n-E[S_n]|\geq n\epsilon)\leq\frac{Var(S_n)}{n^2\epsilon^2}\rightarrow 0.
    Finally,
    V_n=Z_n - W_n \rightarrow^p \mu^2.
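
    A quick Monte Carlo check makes the claim concrete. This is only an illustrative sketch, not part of a proof: the Normal(2, 1) distribution (so \mu^2 = 4) and the cutoff 0.25 are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma = 2.0, 1.0  # illustrative; any distribution with finite variance works

    def simulate_v(n, reps=10_000):
        # Draw `reps` independent samples of size n; return V_n = Z_n - W_n for each.
        x = rng.normal(mu, sigma, size=(reps, n))
        z = (x ** 2).mean(axis=1)                                    # Z_n
        w = ((x - x.mean(axis=1, keepdims=True)) ** 2).mean(axis=1)  # W_n
        return z - w

    for n in (10, 100, 1_000, 10_000):
        v = simulate_v(n)
        # The fraction of runs with |V_n - mu^2| >= 0.25 shrinks toward 0 as n grows,
        # which is what convergence in probability asserts.
        print(n, round(v.mean(), 4), np.mean(np.abs(v - mu ** 2) >= 0.25))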

  2. #2
    BGM

    Re: Convergence in probability

    I guess you cannot assume X_i and \bar{X} are independent. One crucial thing here which you may have missed is that

    Z_n - W_n = \bar{X}^2

    after some simplifications.

  3. #3
    askazy

    Re: Convergence in probability

    Quote Originally Posted by BGM
    I guess you cannot assume X_i and \bar{X} are independent. One crucial thing here which you may have missed is that

    Z_n - W_n = \bar{X}^2

    after some simplifications.
    I cannot see how Z_n - W_n = \bar{X}^2

  4. #4
    Dason

    Re: Convergence in probability

    Quote Originally Posted by askazy
    I cannot see how Z_n - W_n = \bar{X}^2
    I'll provide you some assurance that the equality in question does in fact hold. Write everything out in summation form, expand it all, cancel the terms that you can, and things will simplify. One thing to keep in mind:

    \sum_{i=1}^n X_i = n\bar{X}

    It's just a very simple restatement of the definition of \bar{X}, but it's useful to keep in mind when dealing with these summation problems.
    I don't have emotions and sometimes that makes me very sad.
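
    Spelling out that hint, one way the algebra can go (a sketch using \sum_{i=1}^n X_i = n\bar{X} in the cross term):

    W_n=\frac{1}{n}\sum_{i=1}^n(X_i-\bar{X})^2=\frac{1}{n}\sum_{i=1}^n X_i^2-\frac{2\bar{X}}{n}\sum_{i=1}^n X_i+\bar{X}^2=Z_n-2\bar{X}^2+\bar{X}^2=Z_n-\bar{X}^2,

    so Z_n - W_n = \bar{X}^2.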

  5. #5
    askazy

    Re: Convergence in probability

    Quote Originally Posted by Dason
    I'll provide you some assurance that the equality in question does in fact hold. Write everything out in summation form, expand it all, cancel the terms that you can, and things will simplify. One thing to keep in mind:

    \sum_{i=1}^n X_i = n\bar{X}

    It's just a very simple restatement of the definition of \bar{X}, but it's useful to keep in mind when dealing with these summation problems.
    Yeah, I can see it now. But I wonder whether the way I applied Chebyshev's inequality is correct.

    Man, I'm really confused now: should it be E[\bar{X}^2] or E[\bar{X}]^2?

  6. #6
    askazy

    Re: Convergence in probability

    Quote Originally Posted by BGM
    I guess you cannot assume X_i and \bar{X} are independent. One crucial thing here which you may have missed is that

    Z_n - W_n = \bar{X}^2

    after some simplifications.
    Anyway, is this right?
    E[\bar{X}]=\mu\rightarrow E[\bar{X}]^2=\mu^2
    Var(\bar{X})=\frac{\sigma^2}{n}\rightarrow Var(\bar{X})^2=\frac{\sigma^4}{n^2}

    0\leq P(|\bar{X}^2-\mu^2|\geq\epsilon)\leq\frac{Var(\bar{X})^2}{\epsilon^2}\rightarrow0
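
    A quick simulation can probe exactly this question (an illustrative sketch; Normal(2, 1) with n = 100 is an arbitrary choice). It suggests the variance of \bar{X}^2 is much larger than Var(\bar{X})^2, and that E[\bar{X}^2] is not exactly \mu^2:

    import numpy as np

    rng = np.random.default_rng(1)
    mu, sigma, n, reps = 2.0, 1.0, 100, 50_000  # illustrative parameters

    # Simulate many realisations of xbar and look at xbar^2 directly.
    xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    print("empirical Var(xbar^2):", (xbar ** 2).var())     # approx 4*mu^2*sigma^2/n = 0.16
    print("Var(xbar)^2:          ", (sigma ** 2 / n) ** 2) # 1e-04, far smaller
    print("empirical E[xbar^2]:  ", (xbar ** 2).mean())    # mu^2 + sigma^2/n = 4.01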

  7. #7
    BGM

    Re: Convergence in probability


    The Chebyshev inequality is:

    \Pr\{|X - E[X]| \geq kSD[X]\} \leq \frac {1} {k^2}

    where k > 0. In particular, putting k = \frac{\epsilon}{SD[X]}, we have

    \Pr\{|X - E[X]| \geq \epsilon\} \leq \frac {Var[X]} {\epsilon^2}


    From your work I think you have applied it correctly. The spirit of this useful bound is that if you want to show a sequence of random variables converges in probability to its common mean, you just need to show that the variance converges to zero.

    The tricky part here is that

    1. the calculation of the variance can be quite tedious
    2. the mean of \bar{X}_n^2 depends on n: E[\bar{X}_n^2]=\mu^2+\frac{\sigma^2}{n} converges to \mu^2 but is not exactly equal to \mu^2 (\bar{X}_n^2 is an asymptotically unbiased, but biased, estimator of \mu^2). So you may need a small adjustment before applying the inequality; one possible adjustment is sketched after this list.
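
    One possible form of that adjustment (a sketch, assuming Var(\bar{X}_n^2) \rightarrow 0, which holds for instance when X_i has a finite fourth moment): by the triangle inequality,

    |\bar{X}_n^2-\mu^2|\leq|\bar{X}_n^2-E[\bar{X}_n^2]|+|E[\bar{X}_n^2]-\mu^2|=|\bar{X}_n^2-E[\bar{X}_n^2]|+\frac{\sigma^2}{n},

    so for all n large enough that \frac{\sigma^2}{n}<\frac{\epsilon}{2},

    P(|\bar{X}_n^2-\mu^2|\geq\epsilon)\leq P\left(|\bar{X}_n^2-E[\bar{X}_n^2]|\geq\frac{\epsilon}{2}\right)\leq\frac{4\,Var(\bar{X}_n^2)}{\epsilon^2}\rightarrow 0.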
