1. ## Convergence question

X1, X2, ..., Xn is a sequence of random variables. Show that Xn converges to b in quadratic mean if and only if

the limit (as n approaches infinity) of E(Xn) = b, and
the limit (as n approaches infinity) of V(Xn) = 0.

We know that Xn converges to X in quadratic mean if E[(Xn - X)^2] -> 0 as n approaches infinity, and that Xn converges to a constant c in quadratic mean if E[(Xn - c)^2] -> 0 as n approaches infinity.

What's throwing me is the role of V(Xn) in solving the problem. Any assistance would be greatly appreciated. Thanks.

2. ## Re: Convergence question

I'm assuming V(Xn) is the variance. Isn't there a way to write the variance in terms of expectations? This is beyond my scope, but that's one thing I'm thinking about.
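The hint above can be written out explicitly; expanding the variance in terms of expectations gives exactly the decomposition the problem needs (a sketch, using the b from the original question):

```latex
% Variance in terms of expectation:
V(X_n) = E[X_n^2] - \left(E[X_n]\right)^2 .

% Expanding the quadratic-mean criterion around b:
E\left[(X_n - b)^2\right] = E[X_n^2] - 2b\,E[X_n] + b^2
                          = V(X_n) + \left(E[X_n] - b\right)^2 .

% Both terms on the right are nonnegative, so
% E[(X_n - b)^2] -> 0  iff  V(X_n) -> 0  and  E[X_n] -> b.
```

Since the two terms on the right-hand side are each nonnegative, the left side tends to 0 if and only if both terms do, which is the "if and only if" asked for.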

3. ## Re: Convergence question

Originally Posted by greg6363
X1, X2, ..., Xn is a sequence of random variables. Show that Xn converges to b in quadratic mean if and only if

the limit (as n approaches infinity) of E(Xn) = b, and
the limit (as n approaching infinity) of V(Xn) = 0.

We know that Xn converges to X in quadratic mean if E[(Xn - X)^2] -> 0 as n approaches infinity, and that Xn converges to a constant c in quadratic mean if E[(Xn - c)^2] -> 0 as n approaches infinity.

What's throwing me is the role of V(Xn) in solving the problem. Any assistance would be greatly appreciated. Thanks.
Would it not be better to think of this in the following manner:

Suppose that $X_1, X_2, \dots$ is a sequence of independent random variables from a common distribution
that has mean $\mu$ and variance $\sigma^2$.

Let $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$ be the sample mean. Thus,

$E\left[(\bar{X}_n - \mu)^2\right] = V(\bar{X}_n) = \dfrac{\sigma^2}{n}$.

As such, this implies that $\bar{X}_n \to \mu$ in quadratic mean as $n \to \infty$.
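This sample-mean example can be checked numerically. A minimal sketch, assuming Uniform(0, 1) samples (so $\mu = 1/2$ and $\sigma^2 = 1/12$), estimating $E[(\bar{X}_n - \mu)^2]$ by Monte Carlo and comparing it with the predicted $\sigma^2 / n$:

```python
import random

def mean_sq_error(n, trials=2000, seed=0):
    """Monte Carlo estimate of E[(X_bar_n - mu)^2] for Uniform(0, 1) samples.

    Here mu = 0.5 and sigma^2 = 1/12, so the theory predicts 1 / (12 n).
    """
    rng = random.Random(seed)
    mu = 0.5
    total = 0.0
    for _ in range(trials):
        xbar = sum(rng.random() for _ in range(n)) / n
        total += (xbar - mu) ** 2
    return total / trials

for n in (10, 100, 1000):
    # Estimated mean-square error vs. the theoretical sigma^2 / n
    print(n, mean_sq_error(n), 1 / (12 * n))
```

As n grows, the estimated mean-square error shrinks roughly in proportion to 1/n, matching $V(\bar{X}_n) = \sigma^2 / n \to 0$, i.e., convergence in quadratic mean.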

