# how to prove COV(y bar, estimated beta1) in linear regression equals 0

#### hehe1223

##### New Member
Yi=Xi*beta1 + beta0 + error
Prove:
COV(average of Yi, estimated beta1) = 0

Thanks!


#### Dragan

##### Super Moderator
Yi=a*beta1 + beta0 + error
Prove:
COV(average of Yi, estimated beta1) = 0

Thanks!
Is "a" just a constant?

#### hehe1223

##### New Member
Is "a" just a constant?
Sorry, typo. I have corrected it.

#### Dragan

##### Super Moderator
Sorry, typo. I have corrected it.

I am sorry to tell you this, but your proposition is not correct.

More specifically, the covariance between the mean of Y and the estimated regression slope is not zero.

Simply, it is:

$$Cov\left [ \bar{Y},b_{1} \right ]=\bar{X}\sigma ^{2}/\sum X_{i}^{2}.$$
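As it emerges later in the thread, this expression corresponds to the slope estimator fitted without an intercept, b1 = ∑XiYi/∑Xi². A quick check of the formula under that assumption, treating Ybar and b1 as linear combinations of the Yi (the design values, error variance, and variable names here are my own, for illustration):

```python
import numpy as np

# Through-origin model: Y_i = beta1 * X_i + e_i, iid errors with Var = sigma2.
# Both Ybar = a @ Y and b1 = c @ Y are linear in Y, so
# Cov(Ybar, b1) = sigma2 * (a @ c).
x = np.array([1.0, 2.0, 3.0, 4.0])
sigma2 = 1.5                       # any positive error variance works

a = np.full(x.size, 1.0 / x.size)  # coefficients giving Ybar
c = x / (x @ x)                    # coefficients giving b1 = sum(X*Y)/sum(X^2)

cov = sigma2 * (a @ c)
claimed = x.mean() * sigma2 / (x @ x)
print(cov, claimed)  # the two agree
```

Both printed values equal Xbar·sigma²/∑Xi² for this design, matching the expression above.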

#### hehe1223

##### New Member
Really? I cannot agree.
I actually worked out a roundabout way of doing it (I don't think it is the intended approach, hence asking here) and indeed got 0.

Some known and provable facts:

$$Var(\hat{\beta }_{1})=\sigma ^{2}/S_{xx}$$

$$Cov(\hat{\beta }_{1},\hat{\beta }_{0})=-\sigma ^{2}\bar{x}/S_{xx}$$ (similar in form to what you claimed for Cov(Ybar, b1))

Using the fact that the fitted line passes through the point of means, i.e. $$\bar{Y}=\hat{\beta }_{0}+\hat{\beta }_{1}\bar{x}$$:

$$Cov(\bar{Y},\hat{\beta }_{1})=E(\bar{Y}\hat{\beta }_{1})-E(\bar{Y})E(\hat{\beta }_{1})$$

$$=E[(\bar{x}\hat{\beta }_{1}+\hat{\beta }_{0})\hat{\beta }_{1}]-(\bar{x}\beta _{1}+\beta _{0})\beta _{1}$$

$$=\bar{x}E(\hat{\beta }_{1}^{2})+E(\hat{\beta }_{1}\hat{\beta }_{0})-\bar{x}\beta _{1}^{2}-\beta _{0}\beta _{1}$$

$$=\bar{x}(Var(\hat{\beta }_{1})+\beta _{1}^{2})+Cov(\hat{\beta }_{1},\hat{\beta }_{0})+\beta _{1}\beta _{0}-\bar{x}\beta _{1}^{2}-\beta _{0}\beta _{1}$$

$$=\bar{x}\,\sigma ^{2}/S_{xx}-\bar{x}\,\sigma ^{2}/S_{xx}=0$$

What do you think?

#### Dragan

##### Super Moderator

Have a look at this link. I sketched the proof in my last post.

#### Dragan

##### Super Moderator

Okay, I think I see what is going on now.

My original assertion (which is true) is for the special case in which the intercept term is zero. I missed that subtle point when I went back and looked at my previous post, i.e., the model was specified without an intercept term.

So, yes, in general, I believe it is correct that the covariance should be zero.

I think a quick way to write this would be:

$$Cov[\bar{Y},\tilde{\beta }_{1}]=E[(\bar{Y}-E[\bar{Y}])(\tilde{\beta }_{1}-E[\tilde{\beta }_{1}])]$$

$$=E[(\bar{Y}-E[\bar{Y}])(\tilde{\beta }_{1}-\beta _{1})]$$

$$=E[(\tilde{\beta }_{0}-\beta _{0})(\tilde{\beta }_{1}-\beta _{1})+\bar{X}(\tilde{\beta }_{1}-\beta _{1})^{2}]$$

$$=Cov[\tilde{\beta }_{0},\tilde{\beta }_{1}]+E[\bar{X}(\tilde{\beta }_{1}-\beta _{1})^{2}]$$

$$=-\bar{X}Var[\tilde{\beta }_{1}]+\bar{X}Var[\tilde{\beta }_{1}]=0$$

where the first term in the last part would not appear if the intercept term is zero.
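The derivation above can also be sanity-checked by simulation. This is a minimal Monte Carlo sketch (the fixed design, normal errors, parameter values, and variable names are my own assumptions), estimating the slope by OLS with an intercept in the model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Model with intercept: Y_i = beta0 + beta1 * X_i + e_i, fixed design x.
x = np.linspace(1.0, 5.0, 20)
beta0, beta1, sigma = 2.0, 0.7, 1.0
sxx = np.sum((x - x.mean()) ** 2)

# Many replicated samples, one per row, all sharing the same x.
n_sims = 200_000
y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=(n_sims, x.size))

ybar = y.mean(axis=1)                        # sample mean of Y, per replication
b1 = (y * (x - x.mean())).sum(axis=1) / sxx  # OLS slope, per replication

print(np.cov(ybar, b1)[0, 1])  # close to 0
```

With the intercept included, the sample covariance between Ybar and the slope estimates is indistinguishable from zero, as the algebra predicts.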

#### hehe1223

##### New Member
Thank you so much!!
I had been wondering about this for a long time.
I did not expect the intercept to make such a big difference. But it makes sense.
Thanks again.

#### natswim

##### New Member
how to prove COV(estimated beta0, estimated beta1) in linear regression equals 0?

Yi=Xi*beta1 + beta0 + error

Prove:
COV(estimated beta0, estimated beta1) = 0

Thanks!

#### Dragan

##### Super Moderator
Re: how to prove COV(estimated beta0, estimated beta1) in linear regression equals 0

Where are you getting the idea that the covariance between the estimated intercept term and the estimated slope coefficient would, in general, be zero?

#### natswim

##### New Member

I'm so sorry...
I want to know this:

Yi=Xi*beta1 + beta0 + error

What is COV(estimated beta0, estimated beta1)?

Thanks!

Sorry!!

#### Dragan

##### Super Moderator

You can find the proof of this in other threads here - I did it not too long ago. This question has come up a few times. It goes like this (I'll be a little neater this time):

$$Cov\left [ b_{0} ,b_{1}\right ]=E\left [ \left ( b_{0}-E\left [ b_{0}\right ] \right )\left ( b_{1}-E\left [b _{1} \right ] \right ) \right ]$$

$$=E\left [ \left ( b_{0}-\beta _{0} \right )\left (b _{1} -\beta _{1}\right ) \right ]$$

$$=-\bar{X}E\left [ \left ( b_{1}-\beta _{1} \right )^{2} \right ]$$

$$=-\bar{X}Var\left [b _{1} \right ]$$

$$=-\bar{X}\frac{\sigma ^{2}}{SS_{X}}$$

where I am making use of

$$b_{0}=\bar{Y}-b_{1}\bar{X}$$

$$E\left [b _{0} \right ]=\bar{Y}-\beta _{1}\bar{X}$$

giving

$$\left (b _{0}-E\left [b _{0} \right ] \right )=-\bar{X}\left ( b_{1}-\beta _{1} \right )$$.
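The closed form $$-\bar{X}\sigma ^{2}/SS_{X}$$ can likewise be checked by simulation; a rough sketch under assumptions of my own choosing (fixed design, normal errors, illustrative parameter values and variable names):

```python
import numpy as np

rng = np.random.default_rng(1)

# Y_i = beta0 + beta1 * X_i + e_i, with a fixed design x.
x = np.linspace(0.0, 2.0, 15)
beta0, beta1, sigma = 1.0, -0.5, 0.8
sxx = np.sum((x - x.mean()) ** 2)

n_sims = 200_000
y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=(n_sims, x.size))

b1 = (y * (x - x.mean())).sum(axis=1) / sxx  # OLS slope, per replication
b0 = y.mean(axis=1) - b1 * x.mean()          # OLS intercept, per replication

empirical = np.cov(b0, b1)[0, 1]
theoretical = -x.mean() * sigma**2 / sxx
print(empirical, theoretical)  # close to each other
```

Since this design has a positive mean, the covariance between the two estimators comes out negative, matching the formula.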

#### natswim

##### New Member

Thanks for the answer. You are brilliant!


#### natswim

##### New Member

Thanks for the answer. You are brilliant!

#### Dragan

##### Super Moderator

Well, actually, I am not brilliant... this is fairly basic stuff.

#### hehe1223

##### New Member

The last part looks suspicious. Shouldn't it be E(Ybar) instead of Ybar?
I'm confused.

#### Dragan

##### Super Moderator

What I wrote is correct, hehe1223.

More specifically, the sample regression function can be expressed as:

$$Y_{i}=\hat{\beta _{0}}+\hat{\beta _{1}}X_{i}+e_{i}$$

which estimates the population regression function written as:

$$Y_{i}=\beta _{0}+\beta _{1}X_{i}+u_{i}$$.

As such, it is legitimate to write,

$$\bar{Y}=\beta _{0}+\beta _{1}\bar{X}+\bar{u}$$.

Now, we know that the estimate of the intercept term is unbiased, i.e.

$$E\left [\hat{\beta _{0}}\right ]=\beta _{0}$$

Therefore,

$$\beta _{0}=\bar{Y}-\beta _{1}\bar{X}$$

where the mean of the stochastic disturbance term is (by assumption) zero.

#### hehe1223

##### New Member

I don't think you are correct. When there is a finite number of observations, the errors do not sum to 0!

#### Dragan

##### Super Moderator

Okay, let me back up and repeat.

That is, again, it is as I said:

$$Cov\left [ b_{0} ,b_{1}\right ]=E\left [ \left ( b_{0}-E\left [ b_{0}\right ] \right )\left ( b_{1}-E\left [b _{1} \right ] \right ) \right ]$$

$$=E\left [ \left ( b_{0}-\beta _{0} \right )\left (b _{1} -\beta _{1}\right ) \right ]$$

$$=-\bar{X}E\left [ \left ( b_{1}-\beta _{1} \right )^{2} \right ]$$

$$=-\bar{X}Var\left [b _{1} \right ]$$

where I am making use of

$$b_{0}=\bar{Y}-b_{1}\bar{X}$$

$$E\left [b _{0} \right ]=\bar{Y}-\beta _{1}\bar{X}$$

giving

$$\left (b _{0}-E\left [b _{0} \right ] \right )=-\bar{X}\left ( b_{1}-\beta _{1} \right )$$.

Now, if you want to provide a different algebraic proof, then please do so and I'll look at it, okay?

#### hehe1223

##### New Member

The problem is that you cannot assume E(Ybar) = Ybar; with a finite number of observations, the error terms do not sum to 0.
I later found a way to prove Cov(ybar, slopehat) = 0:
Express slopehat as Sxy/Sxx, whose numerator is a sum of (xi - xbar)*yi, and express ybar as (sum of yi)/n. Because the yi are independent, Cov(yj, yi) = 0 for j ≠ i and Var(yi) equals the error variance, so the covariance reduces to (sigma^2/(n*Sxx)) * sum of (xi - xbar). Using sum of (xi - xbar) = 0, the final answer is 0.
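This closing argument can be compressed further: Ybar and the slope estimate are both linear combinations a·y and c·y of the responses, and with independent errors Cov(a·y, c·y) = sigma²·(a·c). A small sketch (the design values and names are my own) showing that a·c vanishes exactly:

```python
import numpy as np

x = np.array([1.3, 2.0, 2.7, 4.1, 5.5])  # any design works
n = x.size
sxx = np.sum((x - x.mean()) ** 2)

a = np.full(n, 1.0 / n)   # Ybar = a @ y
c = (x - x.mean()) / sxx  # slopehat = Sxy/Sxx = c @ y

# For iid errors, Cov(a @ y, c @ y) = sigma^2 * (a @ c), and
# a @ c = sum(x - xbar) / (n * sxx) = 0 for any design x.
print(a @ c)  # 0 up to floating-point error
```

Because a·c = 0 identically, the covariance is zero for every sigma², exactly as the sum-of-deviations argument says.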