# Thread: how to prove COV(y bar, estimated beta1) in linear regression equals 0

1. ## Re: how to prove COV(estimated beta0, estimated beta1) in linear regression is equals

Originally Posted by Dragan
You can find the proof of this in other threads here - I did it not too long ago. This question has come up a few times. It goes like this (I'll be a little neater this time):

$$\operatorname{Cov}(\hat\beta_0, \hat\beta_1) = E[\hat\beta_0\hat\beta_1] - E[\hat\beta_0]E[\hat\beta_1] = E[(\bar{Y} - \hat\beta_1\bar{x})\hat\beta_1] - (\bar{Y} - \beta_1\bar{x})\beta_1$$

where I am making use of

$$\hat\beta_0 = \bar{Y} - \hat\beta_1\bar{x}, \qquad E[\hat\beta_1] = \beta_1, \qquad E[\hat\beta_0] = \bar{Y} - \beta_1\bar{x},$$

giving

$$\operatorname{Cov}(\hat\beta_0, \hat\beta_1) = -\bar{x}\left(E[\hat\beta_1^2] - \beta_1^2\right) = -\bar{x}\operatorname{Var}(\hat\beta_1) = -\frac{\bar{x}\,\sigma^2}{\sum_{i=1}^{n}(x_i - \bar{x})^2}.$$

The last part looks suspicious. Shouldn't it be E(Ybar) instead of Ybar??
Confused.

2. ## Re: how to prove COV(estimated beta0, estimated beta1) in linear regression is equals

Originally Posted by hehe1223
The last part looks suspicious. Shouldn't it be E(Ybar) instead of Ybar??
Confused.
What I wrote is correct hehe1223.

More specifically, the sample regression function can be expressed as:

$$\hat{Y}_i = \hat\beta_0 + \hat\beta_1 x_i,$$

which estimates the population regression function written as:

$$Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i.$$

As such, it is legitimate to write,

$$\bar{Y} = \beta_0 + \beta_1\bar{x} + \bar{\varepsilon}.$$

Now, we know that the estimate of the intercept term is an unbiased estimate i.e.

$$E[\hat\beta_0] = \beta_0.$$

Therefore,

$$E[\hat\beta_0] = \beta_0 = \bar{Y} - \beta_1\bar{x},$$

where the mean of the stochastic disturbance term is (by assumption) zero.

3. ## Re: how to prove COV(estimated beta0, estimated beta1) in linear regression is equals

Originally Posted by Dragan
What I wrote is correct hehe1223. More specifically, the sample regression function ... estimates the population regression function ... Now, we know that the estimate of the intercept term is an unbiased estimate ... where the mean of the stochastic disturbance term is (by assumption) zero.
I don't think you are correct. When there is a finite number of observations, the errors do not sum to zero!
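For what it's worth, this point can be illustrated numerically. The sketch below (not from any post in this thread; the seed, sample size, and error distribution are arbitrary choices) shows that one finite sample of zero-mean disturbances has a nonzero sample mean, while the expectation statement $E(\bar\varepsilon) = 0$ only holds across repeated samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# One finite sample of n = 20 disturbances whose *population* mean is zero.
eps = rng.normal(loc=0.0, scale=1.0, size=20)
print(eps.mean())  # close to zero, but not zero

# E(eps_bar) = 0 is a statement about repeated sampling: the average of
# the sample means over many independent samples does approach zero.
sample_means = rng.normal(loc=0.0, scale=1.0, size=(100_000, 20)).mean(axis=1)
print(sample_means.mean())
```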

4. ## Re: how to prove COV(estimated beta0, estimated beta1) in linear regression is equals

Originally Posted by hehe1223
I don't think you are correct. When there are finite number of observations. Errors do not sum up to 0!
Okay, let me back up and repeat.

That is, again, it is as I said:

$$\operatorname{Cov}(\hat\beta_0, \hat\beta_1) = E[\hat\beta_0\hat\beta_1] - E[\hat\beta_0]E[\hat\beta_1] = E[(\bar{Y} - \hat\beta_1\bar{x})\hat\beta_1] - (\bar{Y} - \beta_1\bar{x})\beta_1$$

where I am making use of

$$\hat\beta_0 = \bar{Y} - \hat\beta_1\bar{x}, \qquad E[\hat\beta_1] = \beta_1, \qquad E[\hat\beta_0] = \bar{Y} - \beta_1\bar{x},$$

giving

$$\operatorname{Cov}(\hat\beta_0, \hat\beta_1) = -\bar{x}\operatorname{Var}(\hat\beta_1) = -\frac{\bar{x}\,\sigma^2}{\sum_{i=1}^{n}(x_i - \bar{x})^2}.$$

Now, if you want to provide a different algebraic proof, then please do so and I'll look at it, Mkay.

5. ## Re: how to prove COV(estimated beta0, estimated beta1) in linear regression is equals

The problem is that you cannot assume $E(\bar{Y}) = \bar{Y}$; the error terms do not sum to zero.
I later found a way to prove $\operatorname{Cov}(\bar{y}, \hat\beta_1) = 0$:
Express $\hat\beta_1$ as $S_{xy}/S_{xx}$; the numerator can be written as $\sum_i (x_i - \bar{x})y_i$. Express $\bar{y}$ as $\frac{1}{n}\sum_i y_i$. Because the $y_i$ are independent, $\operatorname{Cov}(y_i, y_j) = 0$ for $i \neq j$, while $\operatorname{Cov}(y_i, y_i) = \operatorname{Var}(y_i)$, the variance of the error. Finally, make use of $\sum_i (x_i - \bar{x}) = 0$ and you will get the final answer of 0.
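This argument can also be checked by simulation. A minimal sketch with numpy (the design points, coefficients, and error variance below are arbitrary assumptions; x is held fixed across replications, since the claim is conditional on x):

```python
import numpy as np

rng = np.random.default_rng(42)

# Fixed design points (we condition on x, so x stays the same across
# replications). These values, the coefficients, and sigma are arbitrary.
x = np.array([1.0, 2.0, 4.0, 7.0, 11.0])
sxx = np.sum((x - x.mean()) ** 2)
sigma = 2.0

reps = 200_000
y = 3.0 + 0.5 * x + rng.normal(0.0, sigma, size=(reps, x.size))

ybar = y.mean(axis=1)                                 # ybar = (1/n) * sum(y_i)
beta1_hat = ((x - x.mean()) * y).sum(axis=1) / sxx    # slope = Sxy / Sxx

# Empirical covariance across replications; the theory says it is exactly 0.
cov = np.cov(ybar, beta1_hat)[0, 1]
print(cov)
```

The empirical covariance hovers near zero and shrinks as the number of replications grows, as the algebra predicts.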

6. ## The Following 2 Users Say Thank You to hehe1223 For This Useful Post:

fouriertransformer (05-25-2012), lemonlime (04-05-2013)

7. ## Re: how to prove COV(estimated beta0, estimated beta1) in linear regression is equals

Originally Posted by hehe1223
The problem is that you cannot assume E(Ybar)=Ybar, error terms do not sum to 0.
I later found a way to prove Cov(ybar, slopehat) = 0
if you express slopehat as Sxy/Sxx, you get numerator in terms of sum of (xi-xbar)*yi, express ybar as sum of yi/n. Cov(sum of yi, yi) = var(yi)=variance of error, because all yi are independent. make use of sum of (xi-xbar)=0, you will get final answer as 0.
I'll repeat, please provide me with your proof of the following (which is true) and I will look at it, Mkay.

Here it is:

$$\operatorname{Cov}(\hat\beta_0, \hat\beta_1) = -\frac{\bar{x}\,\sigma^2}{\sum_{i=1}^{n}(x_i - \bar{x})^2}.$$

Now, show me your algebraic proof.

8. ## Re: how to prove COV(estimated beta0, estimated beta1) in linear regression is equals

This thread is rather old, but I came across it while trying to answer the same question and I thought I should provide the details for others who may inquire in the future. User hehe1223 gives the correct approach; namely:

First express the estimate of the slope as

$$\hat\beta_1 = \frac{S_{xy}}{S_{xx}} = \frac{\sum_{i=1}^{n}(x_i - \bar{x})y_i}{\sum_{i=1}^{n}(x_i - \bar{x})^2},$$

where $S_{xy} = \sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y}) = \sum_{i=1}^{n}(x_i - \bar{x})y_i$ and $S_{xx} = \sum_{i=1}^{n}(x_i - \bar{x})^2$. Now, as hehe1223 suggests, express $\bar{y}$ as a sum of $y_j$'s and use the bilinearity of the covariance operator:

$$\operatorname{Cov}(\bar{y}, \hat\beta_1) = \operatorname{Cov}\!\left(\frac{1}{n}\sum_{j=1}^{n} y_j,\ \frac{1}{S_{xx}}\sum_{i=1}^{n}(x_i - \bar{x})y_i\right) = \frac{1}{nS_{xx}}\sum_{j=1}^{n}\sum_{i=1}^{n}(x_i - \bar{x})\operatorname{Cov}(y_j, y_i).$$

(Note that we are implicitly conditioning on the $x_i$'s, hence we may treat them as constants, i.e. non-random.) Now, the $y_i$'s are usually assumed to be independent; hence $\operatorname{Cov}(y_j, y_i) = 0$ for $j \neq i$, so that only the terms in the double sum with $j = i$ survive, and the expression becomes a single sum. The usual assumption is that the error terms are homoscedastic, which is a fancy way of saying that they all have the same variance, say $\sigma^2$. Then $\operatorname{Cov}(y_i, y_i) = \operatorname{Var}(y_i) = \sigma^2$, so that we have

$$\operatorname{Cov}(\bar{y}, \hat\beta_1) = \frac{\sigma^2}{nS_{xx}}\sum_{i=1}^{n}(x_i - \bar{x}) = 0,$$

where we have used the fact that $\sum_{i=1}^{n}(x_i - \bar{x}) = 0$ for the last equality. I hope someone finds this helpful.
Note that the equality that Dragan asserts now follows from this. In their notation (see previous posts):

$$\operatorname{Cov}(\hat\beta_0, \hat\beta_1) = \operatorname{Cov}(\bar{y} - \hat\beta_1\bar{x},\ \hat\beta_1) = \operatorname{Cov}(\bar{y}, \hat\beta_1) - \bar{x}\operatorname{Var}(\hat\beta_1) = -\frac{\bar{x}\,\sigma^2}{\sum_{i=1}^{n}(x_i - \bar{x})^2}.$$
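As a numerical check of this intercept-slope covariance result, here is a sketch under the same conditional-on-x, independent, homoscedastic assumptions (the specific design points, coefficients, and sigma are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)

x = np.array([1.0, 2.0, 4.0, 7.0, 11.0])   # arbitrary fixed design
sigma = 2.0
xbar = x.mean()
sxx = np.sum((x - xbar) ** 2)

reps = 200_000
y = 3.0 + 0.5 * x + rng.normal(0.0, sigma, size=(reps, x.size))

# OLS estimates for each replication.
beta1_hat = ((x - xbar) * y).sum(axis=1) / sxx
beta0_hat = y.mean(axis=1) - beta1_hat * xbar

# Compare the empirical covariance with the closed form -xbar * sigma^2 / Sxx.
empirical = np.cov(beta0_hat, beta1_hat)[0, 1]
theoretical = -xbar * sigma**2 / sxx
print(empirical, theoretical)
```

The two numbers agree to within Monte Carlo error, and (for positive $\bar{x}$) the covariance is negative: overestimating the slope drags the fitted intercept down.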
9. ## The Following 3 Users Say Thank You to fouriertransformer For This Useful Post:

lemonlime (04-05-2013), M!ss Moon (04-26-2013), runrunrun (09-10-2015)

10. ## Re: how to prove COV(y bar, estimated beta1) in linear regression equals 0

"This thread is rather old, but I came across it..I hope someone finds this helpful....Note that the equality that Dragan asserts now follows from this."

What I wrote in 2008 is correct and still holds today. In short, stop making yourself look like an idiot and wasting my time, Mkay.

11. ## Re: how to prove COV(estimated beta0, estimated beta1) in linear regression is equals

Thanks for the details, fouriertransformer! Like user hehe1223, I had the hint to show cov(ybar, beta1_hat) = 0 first, but it was not clear how to show it. You make it very clear and easy to follow. Thanks!

12. ## Re: how to prove COV(y bar, estimated beta1) in linear regression equals 0

Originally Posted by Dragan
"This thread is rather old, but I came across it..I hope someone finds this helpful....Note that the equality that Dragan asserts now follows from this."

What I wrote in 2008 is correct and still holds today. In short, stop making yourself look like an idiot and wasting my time, Mkay.
I am surprised that a super moderator would reply in this way, plus the fact that you are the one who made a mistake in the proof. Just look at the key part of your proof: beta_0 = y^bar - beta_1*x^bar. Y^bar is the only random variable in this equation; how can you equate an unknown constant with a random variable? Yes, part of what you wrote in 2008 is correct, and that is the conclusion part. And hehe1223 correctly pointed out your mistake for you. No one is wasting your time!