Hello everyone. I am taking a class in regression that's way over my head. We are working with the sample model Y = b0 + b1X1 + b2X2 + e. I am asked to show that the correlation between the predictor X1 and the predicted values Yhat can be obtained by dividing the simple correlation cor(X1, Y) by the square root of R-squared. I have verified this with sample datasets and I can see the relationship holds. So it is true that:

cor(X1, Yhat) = cor(X1, Y) / sqrt(R^2)
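As a sanity check of the claimed identity, here is a minimal numerical sketch (not from the original post; it assumes numpy and simulated data with illustrative coefficients) that fits the two-predictor model by least squares and compares cor(X1, Yhat) with cor(X1, Y) / R:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)          # correlated predictors
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

# fit Y = b0 + b1*X1 + b2*X2 by ordinary least squares
X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ b

def cor(a, c):
    return np.corrcoef(a, c)[0, 1]

R = cor(y, yhat)            # multiple correlation = sqrt(R-squared)
lhs = cor(x1, yhat)
rhs = cor(x1, y) / R
print(lhs, rhs)             # the two agree up to floating-point error
```

With an intercept in the model the identity is exact, so the two printed numbers match to machine precision.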
This is what I have tried so far but I think I'm not getting anywhere. Does anyone have a hint on how I can proceed?
I am starting off by noticing:

cor(X1, Yhat) = cov(X1, Yhat) / (sd(X1) * sd(Yhat))
Now, I know that Yhat = b0 + b1X1 + b2X2. The intercept is a constant, so I can drop it, and using the rules of covariance algebra I can expand the numerator to:

cov(X1, Yhat) = b1*var(X1) + b2*cov(X1, X2)
But when I expand the denominator (the product of the square roots of var(X1) and var(Yhat)), I end up with

cor(X1, Yhat) = [b1*var(X1) + b2*cov(X1, X2)] / [sd(X1) * sqrt(b1^2*var(X1) + b2^2*var(X2) + 2*b1*b2*cov(X1, X2))]

where the big term under the square root is var(Yhat).
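The two expansions above (the numerator cov(X1, Yhat) and the variance of Yhat) can be checked numerically. This sketch assumes numpy and made-up simulated data; the covariance identities themselves are exact consequences of the bilinearity of the sample covariance:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = 0.4 * x1 + rng.normal(size=n)
y = 2.0 + 1.0 * x1 + 3.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2 = b
yhat = X @ b

cov = lambda a, c: np.cov(a, c)[0, 1]   # sample covariance (ddof=1)
var = lambda a: np.var(a, ddof=1)

# numerator: cov(X1, Yhat) = b1*var(X1) + b2*cov(X1, X2)
num = b1 * var(x1) + b2 * cov(x1, x2)

# denominator piece: var(Yhat) = b1^2*var(X1) + b2^2*var(X2) + 2*b1*b2*cov(X1, X2)
vy = b1**2 * var(x1) + b2**2 * var(x2) + 2 * b1 * b2 * cov(x1, x2)

print(num, cov(x1, yhat))
print(vy, var(yhat))
```

Both pairs agree to floating-point precision, confirming the expansions are correct; the difficulty is purely in simplifying them toward cor(X1, Y) / R.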
I don't see how the denominator is going to become R (the square root of R^2), and I definitely don't see where I am going to get cor(X1, Y) from.
Any help, please? I'm so stuck!
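One fact that bears on the stuck point of recovering cor(X1, Y) is a standard least-squares property: with an intercept in the model, the residuals are uncorrelated with every predictor, so cov(X1, Y) = cov(X1, Yhat) + cov(X1, e) = cov(X1, Yhat). A quick numerical illustration (numpy and the simulated data are assumptions, not part of the original post):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)
y = 0.5 + 1.2 * x1 - 0.7 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ b
resid = y - yhat

cov = lambda a, c: np.cov(a, c)[0, 1]

# least-squares residuals are orthogonal to each predictor,
# hence cov(X1, Y) = cov(X1, Yhat) + cov(X1, e) = cov(X1, Yhat)
print(cov(x1, resid))                # ~ 0
print(cov(x1, y), cov(x1, yhat))     # equal up to floating-point error
```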
Try to solve the problem by working with matrices. It becomes easier that way.
Thank you, but this course assumes no knowledge of matrix algebra. My guess is that the solution can be obtained purely from covariance algebra and some properties of regression.
Do you have any insights?
hi,
maybe you could plug the formula for b1 into the equation?
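For reference, the formula this hint points to is the closed-form slope for X1 in a two-predictor regression, b1 = [cov(X1,Y)*var(X2) - cov(X2,Y)*cov(X1,X2)] / [var(X1)*var(X2) - cov(X1,X2)^2]. A hedged sketch checking it against a numerical fit (numpy and the simulated data are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 150
x1 = rng.normal(size=n)
x2 = 0.3 * x1 + rng.normal(size=n)
y = 1.0 - 2.0 * x1 + 0.5 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

cov = lambda a, c: np.cov(a, c)[0, 1]
var = lambda a: np.var(a, ddof=1)

# closed-form b1 for the model Y = b0 + b1*X1 + b2*X2 + e
d = var(x1) * var(x2) - cov(x1, x2) ** 2
b1_formula = (cov(x1, y) * var(x2) - cov(x2, y) * cov(x1, x2)) / d

print(b1_formula, b[1])   # matches the fitted coefficient
```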
regards