so... who else is feelin' the pinch this semester? oh man, it's my last (course-intensive) semester before finishing up my master's (just waiting on the thesis, heh), so i had to cram in 2 seminars in the stats dept and 2 courses in my home dept in education... in any case, best of luck to everyone.

so last friday i got asked a somewhat interesting question that i haven't quite been able to figure out yet... it goes like this:

let's pretend that we have a regression that looks like \(Y_{1} = \beta_{0}+\beta_{1}X+\beta_{2}Z+\beta_{3}W+\epsilon\). now, as usually happens in these cases, the variables are all intercorrelated, so that \(r_{XY}, r_{XZ}, r_{XW}, r_{YZ}, r_{YW}, \ldots\) and, you know, all of those pairwise correlations are nonzero.

say that i now have a reduced regression model that looks like \(Y_{2} = \beta_{0}+\beta_{1}X+\beta_{2}Z+\epsilon\), i.e. the same as the previous one but without one predictor, \(W\) (with the coefficients re-estimated, of course).

the question would then be:

what would be the correlation between the omitted predictor \(W\) and the predicted scores \(\widehat{Y_{2}}\) from the second, reduced model?

i'm having a bit of a hard time because there are a few too many correlations floating around, and i think the algebra is going to get messy if i try to sort it out by re-expressing \(\widehat{Y_{2}}\) in terms of its correlations with \(X\) and \(Z\)...
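actually, here's as far as my back-of-the-envelope attempt gets me (totally unverified, so please check my algebra): if all the variables are standardized, the reduced-model fit is just \(\widehat{Y_{2}} = b_{1}X + b_{2}Z\), with \(b_{1}, b_{2}\) the standardized coefficients, and \(\mathrm{Var}(\widehat{Y_{2}}) = R_{2}^{2}\) (the reduced model's \(R^{2}\)), so i'd guess something like

\[ r_{W\widehat{Y}_{2}} \;=\; \frac{\mathrm{Cov}(W, \widehat{Y_{2}})}{\sqrt{\mathrm{Var}(W)\,\mathrm{Var}(\widehat{Y_{2}})}} \;=\; \frac{b_{1}\,r_{XW} + b_{2}\,r_{ZW}}{R_{2}} \]

where \(R_{2} = \sqrt{R_{2}^{2}}\) is the reduced model's multiple correlation. but again, i haven't checked this carefully, so treat it as a guess.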

oh god, i'm really hoping someone knows a smart matrix algebra trick, or some relationship (maybe through the reduced model's \(R^{2}\)) that simplifies things, before i tackle this in full force...
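in the meantime, here's a quick numpy simulation i threw together to sanity-check whatever formula comes out of this. the covariance matrix and coefficients are just made-up numbers, and it uses the fact that \(\widehat{Y_{2}}\) is a linear combination of \(X\) and \(Z\), so \(\mathrm{Cov}(W,\widehat{Y_{2}}) = b_{1}\mathrm{Cov}(W,X) + b_{2}\mathrm{Cov}(W,Z)\) should hold exactly in-sample:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# made-up correlated predictors X, Z, W plus a toy outcome Y
cov = np.array([[1.0, 0.4, 0.3],
                [0.4, 1.0, 0.5],
                [0.3, 0.5, 1.0]])
X, Z, W = rng.multivariate_normal(np.zeros(3), cov, size=n).T
Y = 1.0 + 0.8 * X + 0.5 * Z + 0.6 * W + rng.normal(size=n)

# fit the reduced model Y ~ X + Z by ordinary least squares
A = np.column_stack([np.ones(n), X, Z])
b = np.linalg.lstsq(A, Y, rcond=None)[0]
Y2_hat = A @ b

# correlation between the omitted predictor and the reduced-model fit
r = np.corrcoef(W, Y2_hat)[0, 1]
print(r)
```

whatever closed-form answer someone comes up with, it should match `r` from this sim (up to sampling noise in the fitted coefficients, anyway).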

thanks to everyone!