Regression using Gram-Schmidt orthogonalization


Ninja say what!?!
Hello dear members,

Hoping again to solicit some help from the knowledgeable members here, I thought I'd come to you with this question.

I'm asked to write a program in C that takes an X and a Y matrix as input and performs Gram-Schmidt orthogonalization to get the beta coefficients, the standard errors of the parameters, the estimated value of \( \sigma ^2\), and the residual value for each observation.

I've gotten the Gram-Schmidt orthogonalization down and have managed to get the beta coefficients. However, I'm unsure about the standard errors. I know that I can get them by inverting the upper triangular matrix (from the orthogonalization procedure), but I haven't been able to verify that they're just the diagonal values of this inverted matrix.

In addition, I don't know how to get \( \sigma ^2\) from this method yet. I'm also told that I should only orthogonalize the first p columns in order to get the residuals (which would be the last column). Not sure what this is referring to though.



Hi fed,

Thanks for responding. I'm not sure what you're referring to though. My understanding is that I can orthogonalize the X matrix, allowing me to determine the beta coefficients with the equation \( R\boldsymbol\beta=Q^TY \), where R is the upper triangular matrix and Q is the orthogonalized X matrix.
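(For what it's worth, that equation drops straight out of the normal equations once you substitute \( X = QR \) and use \( Q^TQ = I \):

\[ X^TX\boldsymbol\beta = X^TY \;\Rightarrow\; R^TQ^TQR\boldsymbol\beta = R^TQ^TY \;\Rightarrow\; R\boldsymbol\beta = Q^TY, \]

where the last step cancels \( R^T \), which is invertible as long as X has full column rank.)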

Could you clarify? Thanks!
Oh, I see what you mean.

The standard errors are the square roots of the diagonal elements of \( (X^TX)^{-1} = (R^TR)^{-1} \), multiplied by the estimated value of sigma (note it's the diagonal of that full inverse, not just the diagonal of \( R^{-1} \)). The estimated value of \( \sigma^2 \) can be calculated as the sum of the squared residuals divided by the number of observations minus the number of parameters estimated, i.e. \( n - p \).

I hope this helps.


LOL. Thanks a lot for the help! I found out about half an hour before getting on here. You seem knowledgeable though and I hope you become an asset here.


Ok...I hope someone can still help me. I'm getting wacky answers for my predicted and residual values, and think that I'm calculating my beta incorrectly.

What I'm doing is orthogonalizing my X matrix into Q (the orthogonalized matrix) and R (the upper triangular matrix).

I'm taking the Q matrix, transposing it, and multiplying it by Y (let's call this Z).

I'm then taking the resulting matrix (Z) and finding each beta by dividing the corresponding element of Z by the diagonal element of R, after subtracting off the products of the remaining R entries with the betas already found.

I think I'm approaching this wrong. Does anyone see a flaw in my logic???
If you have Z, can't you get your betas through back-substitution? That is, with least squares, you're tasked with solving Rb=Z. But R is upper triangular, so you can easily solve this: set the last element of b equal to the last element of Z divided by the (p, p) element of R, and work your way backwards, subtracting off the already-solved betas at each row.

Check out this PDF, under the QR decomposition section: