
Hierarchical regression does test (through an F-test) whether the R squared value increases significantly as you add variables.

R^2 can be defined as \( R^2 = \frac{SSR}{SST} = \frac{\sum (\hat y_i - \bar y)^2}{\sum (y_i - \bar y)^2} \), where SSR is the regression (explained) sum of squares and SST is the total sum of squares. Hence R^2 measures the explained variance relative to the total variance. An F-test compares two models: one restricted, one unrestricted. If you want to test whether R^2 is significantly different from 0, the restriction you want is that all parameters except the constant term equal 0. The reason is that when your regression model contains only a constant, the constant takes on the value of the sample mean, so the predicted value for every observation equals the mean. This means \( R^2 = \frac{\sum (\hat y_i - \bar y)^2}{\sum (y_i - \bar y)^2} = 0 \), because \(\hat y_i = \bar y \) for every observation.
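To see this numerically, here is a small sketch (with made-up data) of the intercept-only case: every prediction equals the sample mean, so the explained sum of squares, and therefore R^2, is exactly 0.

```python
import numpy as np

y = np.array([2.0, 4.0, 6.0, 8.0])  # hypothetical observed outcomes
y_bar = y.mean()

# Intercept-only model: the fitted constant is the sample mean,
# so every predicted value equals y_bar
y_hat = np.full_like(y, y_bar)

ss_reg = np.sum((y_hat - y_bar) ** 2)  # explained sum of squares (SSR)
ss_tot = np.sum((y - y_bar) ** 2)      # total sum of squares (SST)
r_squared = ss_reg / ss_tot
print(r_squared)  # 0.0
```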

To compute the F-statistic in this case, use the formula:

\( F = \frac{R^2/k}{(1-R^2)/(n-k-1)} \), where k is the number of restricted parameters, n is the number of observations, and R^2 comes from the unrestricted model (your value of 0.412).

Use the F distribution F(k, n-k-1) to compute the p-value.
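A minimal sketch of the calculation, using your R^2 = 0.412 but with hypothetical values k = 3 and n = 50 (you would substitute your own number of restricted parameters and observations):

```python
from scipy.stats import f

r2 = 0.412  # R^2 from the unrestricted model
k = 3       # hypothetical: number of restricted parameters
n = 50      # hypothetical: number of observations

# F = (R^2 / k) / ((1 - R^2) / (n - k - 1))
F = (r2 / k) / ((1 - r2) / (n - k - 1))

# p-value from the F(k, n-k-1) distribution (upper tail)
p_value = f.sf(F, k, n - k - 1)
print(F, p_value)
```

With these numbers F is about 10.74, and the p-value is far below 0.05, so the restriction (R^2 = 0) would be rejected.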

