(1) The correlation between two orthogonal predictor variables is:

a. 1.0

b. 0.0

c. 0.5

d. Correlation does not exist.
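As a quick numerical check (a minimal Python sketch, not part of the quiz): for mean-centered vectors, the Pearson correlation is the dot product divided by the product of the norms, so orthogonal predictors have correlation 0.

```python
# Two orthogonal, mean-centered predictors: their dot product is 0,
# so their Pearson correlation is 0 as well.
x1 = [1.0, -1.0, 1.0, -1.0]
x2 = [1.0, 1.0, -1.0, -1.0]

def norm(v):
    return sum(a * a for a in v) ** 0.5

dot = sum(a * b for a, b in zip(x1, x2))   # 0.0 -> orthogonal
corr = dot / (norm(x1) * norm(x2))         # 0.0, since both vectors have mean 0
```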

(2) Consider a regression model with predictor variables X1, X2, and X3. If X1 has a VIF value of 2, then the R-Squared value from regressing X1 on X2 and X3 is:

a. 0.0

b. 0.25

c. 0.75

d. 0.50
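The relationship behind this question can be sketched in a few lines of Python: since VIF_j = 1 / (1 - R_j²), inverting gives R_j² = 1 - 1/VIF_j.

```python
# VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing X_j on the
# other predictors. Inverting gives R_j^2 = 1 - 1/VIF_j.
def r_squared_from_vif(vif):
    return 1.0 - 1.0 / vif

r2 = r_squared_from_vif(2.0)   # 0.5, i.e. answer (d)
```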

(3) How can multicollinearity affect regression models?

a. Unstable regression coefficients, i.e., regression coefficients will change sign as variables are added to or deleted from the model.

b. Estimates of regression coefficients will have large variances.

c. Regression coefficients will show as statistically significant when they should not.

d. Regression coefficients will both be unstable (i.e., change sign as variables are added to or deleted from the model) and have large variances.

(4) Diagnostics for multicollinearity include:

a. The Overall F-Test

b. Variance Inflation Factors

c. The condition index for the X’X matrix

d. Both variance inflation factors and the condition index for the X’X matrix.
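To illustrate the condition-index diagnostic, here is a small Python sketch (using NumPy; the synthetic data and the rule-of-thumb threshold of 30 are illustrative assumptions, not from the quiz):

```python
import numpy as np

# Condition index of X'X: sqrt(largest eigenvalue / smallest eigenvalue).
# Values above roughly 30 are a common rule-of-thumb flag for multicollinearity.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + 0.01 * rng.normal(size=100)   # x2 is nearly collinear with x1
X = np.column_stack([x1, x2])

eigvals = np.linalg.eigvalsh(X.T @ X)   # eigenvalues of X'X, ascending
cond_index = np.sqrt(eigvals.max() / eigvals.min())   # large -> collinearity
```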

(5) Suppose we have 5 variables: X1, X2, X3, X4, and X5 in a data set with 2000 observations. We use the covariance matrix to compute the principal components. How many principal components are there?

a. 2

b. 5

c. 2000

d. 400

(6) Suppose we have 5 variables: X1, X2, X3, X4, and X5 in a data set with 2000 observations. We use the covariance matrix to estimate the common factors. How many common factors are there?

a. 2

b. 5

c. 2000

d. The number of common factors cannot be determined a priori.

(7) Given the variables X1, X2, X3, X4, and X5, suppose the eigenvector associated with the largest eigenvalue is (0.5, 0, -0.2, 0, 0.7). How do we compute the first principal component?

a. 0.2*(X1 + X2 + X3 + X4 + X5)

b. 0.5*X5 – 0.2*X3 + 0.7*X1

c. 0.2*(X1 + X2 + X3 + X4 + X5)/5

d. 0.5*X1 – 0.2*X3 + 0.7*X5

(8) If the sum of the eigenvalues is 90 and the second eigenvalue is 15, how much of the variance is explained by the second principal component?

a. 85.0%

b. 16.7%

c. 15.0%

d. 12.5%
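The arithmetic behind this question, as a one-line Python check: the proportion of variance explained by a component is its eigenvalue divided by the sum of all eigenvalues.

```python
# Proportion of variance explained by a principal component:
# its eigenvalue divided by the sum of all eigenvalues.
def variance_explained(eigenvalue, total):
    return eigenvalue / total

pct = 100 * variance_explained(15, 90)   # 16.666... -> about 16.7%, answer (b)
```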

(9) Similarities between factor analysis and principal components analysis include:

a. Both aim to reduce the dimensionality of the data.

b. Both are estimated with the assumption of an underlying statistical model.

c. Both try to explain the correlations between the predictor variables.

d. Both are not useful when the predictor variables are uncorrelated.

e. Both aim to reduce the dimensionality of the data and are not useful when the predictor variables are uncorrelated.

(10) Rotations are used in factor analysis to:

a. Improve the model fit.

b. Improve the interpretability of the model.

c. Change the number of common factors to include in the model.

d. Increase the variance explained by the model.

Attempt:

1) b

2) d

3) d

4) d

5) b

6) d

7) d

8) b

9) e

10) b