Multicollinearity: What Happens if the Regressors Are Correlated?

You should know the formulas:

\[
\mathrm{VIF} = \frac{1}{1 - r_{23}^{2}} \tag{8}
\]

\[
\operatorname{var}(\hat{\beta}_2) = \frac{\sigma^{2}}{\sum x_{2i}^{2}}\,\mathrm{VIF} \tag{9}
\]

\[
\operatorname{var}(\hat{\beta}_3) = \frac{\sigma^{2}}{\sum x_{3i}^{2}}\,\mathrm{VIF} \tag{10}
\]
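The formulas above can be checked numerically. The sketch below uses simulated (hypothetical) data for two correlated regressors, computes the VIF from their sample correlation as in Eq. (8), and scales the usual OLS variance term as in Eq. (9); the error variance sigma2 is an assumed value, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)          # simulated, illustrative data
n = 200
x2 = rng.normal(size=n)
# build x3 so that it is strongly correlated with x2
x3 = 0.9 * x2 + np.sqrt(1 - 0.9**2) * rng.normal(size=n)

r23 = np.corrcoef(x2, x3)[0, 1]         # sample correlation of the regressors
vif = 1.0 / (1.0 - r23**2)              # Eq. (8)

sigma2 = 1.0                            # assumed error variance (hypothetical)
x2_dev = x2 - x2.mean()                 # deviations from the mean
var_b2 = sigma2 / np.sum(x2_dev**2) * vif   # Eq. (9)

print(f"r23 = {r23:.3f}, VIF = {vif:.2f}, var(beta2_hat) = {var_b2:.5f}")
```

With uncorrelated regressors the VIF equals 1; as \(r_{23}^{2}\) approaches 1 the VIF, and hence the coefficient variance, grows without bound.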

1. Why does the classical linear model assume that there is no multicollinearity among the explanatory variables? What is the consequence of perfect multicollinearity for the regression coefficients and their standard errors? Of less than perfect multicollinearity?

2. What are the practical consequences of high multicollinearity?

3. What is the Variance-Inflating Factor (VIF)? What does the Variance-Inflating Factor show? How is the Variance-Inflating Factor used to detect multicollinearity?

4. Give three ways to detect multicollinearity.
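Three common diagnostics can be sketched in a few lines: the pairwise correlation between regressors, the VIF of Eq. (8), and the condition number of the standardized regressor matrix. The data below are simulated for illustration; the thresholds one applies in practice (e.g. VIF above 10) are rules of thumb, not part of this sheet.

```python
import numpy as np

rng = np.random.default_rng(3)          # simulated, illustrative data
n = 200
x2 = rng.normal(size=n)
x3 = 0.95 * x2 + 0.3 * rng.normal(size=n)   # nearly collinear regressor

# (a) pairwise correlation between the regressors
r = np.corrcoef(x2, x3)[0, 1]

# (b) VIF from that correlation (two-regressor case, Eq. (8))
vif = 1.0 / (1.0 - r**2)

# (c) condition number of the standardized regressor matrix
X = np.column_stack([x2, x3])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
cond = np.linalg.cond(Xs)

print(f"corr = {r:.3f}, VIF = {vif:.2f}, condition number = {cond:.2f}")
```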

5. Explain how transformation of variables can sometimes address the problem of multicollinearity. Give two examples of possible transformations.

Heteroscedasticity: What Happens if the Error Variance Is Nonconstant?

1. Illustrate the nature of homoscedasticity and heteroscedasticity in two diagrams.

2. Give three (out of seven) reasons for heteroscedasticity. Briefly explain.

3. What happens to ordinary least squares estimators and their variances in the presence of heteroscedasticity?
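The key result behind question 3 can be demonstrated by simulation: with an error standard deviation that grows with the regressor, the OLS slope estimator remains unbiased, even though the usual homoscedastic variance formula for it is no longer valid. All data below are simulated for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)          # simulated, illustrative data
n, reps = 100, 2000
x = np.linspace(1, 10, n)
true_beta = 2.0

slopes = np.empty(reps)
for i in range(reps):
    u = rng.normal(scale=x)             # error s.d. grows with x: heteroscedastic
    y = 1.0 + true_beta * x + u
    slopes[i] = np.polyfit(x, y, 1)[0]  # OLS slope estimate

# the average slope is still close to the true value (unbiasedness),
# but the usual sigma^2 / sum(x_i - x_bar)^2 formula no longer gives
# the correct sampling variance of the estimator
print(f"mean slope over {reps} replications = {slopes.mean():.3f}")
```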

September 16, 2022