Lecture 6: Estimating the variance of errors

Economics 326 — Methods of Empirical Research in Economics

Author

Vadim Marmer, UBC

The importance of \sigma^2

  • The variance of \hat{\beta} depends on the unknown \sigma^{2} = \mathrm{E}\left[U_i^{2} \mid \mathbf{X}\right]: \mathrm{Var}\left(\hat{\beta} \mid \mathbf{X}\right) = \frac{\sigma^{2}}{\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right)^{2}}.
  • If the errors U_i were observable, we could estimate \sigma^{2} by \frac{1}{n}\sum_{i=1}^{n}U_{i}^{2}, which is unbiased. This is infeasible because the U_i are unobservable.
  • Using the sample residuals instead, \hat{U}_{i}=Y_{i}-\hat{\alpha}-\hat{\beta}X_{i}, gives a feasible estimator: \hat{\sigma}^{2}=\frac{1}{n}\sum_{i=1}^{n}\hat{U}_{i}^{2}.
  • However, \hat{\sigma}^{2} is biased: as shown below, \mathrm{E}\left[\hat{\sigma}^{2}\mid \mathbf{X}\right]=\frac{n-2}{n}\sigma^{2}.

An unbiased estimator of \sigma^2

  • An unbiased estimator of \sigma^{2} is s^{2}=\frac{1}{n-2}\sum_{i=1}^{n}\hat{U}_{i}^{2}.
  • Assumptions:
    1. Y_{i}=\alpha +\beta X_{i}+U_{i}.
    2. \mathrm{E}\left[U_{i}\mid \mathbf{X}\right] =0 for all i.
    3. \mathrm{E}\left[U_{i}^{2}\mid \mathbf{X}\right] =\sigma ^{2} for all i.
    4. \mathrm{E}\left[U_{i}U_{j}\mid \mathbf{X}\right] =0 for all i\neq j.
  • Because the residuals \hat{U}_{i}=Y_{i}-\hat{\alpha}-\hat{\beta}X_{i} depend on two estimated parameters, \hat{\alpha} and \hat{\beta}, dividing by n-2 rather than n corrects for the degrees of freedom lost in estimation; a simulated comparison of \hat{\sigma}^{2} and s^{2} follows.
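  • A minimal simulated check (hypothetical data-generating process; the choices n = 50, \alpha = 1, \beta = 0.5, and \sigma^{2} = 4 are illustrative): compute \hat{\sigma}^{2} and s^{2} from the same residuals.
set.seed(326)
n <- 50
x <- rnorm(n)
u <- rnorm(n, sd = 2)    # errors with true sigma^2 = 4 (assumed for illustration)
y <- 1 + 0.5 * x + u     # hypothetical alpha = 1, beta = 0.5
fit <- lm(y ~ x)
uhat <- resid(fit)
sum(uhat^2) / n          # biased estimator sigma-hat^2
sum(uhat^2) / (n - 2)    # unbiased estimator s^2 (equals summary(fit)$sigma^2)
  • In a single sample neither estimate equals \sigma^{2}=4; unbiasedness of s^{2} is a statement about its average over repeated samples (see the simulation at the end of the lecture).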

Expressing \hat{U}_i in terms of U_i

  • \hat{U}_{i}=Y_{i}-\hat{\alpha}-\hat{\beta}X_{i}

  • \hat{\alpha}=\bar{Y}-\hat{\beta}\bar{X}, so \hat{U}_{i} = \left( Y_{i}-\bar{Y}\right) -\hat{\beta}\left( X_{i}-\bar{X}\right)

  • Also, averaging Y_{i}=\alpha +\beta X_{i}+U_{i} over i gives \bar{Y}=\alpha +\beta \bar{X}+\bar{U}, so Y_{i}-\bar{Y} = \beta \left( X_{i}-\bar{X}\right) +U_{i}-\bar{U}.

  • We have the following relationship between \hat{U}_i and U_i: \hat{U}_{i}=U_{i}-\bar{U} -\left( \hat{\beta}-\beta \right)\left( X_{i}-\bar{X}\right).

  • \hat U_i is related to U_i, but it is contaminated by estimation error: the term \bar{U} (absorbed into \hat{\alpha}) and the sampling error \hat{\beta}-\beta.
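  • The relationship is easy to verify numerically. A sketch with simulated data (all parameter values are arbitrary illustrations):
set.seed(326)
n <- 20
beta <- 0.5
x <- rnorm(n)
u <- rnorm(n)
y <- 1 + beta * x + u
fit <- lm(y ~ x)
beta_hat <- unname(coef(fit)[2])
# residuals reconstructed from the true errors via the identity above
uhat_identity <- (u - mean(u)) - (beta_hat - beta) * (x - mean(x))
all.equal(unname(resid(fit)), uhat_identity)   # TRUE: identical up to rounding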

Expanding \sum_{i=1}^{n}\hat{U}_i^2

  • From the previous slide: \hat{U}_{i} = U_{i}-\bar{U} -\left( \hat{\beta}-\beta \right)\left( X_{i}-\bar{X}\right)

  • The squared residual is: \begin{align*} \hat{U}_{i}^{2} &= \left( U_{i}-\bar{U}\right) ^{2}+\left( \hat{\beta}-\beta \right)^{2}\left( X_{i}-\bar{X}\right)^{2} \\ &\quad -2\left( \hat{\beta}-\beta \right) \left( X_{i}-\bar{X}\right) \left( U_{i}-\bar{U}\right). \end{align*}

  • Thus, \begin{align*} \sum_{i=1}^{n}\hat{U}_{i}^{2} &=\sum_{i=1}^{n}\left( U_{i}-\bar{U}\right) ^{2}\\ &\quad+\left( \hat{\beta}-\beta \right) ^{2}\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) ^{2} \\ &\quad-2\left( \hat{\beta}-\beta \right) \sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) \left( U_{i}-\bar{U}\right). \end{align*}

  • To show \mathrm{E}\left[\sum_{i=1}^{n}\hat{U}_{i}^{2}\mid \mathbf{X}\right]=\left( n-2\right) \sigma ^{2}, we evaluate the conditional expectation of each of the three terms on the RHS:

  • Claim 1: \quad\mathrm{E}\left[\sum_{i=1}^{n}\left( U_{i}-\bar{U}\right) ^{2}\mid \mathbf{X}\right]=\left( n-1\right) \sigma ^{2}.

  • Claim 2: \quad\mathrm{E}\left[\left( \hat{\beta}-\beta \right) ^{2}\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) ^{2}\mid \mathbf{X}\right]=\sigma ^{2}.

  • Claim 3: \quad-2\cdot \mathrm{E}\left[\left( \hat{\beta}-\beta \right) \sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) \left( U_{i}-\bar{U}\right)\mid \mathbf{X}\right]=-2 \cdot\sigma ^{2}.
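  • Before proving the claims, the decomposition itself can be confirmed on simulated data (a sketch reusing the hypothetical setup above):
set.seed(326)
n <- 20
beta <- 0.5
x <- rnorm(n)
u <- rnorm(n)
y <- 1 + beta * x + u
fit <- lm(y ~ x)
beta_hat <- unname(coef(fit)[2])
term1 <- sum((u - mean(u))^2)
term2 <- (beta_hat - beta)^2 * sum((x - mean(x))^2)
term3 <- -2 * (beta_hat - beta) * sum((x - mean(x)) * (u - mean(u)))
all.equal(sum(resid(fit)^2), term1 + term2 + term3)   # TRUE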

Estimating the variance of \hat{\beta}

  • Variance of \hat{\beta} conditional on \mathbf{X}: \mathrm{Var}\left(\hat{\beta}\mid \mathbf{X}\right) =\frac{\sigma ^{2}}{\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) ^{2}}.
  • Estimator of \sigma ^{2}: s^{2}=\frac{1}{n-2}\sum_{i=1}^{n}\hat{U}_{i}^{2}=\frac{1}{n-2}\sum_{i=1}^{n}\left( Y_{i}-\hat{\alpha}-\hat{\beta}X_{i}\right) ^{2}.
  • Estimator of the variance of \hat{\beta}: \widehat{\mathrm{Var}}\left( \hat{\beta}\right) =\frac{s^{2}}{\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) ^{2}}.
  • Standard error of \hat{\beta}: \mathrm{SE}\left( \hat{\beta}\right) =\sqrt{\widehat{\mathrm{Var}}\left( \hat{\beta}\right)}=\sqrt{\frac{s^{2}}{\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) ^{2}}}.

Example in R

  • Regress hourly wage on years of education using the wage1 dataset from Wooldridge:
library(wooldridge)
data("wage1")
fit <- lm(wage ~ educ, data = wage1)
summary(fit)

Call:
lm(formula = wage ~ educ, data = wage1)

Residuals:
    Min      1Q  Median      3Q     Max 
-5.3396 -2.1501 -0.9674  1.1921 16.6085 

Coefficients:
            Estimate Std. Error t value Pr(>|t|)    
(Intercept) -0.90485    0.68497  -1.321    0.187    
educ         0.54136    0.05325  10.167   <2e-16 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 3.378 on 524 degrees of freedom
Multiple R-squared:  0.1648,    Adjusted R-squared:  0.1632 
F-statistic: 103.4 on 1 and 524 DF,  p-value: < 2.2e-16
  • Std. Error column: standard errors \mathrm{SE}(\hat{\alpha}) and \mathrm{SE}(\hat{\beta})
  • Residual standard error: s = \sqrt{s^2}; so s^2 = 3.378^2 \approx 11.41
  • The estimate s^2 can also be extracted directly:
summary(fit)$sigma^2
[1] 11.41352
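  • The reported quantities can be reproduced from the formulas above (a sketch using the fitted object fit and the educ variable from wage1):
uhat <- resid(fit)
n <- length(uhat)
s2 <- sum(uhat^2) / (n - 2)    # matches summary(fit)$sigma^2
se_beta <- sqrt(s2 / sum((wage1$educ - mean(wage1$educ))^2))
c(s2 = s2, se_beta = se_beta)  # se_beta matches the 0.05325 reported above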

Proof of Claim 1

\begin{align*} \sum_{i=1}^{n}\left( U_{i}-\bar{U}\right) ^{2} &= \sum_{i=1}^{n}U_{i}^{2}-\frac{1}{n}\left( \sum_{i=1}^{n}U_{i}\right) ^{2} \\ &= \sum_{i=1}^{n}U_{i}^{2}-\frac{1}{n}\left( \sum_{i=1}^{n}U_{i}^{2}+\sum_{i=1}^{n}\sum_{j\neq i}U_{i}U_{j}\right). \end{align*}

Taking conditional expectations, Assumption 3 gives \mathrm{E}\left[U_{i}^{2}\mid \mathbf{X}\right]=\sigma^{2}, and by Assumption 4 the cross terms vanish, \mathrm{E}\left[U_{i}U_{j}\mid \mathbf{X}\right]=0 for i\neq j. Hence,

\begin{align*} \mathrm{E}\left[\sum_{i=1}^{n}\left( U_{i}-\bar{U}\right) ^{2}\mid \mathbf{X}\right] &= n\sigma ^{2}-\frac{1}{n}\cdot n\sigma ^{2} \\ &= \left( n-1\right) \sigma ^{2}. \end{align*}

Proof of Claim 2

Because \mathrm{E}\left[\hat{\beta}\mid \mathbf{X}\right]=\beta, \mathrm{E}\left[\left( \hat{\beta}-\beta \right) ^{2}\mid \mathbf{X}\right]=\mathrm{Var}\left(\hat{\beta}\mid \mathbf{X}\right)=\frac{\sigma ^{2}}{\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) ^{2}}. Since \sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) ^{2} is a function of \mathbf{X} alone, it factors out of the conditional expectation, and \mathrm{E}\left[\left( \hat{\beta}-\beta \right) ^{2}\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) ^{2}\mid \mathbf{X}\right]=\sigma^{2}.

Proof of Claim 3

Note that \sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) \left( U_{i}-\bar{U}\right) =\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) U_{i}, since \sum_{i=1}^{n}\left( X_{i}-\bar{X}\right)\bar{U}=\bar{U}\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right)=0, and \hat{\beta}-\beta =\frac{\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) U_{i}}{\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) ^{2}}. Therefore,

\begin{align*} \left( \hat{\beta}-\beta \right) \sum_{i=1}^{n}\left( X_{i}-\bar{X}\right)\left( U_{i}-\bar{U}\right) &=\frac{1}{\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) ^{2}}\left( \sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) U_{i}\right) ^{2}. \end{align*}

By Assumptions 3 and 4, \mathrm{E}\left[\left( \sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) U_{i}\right)^{2}\mid \mathbf{X}\right]=\sigma^{2}\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right)^{2}, so, conditionally on \mathbf{X},

\begin{align*} &\mathrm{E}\left[\left( \hat{\beta}-\beta \right) \sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) \left( U_{i}-\bar{U}\right)\mid \mathbf{X}\right] \\ &\quad = \frac{\sigma ^{2}\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) ^{2}}{\sum_{i=1}^{n}\left( X_{i}-\bar{X}\right) ^{2}}=\sigma ^{2}. \end{align*}
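The three conditional expectations can also be checked by simulation: hold one draw of \mathbf{X} fixed and average over repeated draws of the errors (a sketch; the values of n, \beta, and \sigma^{2} are arbitrary choices):
set.seed(326)
n <- 30
beta <- 0.5
sigma2 <- 4
x <- rnorm(n)                  # held fixed across replications
sxx <- sum((x - mean(x))^2)
reps <- replicate(10000, {
  u <- rnorm(n, sd = sqrt(sigma2))
  y <- 1 + beta * x + u
  b <- unname(coef(lm(y ~ x))[2])
  c(claim1 = sum((u - mean(u))^2),
    claim2 = (b - beta)^2 * sxx,
    claim3 = (b - beta) * sum((x - mean(x)) * (u - mean(u))))
})
rowMeans(reps)   # approx (n-1)*sigma2 = 116, sigma2 = 4, sigma2 = 4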

Putting it together

Combining Claims 1-3, \mathrm{E}\left[\sum_{i=1}^{n}\hat{U}_{i}^{2}\mid \mathbf{X}\right]=\left( n-1\right) \sigma ^{2}+\sigma^{2}-2\sigma ^{2}=\left( n-2\right) \sigma ^{2}, so \mathrm{E}\left[s^{2}\mid \mathbf{X}\right]=\sigma^{2}: s^{2} is unbiased for \sigma^{2}.
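As a final check, a short simulation (same hypothetical setup as before) shows that across repeated samples s^{2} averages to \sigma^{2} while \hat{\sigma}^{2} averages to \frac{n-2}{n}\sigma^{2}:
set.seed(326)
n <- 30
sigma2 <- 4
x <- rnorm(n)
est <- replicate(10000, {
  y <- 1 + 0.5 * x + rnorm(n, sd = sqrt(sigma2))
  rss <- sum(resid(lm(y ~ x))^2)
  c(sigma2_hat = rss / n, s2 = rss / (n - 2))
})
rowMeans(est)   # approx (n-2)/n * 4 = 3.733 and 4, respectively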