Announcements
1/24 - Please check Weekly Schedule for the most up-to-date information on office hours.
1/16 - Textbook has been released! Reload the page if the math doesn't render properly.

Week 14 Preparation Guide

Reading

Required:

  • Textbook Chapters 24 and 25

Recommended:

  • Pitman Section 6.4. Focus on regression.
  • Pitman’s text doesn’t cover random vectors or the multivariate normal distribution. You might like this summary from Prof. Ingo Ruczinski of the Johns Hopkins Biostatistics department.

Practice Problems

Pitman x.y.z means Exercise z of Section x.y, and x.rev.z means Exercise z of the Review Exercises at the end of Chapter x.

  • Pitman 6.5.1, 6.5.3, 6.5.9, 6.5.12
  • Let $\mathbf{X}$ be an $n \times 1$ random vector and suppose we are trying to predict a random variable $Y$ by a linear function of $\mathbf{X}$. We identified the least squares linear predictor by restricting our search to linear functions of $\mathbf{X}$ that were unbiased for $Y$. Show that this was a legitimate move. Specifically, let $\hat{Y}_1 = \mathbf{c}^T \mathbf{X} + d$ be a biased predictor, so that $E(\hat{Y}_1) \ne \mu_Y$. Find a non-zero constant $k$ such that $\hat{Y}_2 = \hat{Y}_1 + k$ is unbiased, and show that $MSE(\hat{Y}_1) > MSE(\hat{Y}_2)$. This shows that the least squares linear predictor must be unbiased.
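If you want a numerical sanity check before writing the proof, the claim is easy to see in simulation. The sketch below uses an entirely hypothetical setup (the dimensions, coefficients, and noise level are made up for illustration): it builds a linear predictor with the wrong intercept, shifts it by $k = \mu_Y - E(\hat{Y}_1)$ to remove the bias, and compares the two mean squared errors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: X is 2-dimensional and Y is linear in X plus noise.
n = 100_000
X = rng.normal(size=(n, 2))
Y = 3.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(size=n)

# A deliberately biased linear predictor: correct slopes, wrong intercept.
c = np.array([3.0, -1.0])
d = 2.5
Y1 = X @ c + d

# Bias-correcting shift k = E(Y) - E(Y1), estimated from the sample.
k = Y.mean() - Y1.mean()
Y2 = Y1 + k

mse1 = np.mean((Y1 - Y) ** 2)  # MSE of the biased predictor
mse2 = np.mean((Y2 - Y) ** 2)  # MSE after the shift

print(mse1, mse2, k)
```

Running this, `mse1` exceeds `mse2`, and in fact the two differ by exactly $k^2$ in the sample, which is the decomposition the proof should produce.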

Discussion Section

  • Pitman 6.5.2 and 6.5.12, the unbiased linear predictor exercise above, and wrap-up