Material for Midterm 2

A. Adhikari

The exam covers material through Chapter 17 of the textbook, that is, through the lecture on Tuesday 3/12.

Note that the new content since Midterm 1 is in Chapters 9 through 17. However, it is not possible to understand that material without first understanding Chapters 1 through 8.

General Concepts and Methods

Probability

  • Chapter 1, Lab 1: Spaces, events, basic counting, exponential approximation
  • Chapter 2: Addition and multiplication rules; conditioning and updating
  • Chapter 5: Unions and intersections of several events
  • Section 9.1: Probabilities by conditioning and recursion (discrete)
  • Sections 4.1, 17.1: From joint distributions
  • Sections 4.5, 17.2: Dependence and independence
  • Sections 5.1, 12.3: Bounds – Boole, Markov, Chebyshev
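
For quick reference, the three bounds in the last item above are, in standard notation (which may differ slightly from the text's): Boole's inequality for events \(A_1, \dots, A_n\); Markov's inequality for a non-negative random variable \(X\) and a constant \(c > 0\); and Chebyshev's inequality for a random variable \(X\) with mean \(\mu_X\) and SD \(\sigma_X\).

\[
P\Big(\bigcup_{i=1}^{n} A_i\Big) \le \sum_{i=1}^{n} P(A_i), \qquad
P(X \ge c) \le \frac{E(X)}{c}, \qquad
P\big(|X - \mu_X| \ge c\,\sigma_X\big) \le \frac{1}{c^2}.
\]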

Distribution

  • Chapter 3: Intro; equality versus equality in distribution
  • Chapter 4, Lab 2: Joint, marginals, conditionals, independence (discrete case), total variation distance
  • Sections 5.3, 5.4: Random permutations and symmetry
  • Sections 15.1, 15.2, Lab 6: Density
  • Sections 6.1, 15.1, 16.3, Lab 6: CDF and inverse CDF
  • Chapter 16, Lab 6: Density of a transformation (see the reminder after this list)
  • Chapter 17: Joint, marginal, and conditional densities; independence
  • Chapter 14: Distribution of a sum
  • Chapter 14, Section 15.3: Central Limit Theorem
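
As the reminder promised above, two relations worth keeping at hand from the density chapters (standard statements; the text's conventions may differ slightly): the CDF is the integral of the density, and for a smooth increasing function \(g\), the density of \(Y = g(X)\) follows by change of variable.

\[
F(x) = \int_{-\infty}^{x} f(t)\,dt, \qquad f(x) = F'(x), \qquad
f_Y(y) = f_X\big(g^{-1}(y)\big)\,\frac{d}{dy}\,g^{-1}(y).
\]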

Expectation

  • Chapter 8, Lab 3: The crucial properties (discrete case), including the method of indicators, expectations of functions, and the tail sum formula (see also the geometric distribution; key formulas below)
  • Sections 9.2, 9.3: Expectation by conditioning
  • Sections 15.3, 17.1: Expectation using densities and joint densities
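
The key formulas referred to above, in standard notation: the method of indicators for a count \(X = I_1 + \cdots + I_n\), and the tail sum formula for a random variable \(X\) with non-negative integer values.

\[
E(X) = \sum_{j=1}^{n} P(I_j = 1), \qquad
E(X) = \sum_{k=1}^{\infty} P(X \ge k).
\]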

Variance

  • Chapter 12: Definition and basic properties; linear transformations
  • Chapter 13, Lab 5: Covariance; variance of a sum (including dependent indicators and simple random sample sums)
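
In standard notation, the main variance facts from these chapters are the effect of a linear transformation and the variance of a sum.

\[
Var(aX + b) = a^2\,Var(X), \qquad
Var\Big(\sum_{i=1}^{n} X_i\Big) = \sum_{i=1}^{n} Var(X_i) + 2 \sum_{i < j} Cov(X_i, X_j).
\]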

Estimation and Prediction

  • Section 8.4: Unbiased estimators
  • Sections 14.5, 14.6: IID sample mean; confidence interval for the population mean
  • Homework 7: Unbiased estimator of the population variance (see below)
  • Section 12.2: Expectation as a least squares predictor
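
Two standard facts behind the items above, stated here only as reminders (see Homework 7 and Section 12.2 for the derivations): for an i.i.d. sample \(X_1, \dots, X_n\) with variance \(\sigma^2\), the sample variance with divisor \(n - 1\) is unbiased, and over all constants \(c\), the mean squared error \(E\big((X - c)^2\big)\) is smallest at \(c = E(X)\).

\[
E\Big(\frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X})^2\Big) = \sigma^2.
\]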

Models and Special Distributions

Markov Chains

  • Sections 10.1, 10.2: Terminology and basics
  • Sections 10.3, 10.4: The steady state distribution and its properties (see the sketch after this list)
  • Section 11.1: Balance and detailed balance
  • Sections 11.2, 11.3, Lab 4: Code Breaking and MCMC
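
As the sketch promised above (an illustration only, not part of the assigned material; the transition matrix below is made up), the steady state distribution \(\pi\) solves the balance equations \(\pi P = \pi\) with \(\sum_i \pi(i) = 1\), and it can be found numerically:

    import numpy as np

    # Hypothetical 3-state transition matrix; each row sums to 1.
    P = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.4, 0.4]])

    # Balance: pi P = pi.  Solve (P^T - I) pi = 0 together with sum(pi) = 1.
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones((1, n))])
    b = np.append(np.zeros(n), 1.0)
    pi = np.linalg.lstsq(A, b, rcond=None)[0]

    print(pi)        # the steady state distribution
    print(pi @ P)    # equals pi, confirming balance

Detailed balance is the stronger condition \(\pi(i)P(i, j) = \pi(j)P(j, i)\) for all states \(i\) and \(j\); it implies balance but not conversely.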

Random Counts

Uniform \((a, b)\)

  • Section 15.3: Density, expectation, variance, CDF
  • Section 16.3, Lab 6: Use in simulation
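
A minimal sketch of the simulation idea (illustrative only; the exponential target and the rate \(\lambda = 2\) are arbitrary choices): if \(U\) is uniform on \((0, 1)\) and \(F\) is a CDF with inverse \(F^{-1}\), then \(F^{-1}(U)\) has CDF \(F\).

    import numpy as np

    rng = np.random.default_rng(0)
    lam = 2.0                        # arbitrary rate for the illustration

    # Exponential (rate lam) has CDF F(x) = 1 - exp(-lam * x),
    # so its inverse CDF is F_inv(u) = -log(1 - u) / lam.
    u = rng.uniform(0, 1, size=100000)
    x = -np.log(1 - u) / lam

    print(x.mean())    # close to 1/lam = 0.5
    print(x.var())     # close to 1/lam**2 = 0.25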

Beta

  • Section 17.4: Integer parameters; uniform order statistics
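
Standard forms for the integer-parameter case: for positive integers \(r\) and \(s\), the beta \((r, s)\) density on \((0, 1)\) is given below, and the \(k\)th order statistic of \(n\) i.i.d. uniform \((0, 1)\) random variables has the beta \((k, n-k+1)\) distribution.

\[
f(x) = \frac{(r+s-1)!}{(r-1)!\,(s-1)!}\, x^{r-1} (1-x)^{s-1}, \qquad 0 < x < 1.
\]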

Normal

  • Sections 14.3, 14.4: CLT (statement below); Normal CDF and inverse CDF
  • Section 14.6: Normal confidence intervals
  • Section 16.1: Normal densities
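
For reference, the normal \((\mu, \sigma^2)\) density and the CLT statement referred to above: if \(S_n\) is the sum of \(n\) i.i.d. random variables, each with mean \(\mu\) and SD \(\sigma\), then for large \(n\) the standardized sum is approximately standard normal.

\[
f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2 / (2\sigma^2)}, \qquad
\frac{S_n - n\mu}{\sigma\sqrt{n}} \;\approx\; \text{standard normal}.
\]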

Gamma

  • Sections 15.4, 16.1: Exponential and scaling
  • Homework 8: Gamma function, gamma density, mean, variance
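
Standard forms for reference (see the homework item above for the derivations): the exponential \((\lambda)\) density and the gamma \((r, \lambda)\) density, the latter with mean \(r/\lambda\) and variance \(r/\lambda^2\).

\[
f(x) = \lambda e^{-\lambda x}, \quad x > 0; \qquad
f(x) = \frac{\lambda^{r}}{\Gamma(r)}\, x^{r-1} e^{-\lambda x}, \quad x > 0.
\]

Scaling: if \(X\) has the gamma \((r, \lambda)\) distribution and \(c > 0\), then \(cX\) has the gamma \((r, \lambda/c)\) distribution; the exponential \((\lambda)\) is the special case \(r = 1\).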