## Estimation Crash Course IV: Sufficient Statistics

Sufficient statistics

The Cramér-Rao bound is a lower bound on the variance of any unbiased estimator. Here we focus on one-parameter models.
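As a concrete instance of the bound (an illustrative sketch; the Bernoulli model and all numbers are choices made here, not taken from the post), the Fisher information of a Bernoulli(p) observation is 1/(p(1 − p)), so the Cramér-Rao bound for n observations is p(1 − p)/n, and the sample mean is an unbiased estimator that attains it:

```python
import numpy as np

# Bernoulli(p) model: Fisher information I(p) = 1 / (p (1 - p)),
# so the Cramér-Rao bound for an unbiased estimator from n samples
# is p (1 - p) / n. The sample mean is unbiased and attains it.
rng = np.random.default_rng(0)
p, n, trials = 0.3, 100, 20_000
samples = rng.binomial(1, p, size=(trials, n))
estimates = samples.mean(axis=1)        # sample-mean estimator of p
empirical_var = estimates.var()
crb = p * (1 - p) / n                   # Cramér-Rao lower bound
print(empirical_var, crb)               # the two nearly agree
```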

An introduction to the notion of Fisher information, a quantity of central importance in statistics.
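One way to make the definition concrete (a hedged sketch; the Gaussian location model is an example chosen here): the Fisher information equals the variance of the score, the derivative of the log-likelihood in the parameter. For the N(θ, 1) model the score is x − θ, so I(θ) = 1:

```python
import numpy as np

# Fisher information as the variance of the score. For the N(theta, 1)
# model, d/dtheta log p(x; theta) = x - theta, so I(theta) = 1.
rng = np.random.default_rng(1)
theta = 2.0
x = rng.normal(theta, 1.0, size=200_000)
score = x - theta                 # score of the Gaussian location model
fisher_info = score.var()
print(fisher_info)                # close to 1, the Fisher information
```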

Statistics and Estimators: definitions of statistics and an introduction to the concepts of bias and variance of an estimator, with several examples.
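A standard illustration of bias (an example chosen here, not necessarily one from the post): the 1/n sample variance is biased downward, with expectation (n − 1)/n · σ², while the 1/(n − 1) version is unbiased:

```python
import numpy as np

# Bias of the 1/n sample variance versus the unbiased 1/(n-1) version.
# With sigma^2 = 4 and n = 10, E[1/n version] = (n-1)/n * 4 = 3.6.
rng = np.random.default_rng(2)
sigma2, n, trials = 4.0, 10, 100_000
data = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))
biased_mean = data.var(axis=1, ddof=0).mean()    # averages near 3.6
unbiased_mean = data.var(axis=1, ddof=1).mean()  # averages near 4.0
print(biased_mean, unbiased_mean)
```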

We study the class of sub-Gaussian random variables: those random variables whose tails are dominated by a Gaussian. Such random variables satisfy Hoeffding-type bounds and possess several interesting properties. We also define the sub-Gaussian norm and study its properties.
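A minimal numerical check of a Hoeffding-type bound (the setup is an assumption made here: Rademacher signs, which are sub-Gaussian with parameter 1): for an average of n independent ±1 signs, P(|mean| ≥ t) ≤ 2 exp(−n t²/2):

```python
import numpy as np

# Hoeffding-type tail bound for an average of n Rademacher signs:
# P(|mean| >= t) <= 2 exp(-n t^2 / 2).
rng = np.random.default_rng(3)
n, trials, t = 100, 50_000, 0.3
signs = rng.choice([-1.0, 1.0], size=(trials, n))
empirical = (np.abs(signs.mean(axis=1)) >= t).mean()
bound = 2 * np.exp(-n * t**2 / 2)
print(empirical, bound)           # empirical tail sits below the bound
```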

A result on the convergence of the sample mean, with notes on some standard concentration inequalities such as the Markov, Chernoff, and Hoeffding bounds.
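To illustrate the convergence statement (a sketch; the exponential distribution and Chebyshev's inequality, itself a consequence of Markov's, are choices made here): P(|X̄ₙ − μ| ≥ ε) ≤ σ²/(n ε²), so the tail probability shrinks as n grows:

```python
import numpy as np

# Chebyshev bound on the sample mean:
# P(|mean - mu| >= eps) <= sigma^2 / (n eps^2).
# Exponential(1) samples have mean 1 and variance 1.
rng = np.random.default_rng(4)
mu, sigma2, eps = 1.0, 1.0, 0.2
tails = []
for n in (10, 100, 1000):
    x = rng.exponential(mu, size=(20_000, n))
    tails.append((np.abs(x.mean(axis=1) - mu) >= eps).mean())
    print(n, tails[-1], sigma2 / (n * eps**2))   # tail shrinks with n
```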

We derive a useful formula for the conditional expectation of jointly normally distributed random variables; this result plays a central role in the development of the Kalman filter.
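The conditioning formula can be checked numerically on a small example (the two-dimensional covariance numbers below are illustrative choices): for (X, Y) jointly Gaussian, E[X | Y = y] = μ_x + Σ_xy Σ_yy⁻¹ (y − μ_y):

```python
import numpy as np

# Gaussian conditioning, scalar case:
# E[X | Y = y] = mu_x + Sigma_xy / Sigma_yy * (y - mu_y),
# checked against a Monte Carlo conditional average.
rng = np.random.default_rng(5)
mu = np.array([0.0, 1.0])                  # (mu_x, mu_y)
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])             # joint covariance of (X, Y)
y = 1.5
cond_mean = mu[0] + Sigma[0, 1] / Sigma[1, 1] * (y - mu[1])   # = 0.4

xy = rng.multivariate_normal(mu, Sigma, size=500_000)
near = np.abs(xy[:, 1] - y) < 0.02         # draws with Y near y
mc_mean = xy[near, 0].mean()
print(cond_mean, mc_mean)                  # the two nearly agree
```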

An introduction to polynomial chaos expansions
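A tiny worked instance (an example chosen here; the post itself may use a different setup): for X ~ N(0, 1), f(X) = X² expands exactly as He₀(X) + He₂(X) in probabilists' Hermite polynomials, since He₂(x) = x² − 1, and the coefficients c_k = E[f(X) He_k(X)]/k! can be estimated by Monte Carlo:

```python
import math

import numpy as np
from numpy.polynomial import hermite_e as He

# Hermite polynomial chaos coefficients of f(X) = X^2 for X ~ N(0, 1):
# c_k = E[f(X) He_k(X)] / k!, and the exact expansion is He_0 + He_2.
rng = np.random.default_rng(6)
x = rng.normal(size=400_000)
f = x**2
coeffs = []
for k in range(4):
    basis = He.hermeval(x, [0.0] * k + [1.0])   # He_k at the samples
    coeffs.append((f * basis).mean() / math.factorial(k))
print(np.round(coeffs, 2))                      # close to [1, 0, 1, 0]
```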

Probability cookbook