Thirteen lectures on the Kalman filter

I have released a series of video lectures on the Kalman filter, covering an introduction to probability theory, Bayes' theorem, minimum variance estimation, and maximum likelihood and maximum a posteriori estimation. We start with a gentle introduction to probability theory (probability spaces, random variables, expectation, variance, density functions, etc.) and move on to conditioning, a notion of central importance in estimation theory.
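As a quick reminder of what conditioning means in the jointly continuous case (standard definitions, not taken from the lectures themselves), the conditional density and conditional expectation are

```latex
f_{X\mid Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)},
\qquad
\mathbb{E}[X \mid Y = y] = \int_{\mathbb{R}} x \, f_{X\mid Y}(x \mid y)\, \mathrm{d}x .
```

It is this conditional expectation that later turns out to be the minimum variance estimator.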

From basic probability theory to the Bayesian interpretation of the Kalman Filter in 13 video lectures

We then state Bayes' theorem and the celebrated minimum variance estimation theorem, which states that the conditional expectation is an unbiased minimum variance estimator. Next, we derive the Kalman filter equations and work through an interesting example: we use the Kalman filter to estimate the position of a vehicle from noisy GPS measurements (including a case where the connection to the GPS satellite is lost for a while and then restored). Lastly, we show that the Kalman filter is BLUE (best linear unbiased estimator) and give a maximum a posteriori estimation interpretation.
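To give a flavour of the vehicle example, here is a minimal sketch of such a filter (not the exact model used in the lectures): a constant-velocity, one-dimensional Kalman filter tracking position from noisy GPS fixes, where during a simulated outage only the prediction (time-update) step runs. The noise covariances `Q` and `R` and the outage window are illustrative assumptions.

```python
import numpy as np

def kalman_step(x, P, z, A, Q, H, R):
    """One Kalman filter iteration; z=None means no GPS fix this step."""
    # Time update (prediction)
    x = A @ x
    P = A @ P @ A.T + Q
    if z is not None:
        # Measurement update, performed only when a GPS fix arrives
        S = H @ P @ H.T + R             # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 1.0
A = np.array([[1.0, dt], [0.0, 1.0]])  # state: [position, velocity]
Q = 0.01 * np.eye(2)                   # process noise (assumed values)
H = np.array([[1.0, 0.0]])             # GPS measures position only
R = np.array([[4.0]])                  # GPS noise variance (assumed)

rng = np.random.default_rng(0)
x_true = np.array([0.0, 1.0])
x_est, P = np.zeros(2), 10.0 * np.eye(2)
for k in range(60):
    x_true = A @ x_true
    # Simulated GPS outage between steps 20 and 40: no measurement
    z = None if 20 <= k < 40 else H @ x_true + rng.normal(0.0, 2.0, size=1)
    x_est, P = kalman_step(x_est, P, z, A, Q, H, R)

print(abs(x_est[0] - x_true[0]))  # position error after GPS is reacquired
```

During the outage the covariance `P` grows, reflecting the increasing uncertainty; once fixes resume, the filter quickly recovers, which mirrors the satellite-loss scenario discussed in the lectures.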

Playlist

You can watch all the videos in this YouTube playlist.


Introduction to Probability

Position Estimation

Maximum Likelihood Estimation