# Trial lecture: Kalman filter

## Summary

This lecture introduced the **Kalman filter**, an efficient data assimilation algorithm. Like all filters, it combines noisy observations with unreliable predictions to obtain improved state estimates. Leveraging assumptions of linearity and Gaussianity, the Kalman filter's state estimate remains **Gaussian** at all times, permitting an elegant and efficient implementation. The filter seeks a **compromise** between unreliable predictions and unreliable observations, favouring the side with less uncertainty.

## Motivating example

Filters are important tools in **real-time applications**. Positioning systems such as GPS require exact knowledge of the positions of orbiting satellites. In continually evolving systems - such as the orbital movement of satellites - it is difficult to denoise snapshot observations of a system's state.

Instead, reliable tracking requires a combination of observation and simulation. Such an approach takes the **system's evolution** into account, but relies on ultimately flawed model predictions. To compensate for these errors, predictions are often endowed with an artificial forecast error, lowering our confidence in the prediction. This results in a **two-step recursion**:

1. An **assimilation step**, which combines a new observation with a prior state estimate, thereby providing an improved posterior state estimate with reduced uncertainty.
2. A **forecast step**, which uses this improved state estimate as the basis for a numerical prediction to the next time at which an observation becomes available. The forecast error increases uncertainty again, and forms the basis for the next assimilation step.

*Filters correct model drift by assimilating observations.*
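To make the recursion concrete, here is a minimal one-dimensional sketch (my own illustration, not part of the lecture): the state is a scalar, the model multiplies it by a constant `a`, and all uncertainties are variances. The specific numbers are invented.

```python
import numpy as np

def assimilate(mu, var, y, r2):
    """Combine the prior N(mu, var) with an observation y (error variance r2)."""
    k = var / (var + r2)           # gain: how much to trust the observation
    return mu + k * (y - mu), (1.0 - k) * var

def forecast(mu, var, a=1.0, q2=0.5):
    """Propagate the estimate one step with a linear model; q2 is the forecast error."""
    return a * mu, a * a * var + q2

# One cycle of the recursion: uncertainty shrinks, then grows again.
mu, var = 0.0, 4.0                               # prior estimate
mu, var = assimilate(mu, var, y=1.0, r2=1.0)     # var drops from 4.0 to 0.8
mu, var = forecast(mu, var)                      # var grows from 0.8 to 1.3
```

Note how the two steps pull the variance in opposite directions: assimilation contracts it, forecasting inflates it again.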

## The Kalman Filter

The Kalman filter makes three assumptions about the system. Together, these assumptions ensure that the state estimate remains **Gaussian at all times**. This permits an elegant, analytical solution, and guarantees that the automated filter can always expect the state distribution to follow a well-defined mathematical structure.

### Linear-Gaussian forecast model

The forecast model evolves the state vector **x** at time *t* to the next time step *t+1* at which a new observation might become available:

$$\mathbf{x}_{t+1} = \mathbf{A}\mathbf{x}_t + \boldsymbol{\varepsilon}_t, \qquad \boldsymbol{\varepsilon}_t \sim \mathcal{N}(\mathbf{0}, \mathbf{Q}),$$

where **A** is a linear operator (i.e., a matrix) and **Q** is the forecast error covariance matrix.

### Linear-Gaussian observation model

The observation model relates the state vector **x** at time *t* to a noisy observation **y**:

$$\mathbf{y}_t = \mathbf{H}\mathbf{x}_t + \boldsymbol{\eta}_t, \qquad \boldsymbol{\eta}_t \sim \mathcal{N}(\mathbf{0}, \mathbf{R}),$$

where **H** is a linear observation operator, often just a matrix of zeros and ones extracting the relevant state vector entries, and **R** is the observation error covariance matrix.

### Gaussian prior

The filter's recursion must start with an initial state estimate **x**, provided by a Gaussian prior:

$$\mathbf{x}_0 \sim \mathcal{N}(\boldsymbol{\mu}, \boldsymbol{\Sigma}),$$

where *µ* and *Σ* are the prior distribution's mean and covariance, respectively.

**Interactive element**

*The Kalman filter's position update relies on three primary variables: the prior uncertainty (yellow), the observation error (blue), and the observation value (green). Adapt these values by dragging the sliders and observe how the compromise solution (the posterior; red) changes in response.*
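Taken together, the three assumptions fully specify a generative model. The following sketch (an invented two-dimensional example; none of the matrix values come from the lecture) simulates a trajectory and noisy observations from it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented 2-D example: state = (height, vertical velocity).
A = np.array([[1.0, 1.0],       # linear forecast operator
              [0.0, 1.0]])
Q = 0.1 * np.eye(2)             # forecast error covariance
H = np.array([[1.0, 0.0]])      # observe only the height
R = np.array([[0.5]])           # observation error covariance
mu0, Sigma0 = np.zeros(2), np.eye(2)   # Gaussian prior

# x_0 ~ N(mu0, Sigma0); x_{t+1} = A x_t + N(0, Q); y_t = H x_t + N(0, R).
x = rng.multivariate_normal(mu0, Sigma0)
states, obs = [], []
for _ in range(20):
    x = A @ x + rng.multivariate_normal(np.zeros(2), Q)
    states.append(x)
    obs.append(H @ x + rng.multivariate_normal(np.zeros(1), R))
states, obs = np.array(states), np.array(obs)
```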

## Assimilation step: The Kalman update

At its essence, Bayes' Theorem realizes two primary steps:

1. The construction of a **joint distribution** *p(x, y) = p(x) p(y|x)* from a prior *p(x)* and a likelihood (or observation model) *p(y|x)*.
2. The subsequent conditioning of this joint distribution on a specific observation value **y**, which yields a conditional distribution *p(x|y)* known as the **posterior**, providing an improved state estimate.
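Under the Gaussian assumptions, both steps can be carried out in closed form. A one-dimensional sketch (invented numbers, my own illustration) makes them explicit:

```python
import numpy as np

# Prior p(x) = N(0, 1) and observation model p(y|x) = N(x, 0.25).
mu_x, s2, r2 = 0.0, 1.0, 0.25

# Step 1: the joint p(x, y) is Gaussian; since y = x + noise,
# Cov(x, y) = s2 and Var(y) = s2 + r2.
joint_cov = np.array([[s2, s2],
                      [s2, s2 + r2]])

# Step 2: condition on an observed value y via Gaussian conditioning:
# mean shifts by Cov(x, y) / Var(y) * (y - E[y]); variance shrinks.
y = 0.8
post_mean = mu_x + joint_cov[0, 1] / joint_cov[1, 1] * (y - mu_x)
post_var = joint_cov[0, 0] - joint_cov[0, 1] ** 2 / joint_cov[1, 1]
```

Because the observation is more precise than the prior here, the posterior mean lands close to the observation, and the posterior variance drops well below the prior's.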

This operation depends on the prior, the observation model, and the observation value. An interactive example is provided to the left. Due to the elegant assumptions of the Kalman filter, this posterior is also Gaussian, and its mean and covariance can be calculated in closed form:

$$\boldsymbol{\mu}^{\text{post}} = \boldsymbol{\mu} + \mathbf{K}(\mathbf{y} - \mathbf{H}\boldsymbol{\mu}), \qquad \boldsymbol{\Sigma}^{\text{post}} = (\mathbf{I} - \mathbf{K}\mathbf{H})\boldsymbol{\Sigma},$$

where the update depends on the **Kalman gain** **K**, which realizes a compromise between the observations and predictions:

$$\mathbf{K} = \boldsymbol{\Sigma}\mathbf{H}^{\top}(\mathbf{H}\boldsymbol{\Sigma}\mathbf{H}^{\top} + \mathbf{R})^{-1}.$$
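In code, the Kalman update might be sketched as follows (a generic numpy implementation with invented example values, not the lecture's own code):

```python
import numpy as np

def kalman_update(mu, Sigma, y, H, R):
    """Assimilate observation y into the prior N(mu, Sigma)."""
    S = H @ Sigma @ H.T + R               # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)    # Kalman gain
    mu_post = mu + K @ (y - H @ mu)       # shift towards the observation
    Sigma_post = (np.eye(len(mu)) - K @ H) @ Sigma  # reduced uncertainty
    return mu_post, Sigma_post

# Invented 2-D example: observe the first state component with error 0.5.
mu, Sigma = np.zeros(2), np.eye(2)
H, R = np.array([[1.0, 0.0]]), np.array([[0.5]])
mu_post, Sigma_post = kalman_update(mu, Sigma, np.array([1.0]), H, R)
```

Only the observed component is shifted and sharpened; the unobserved component keeps its prior mean and variance.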

## Forecast step: Linear stochastic simulation

By contrast, the Kalman filter's prediction step is significantly more straightforward:

$$\boldsymbol{\mu}_{t+1} = \mathbf{A}\boldsymbol{\mu}^{\text{post}}_{t}, \qquad \boldsymbol{\Sigma}_{t+1} = \mathbf{A}\boldsymbol{\Sigma}^{\text{post}}_{t}\mathbf{A}^{\top} + \mathbf{Q}.$$

Note that the noise term generally does not influence the mean *µ*, as its mean is often defined as zero.
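The corresponding forecast step is essentially a one-liner (again a generic sketch; the model values are invented):

```python
import numpy as np

def kalman_forecast(mu, Sigma, A, Q):
    """Propagate the estimate through the linear model; Q inflates the covariance."""
    return A @ mu, A @ Sigma @ A.T + Q

# Invented example: constant-velocity model with step size 1.
A = np.array([[1.0, 1.0], [0.0, 1.0]])
Q = 0.1 * np.eye(2)
mu, Sigma = kalman_forecast(np.array([1.0, 0.5]), np.diag([0.2, 0.3]), A, Q)
```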

*All model predictions are imperfect to some degree. Filtering algorithms represent this imperfection through an artificial forecast error term. For the Kalman filter, this term is Gaussian, and often assumed unbiased (i.e., µ = 0).*

## Application: Tracking a satellite

To demonstrate the performance of the Kalman filter for the problem of tracking a **satellite orbit's height**, we may explore the interactive example below. We observe:

- The **observation error** controls (a) the reduction of uncertainty during the assimilation step, and (b) how much the state estimate is shifted towards the observation value.
- The **forecast error** controls how much uncertainty is inflated between the assimilation steps.

A robust filtering performance requires good estimates for both. Experiment with different error values, and see how they affect the filtering performance as quantified by the average log density of the true trajectory (dark blue) under the filter's state estimate (light blue). The higher the average log density, the better - but watch out for instabilities!
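The complete recursion, together with the average log density diagnostic, can be emulated offline. The sketch below uses invented dynamics (a simple height-and-velocity model), not the interactive element's actual satellite model:

```python
import numpy as np

def gaussian_logpdf(x, mu, Sigma):
    """Log density of x under N(mu, Sigma)."""
    diff = x - mu
    _, logdet = np.linalg.slogdet(Sigma)
    quad = diff @ np.linalg.solve(Sigma, diff)
    return -0.5 * (len(mu) * np.log(2 * np.pi) + logdet + quad)

rng = np.random.default_rng(1)

# Invented linear dynamics: state = (height, velocity), observe height only.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
Q = 0.1 * np.eye(2)            # forecast error covariance
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])          # observation error covariance

mu, Sigma = np.zeros(2), np.eye(2)       # Gaussian prior
x = rng.multivariate_normal(mu, Sigma)   # true initial state

log_densities = []
for _ in range(50):
    # Truth evolves and is observed (same model as the filter, for simplicity).
    x = A @ x + rng.multivariate_normal(np.zeros(2), Q)
    y = H @ x + rng.multivariate_normal(np.zeros(1), R)

    # Forecast step.
    mu, Sigma = A @ mu, A @ Sigma @ A.T + Q

    # Assimilation step (Kalman update).
    S = H @ Sigma @ H.T + R
    K = Sigma @ H.T @ np.linalg.inv(S)
    mu = mu + K @ (y - H @ mu)
    Sigma = (np.eye(2) - K @ H) @ Sigma

    # Score: log density of the true state under the filter's estimate.
    log_densities.append(gaussian_logpdf(x, mu, Sigma))

avg_log_density = np.mean(log_densities)  # higher is better
```

Rerunning this with misspecified `Q` or `R` values lowers the average log density, mirroring the behaviour described above.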

**Interactive element**

*An interactive example of the Kalman filter tracking a satellite orbit's height. Activate the filter by checking the "assimilate" checkbox, then observe the changes for different observation and forecast error values. A good combination of values is observation error = 0.5 and forecast error = 1.0.*