Data assimilation with particle filters
What follows is a brief, informal, step-by-step description of how particle filters can be used to correct the predictions of a flawed model with the help of position measurements.
An example: Simulating a cannonball's flight
As an example of data assimilation, consider the following scenario: we want to understand the path of a cannonball fired off a cliff. In scientific practice, our models are always imperfect in one way or another. This causes the simulated dynamics to deviate from the true dynamics.
In this scenario, we represent this issue with a rather extreme oversight: we have forgotten to add gravity to our simulation. You can click the element on the left to start the simulation.
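To make the setup concrete, here is a minimal sketch of such a flawed simulation. The time step, muzzle velocity, and number of steps are assumed values, not those of the interactive demo: the "truth" applies gravity, while our model simply forgets it.

```python
import numpy as np

def simulate(n_steps, dt=0.1, v0=(20.0, 15.0), g=9.81):
    """Simulate a cannonball fired off a cliff with simple Euler steps."""
    pos = np.zeros((n_steps, 2))          # (x, y) positions over time
    vel = np.array(v0, dtype=float)       # assumed muzzle velocity
    for t in range(1, n_steps):
        vel[1] -= g * dt                  # gravity pulls the ball down
        pos[t] = pos[t - 1] + vel * dt
    return pos

truth = simulate(50)                      # true dynamics: g = 9.81 m/s^2
flawed = simulate(50, g=0.0)              # our model: gravity forgotten!
```

With `g=0.0`, the flawed trajectory is a straight line that keeps climbing, while the true cannonball arcs back down toward the ground.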
Deterministic and stochastic models
Clearly, the deterministic model we defined above is flawed: shortly after firing the cannon, the truth and simulation diverge. Soon, our model is no longer reconcilable with the truth. Adding stochasticity allows us to partially salvage the simulation.
Here, we have created an ensemble of ten particles, each using the flawed model but a different realization of noise. With this, it is theoretically possible - albeit highly unlikely - for the stochastic model to reproduce the true flight path. In this sense, stochasticity compensates for flaws in the conceptual model by reducing our confidence in its predictions.
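Such an ensemble can be sketched as follows. The noise scale and initial velocity are illustrative assumptions; the idea is only that each particle follows the same (gravity-free) model perturbed by its own random kicks.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n_particles, n_steps, dt = 10, 50, 0.1

# All particles start from the same state but evolve under their own noise.
pos = np.zeros((n_particles, 2))
vel = np.tile([20.0, 15.0], (n_particles, 1))      # assumed muzzle velocity

for _ in range(n_steps):
    # Process noise stands in for the physics our model is missing.
    vel += rng.normal(scale=1.0, size=vel.shape)
    pos += vel * dt                                # note: still no gravity term
```

After the loop, `pos` holds ten plausible (but uncertain) cannonball positions instead of a single overconfident one.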
Tracking the truth with particle filters
A particle filter is one of the simplest data assimilation algorithms. Building on a stochastic model, it can decrease the prediction uncertainty through the assimilation of new observations. Whenever a new observation becomes available, every particle is weighted according to its closeness to the observation. This step is often followed by a resampling step, duplicating particles with high weights while discarding others.
Through this quasi-evolutionary routine, particle filters can follow the true dynamics despite imperfections in the simulation setup. Note that this does not remedy the cause of the imperfection (here: the absence of gravity), but merely "pulls" the samples toward the observations.
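The weight-and-resample step described above can be written in a few lines. This is a hedged sketch, not the demo's actual implementation: it assumes a Gaussian observation likelihood with standard deviation `obs_std`, and the particle positions and observation are made-up values.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def assimilate(particles, observation, obs_std=1.0):
    """One particle-filter update: weight by closeness, then resample."""
    # Gaussian likelihood of the observation under each particle's position
    sq_dist = np.sum((particles - observation) ** 2, axis=1)
    weights = np.exp(-0.5 * sq_dist / obs_std ** 2)
    weights /= weights.sum()
    # Resample: particles with high weights are duplicated, others discarded.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

prior = rng.normal(loc=[5.0, 3.0], scale=2.0, size=(10, 2))
posterior = assimilate(prior, observation=np.array([5.0, 3.0]))
```

Every row of `posterior` is a copy of some prior particle; particles near the observation tend to appear several times, while distant ones vanish.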
Sequential parameter inference
Most data assimilation algorithms can not only track time-variable system states, such as height, but also infer the model's parameters. For instance, we could include gravity as an uncertain parameter and gradually infer its magnitude. One such approach is state augmentation.
In this case, each particle also carries its own hypothesis for the gravitational constant, which affects how (and whether) its cannonball falls toward the ground. These hypotheses are passed along during resampling, then mutated with random noise. As in biological evolution, this process gradually optimizes the ensemble, steering it toward the correct gravitational constant.
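A minimal sketch of this state augmentation follows. All numbers (initial gravity range, mutation scale, time step) are assumptions for illustration; the key point is that `g` travels with each particle, is inherited during resampling, and is perturbed afterwards.

```python
import numpy as np

rng = np.random.default_rng(seed=2)
n_particles, dt = 10, 0.1

# Augmented state: position, velocity, and each particle's guess for gravity.
pos = np.zeros((n_particles, 2))
vel = np.tile([20.0, 15.0], (n_particles, 1))
g = rng.uniform(0.0, 20.0, size=n_particles)      # uncertain gravity hypotheses

def step(pos, vel, g):
    """Advance every particle using its own gravitational constant."""
    vel = vel.copy()
    vel[:, 1] -= g * dt
    return pos + vel * dt, vel

def resample_and_mutate(pos, vel, g, weights):
    idx = rng.choice(n_particles, size=n_particles, p=weights)
    # Gravity hypotheses are inherited along with the resampled states...
    pos, vel, g = pos[idx], vel[idx], g[idx]
    # ...then mutated with a little noise, as in biological evolution.
    return pos, vel, g + rng.normal(scale=0.2, size=n_particles)

pos, vel = step(pos, vel, g)
weights = np.ones(n_particles) / n_particles      # placeholder; from observations in practice
pos, vel, g = resample_and_mutate(pos, vel, g, weights)
```

Repeated over many observations, particles whose gravity hypothesis produces trajectories close to the measurements are duplicated more often, so the ensemble's `g` values drift toward the true constant.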