## Gaussian inference

In a nutshell, Bayes' theorem comprises two steps:

1. Constructing a joint distribution p(x, y) = p(x) p(y|x) from a prior p(x) and a likelihood (or observation model) p(y|x).

2. Evaluating a conditional of this joint distribution at a specific observation value y*, yielding the posterior distribution p(x|y*).
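Written out, the two steps combine into the familiar form of Bayes' rule for a continuous latent variable x (a sketch in standard notation; the denominator is the marginal likelihood obtained by integrating the joint over x):

```latex
p(x \mid y^*) = \frac{p(x)\, p(y^* \mid x)}{p(y^*)},
\qquad
p(y^*) = \int p(x')\, p(y^* \mid x')\, \mathrm{d}x'
```

It is this normalizing integral p(y*) that is usually intractable, which is why closed-form special cases like the Gaussian one below are so valuable.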

In general, this conditioning step can rarely be solved in closed form. However, when the prior is Gaussian and the likelihood is Gaussian with a mean that depends linearly on x, the joint pdf p(x, y) is itself (multivariate) Gaussian, which permits a closed-form solution. Interact with the element to the left, and observe how the posterior changes (and what remains unchanged) as you adjust the following three variables:

- the prior standard deviation,
- the observation error standard deviation,
- the observation value.
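The same three variables drive the closed-form update in the scalar case. The following sketch (not the article's interactive element; the function name and parameters are illustrative) assumes a Gaussian prior x ~ N(prior_mean, prior_sd²) and a direct observation y = x + noise with noise ~ N(0, obs_sd²):

```python
import math

def gaussian_posterior(prior_mean, prior_sd, obs, obs_sd):
    """Closed-form posterior p(x | y* = obs) for the scalar
    linear-Gaussian model x ~ N(prior_mean, prior_sd^2),
    y | x ~ N(x, obs_sd^2)."""
    prior_prec = 1.0 / prior_sd**2   # precision = 1 / variance
    obs_prec = 1.0 / obs_sd**2
    # Precisions add; the posterior mean is the precision-weighted
    # average of the prior mean and the observation.
    post_var = 1.0 / (prior_prec + obs_prec)
    post_mean = post_var * (prior_prec * prior_mean + obs_prec * obs)
    return post_mean, math.sqrt(post_var)

# Equal precisions: the posterior mean lands halfway between prior and data.
m1, s1 = gaussian_posterior(0.0, 1.0, 2.0, 1.0)   # mean 1.0
# A more precise observation pulls the posterior mean toward the data.
m2, s2 = gaussian_posterior(0.0, 1.0, 2.0, 0.5)   # mean 1.6
```

Note what stays the same as the observation value changes: the posterior standard deviation depends only on the two precisions, not on where the observation falls.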