Diving deeper into Bayesian inference
Bayesian inference is a statistical method that uses conditional probability to update prior beliefs about the parameters of a statistical model in light of observed data. Its output is the posterior distribution: a probability distribution representing our updated beliefs about the parameters after observing the data.
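In symbols, this update is Bayes' theorem. Writing $\theta$ for the model parameters and $x$ for the observed data (generic symbols chosen here for illustration), the posterior combines the likelihood and the prior:

```latex
\underbrace{p(\theta \mid x)}_{\text{posterior}}
  = \frac{\overbrace{p(x \mid \theta)}^{\text{likelihood}}\;
          \overbrace{p(\theta)}^{\text{prior}}}
         {\underbrace{p(x)}_{\text{evidence}}}
  \;\propto\; p(x \mid \theta)\, p(\theta)
```

The normalizing constant $p(x) = \int p(x \mid \theta)\, p(\theta)\, d\theta$ is what often makes the exact posterior hard to compute, since the integral rarely has a closed form.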
When calculating the exact posterior distribution is difficult, we often resort to Markov chain Monte Carlo (MCMC), a family of algorithms for drawing samples from a probability distribution. It is commonly used to generate samples from the posterior distribution in Bayesian inference, especially when the dimensionality of the model parameters is high and an analytical solution is intractable.
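To make the idea concrete, here is a minimal sketch of random-walk Metropolis, one of the simplest MCMC algorithms. It only needs the target density up to a normalizing constant, which is exactly the situation in Bayesian inference where the evidence term is unknown. The function name, the standard-normal target, and all tuning values below are illustrative choices, not part of any particular library:

```python
import math
import random


def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler.

    log_target: log of the (unnormalized) target density.
    Returns a list of samples whose long-run distribution
    approximates the target.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        # Propose a move from a symmetric (Gaussian) proposal.
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal)/target(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples


# Illustrative target: a standard normal, log-density -x^2/2 up to a constant.
draws = metropolis(lambda t: -0.5 * t * t, x0=0.0, n_samples=20000)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

Because the ratio of unnormalized densities equals the ratio of true densities, the unknown normalizing constant cancels, which is why MCMC sidesteps the intractable integral.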
The following section introduces the normal-normal model and uses MCMC to estimate its posterior distribution.
Introducing the normal-normal model
The normal-normal model is another foundational...