Enter MCMC – stage left
As mentioned earlier, we started with the coin flip examples because of the ease of determining the posterior distribution analytically, primarily because the beta distribution is the conjugate prior of the binomial likelihood function.
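For reference, this is the closed-form update that conjugacy buys us; the symbols here (a Beta(α, β) prior and k heads out of n flips) are chosen for illustration rather than taken from the earlier examples:

```latex
% Beta prior + binomial likelihood => beta posterior (conjugate update)
\theta \sim \mathrm{Beta}(\alpha, \beta), \qquad
k \mid \theta \sim \mathrm{Binomial}(n, \theta)
\quad\Longrightarrow\quad
\theta \mid k \sim \mathrm{Beta}(\alpha + k,\; \beta + n - k)
```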
It turns out that most real-world Bayesian analyses require a more complicated approach. In particular, the hyperparameters that define the posterior distribution are rarely known in closed form. What we can compute is the posterior probability density at any given parameter value. The easiest way to get a sense of the shape of the posterior is therefore to sample from it many thousands of times. More specifically, we draw from the space of all possible parameter values and record the probability density at each point.
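As a rough sketch of that idea (not code from this chapter), the snippet below evaluates an unnormalized posterior density for a coin's heads probability at many randomly chosen parameter values; the Beta(2, 2) prior and the 7-heads-out-of-10-flips data are made up for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical coin-flip data: 7 heads out of 10 flips
n_flips, n_heads = 10, 7

# Candidate values for the single parameter theta (probability of heads)
theta = np.random.uniform(0.0, 1.0, size=100_000)

# Unnormalized posterior density at each candidate value: prior times likelihood
prior = stats.beta.pdf(theta, a=2, b=2)                # assumed Beta(2, 2) prior
likelihood = stats.binom.pmf(n_heads, n_flips, theta)  # P(data | theta)
density = prior * likelihood

# The recorded densities trace out the shape of the posterior;
# the theta with the highest density is a crude point estimate.
best = theta[np.argmax(density)]
print(f"highest unnormalized posterior density near theta = {best:.3f}")
```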
How do we do this? Well, in the case of just one parameter, it's often computationally tractable to just sample willy-nilly from the space of all possible parameter values. For cases where we are using Bayesian...