The mechanics of Bayesian linear regression follow the same logic described in the previous chapter. The only real difference is that we now specify a distribution for the residuals: they are assumed to be Gaussian, with zero mean and a certain variance. Each residual is the difference between an observed value and its expected value, and each expected value is a linear combination of the coefficients and the corresponding predictor variables.
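For concreteness, the model just described can be written as follows; this is a generic formulation, and the symbols \(\beta_j\) and \(\sigma^2\) are introduced here only for illustration:

\[
y_i = \beta_0 + \beta_1 x_{i1} + \dots + \beta_k x_{ik} + \epsilon_i,
\qquad \epsilon_i \sim \mathcal{N}(0, \sigma^2),
\]

so each residual is \(\epsilon_i = y_i - \mathbb{E}[y_i]\), where the expected value \(\mathbb{E}[y_i] = \beta_0 + \sum_{j=1}^{k} \beta_j x_{ij}\) is the sum of the coefficients times the predictor variables.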
In a linear regression context, we want to draw inferences about the coefficients. Here, as we have already mentioned, we will estimate a full posterior density for each coefficient rather than a single point estimate.
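The sketch below shows one way this model could be specified and sampled. It is a minimal illustration, assuming the PyMC library (version 5 or later); the simulated data, the priors, and the variable names alpha, beta, and sigma are introduced here only for demonstration and are not part of the original text.

```python
import numpy as np
import pymc as pm

# Simulated data for illustration: y depends linearly on a single predictor x
rng = np.random.default_rng(42)
x = rng.normal(size=100)
y = 2.0 + 1.5 * x + rng.normal(scale=0.5, size=100)

with pm.Model() as model:
    # Priors for the intercept, the slope, and the residual standard deviation
    alpha = pm.Normal("alpha", mu=0, sigma=10)
    beta = pm.Normal("beta", mu=0, sigma=10)
    sigma = pm.HalfNormal("sigma", sigma=5)

    # Expected value: a linear combination of coefficients and the predictor
    mu = alpha + beta * x

    # Likelihood: residuals are Gaussian with zero mean and variance sigma**2
    pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y)

    # Draw posterior samples; each coefficient gets a full posterior density
    idata = pm.sample(1000)
```

After sampling, the posterior draws for alpha and beta can be summarized or plotted to inspect the estimated density of each coefficient, rather than reporting a single fitted value.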