In Chapter 3, Modeling with Linear Regression, and Chapter 4, Generalizing Linear Models, we learned to build models of the general form:

$$\theta = \psi(\phi(X)\beta)$$
Here, $\theta$ is a parameter for some probability distribution: for example, the mean of a Gaussian, the $p$ parameter of a binomial, the rate of a Poisson distribution, and so on. We call $\psi$ the inverse link function, and $\phi$ is a function that could be, for example, the square root or a polynomial. For the simple linear regression case, $\psi$ is the identity function.
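To make these three ingredients concrete, here is a minimal numerical sketch (the inputs, feature map, and weights are invented for illustration): $\phi$ builds polynomial features, $\beta$ weights them, and $\psi$, here the logistic function, squashes the result into the $(0, 1)$ interval, as needed for the $p$ parameter of a binomial:

```python
import numpy as np

# A numerical sketch of θ = ψ(φ(X)β); all values are invented for illustration
X = np.linspace(-1, 1, 5)                                    # inputs
phi = lambda x: np.column_stack([np.ones_like(x), x, x**2])  # φ: polynomial feature map
beta = np.array([0.5, 1.0, -2.0])                            # β: weights
psi = lambda z: 1 / (1 + np.exp(-z))                         # ψ: logistic inverse link

theta = psi(phi(X) @ beta)  # θ: e.g., the p parameter of a binomial, one per input
print(theta)                # values lie in (0, 1), as a probability must
```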
Fitting (or learning) a Bayesian model can be seen as finding the posterior distribution of the weights $\beta$, and thus this is known as the weight-view of approximating functions. As we have already seen with the polynomial regression example, by letting $\phi$ be a non-linear function, we can map the inputs onto a feature space. We then fit a linear relation in the feature space that is, in fact, non-linear in the original input space.
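As a sketch of this weight-view, the following hypothetical example (the data, priors, and variable names are invented for illustration) fits a Bayesian polynomial regression with PyMC3, as used in earlier chapters: the model is linear in the feature space $\phi(X) = (1, x, x^2)$, and sampling yields a posterior over the weights $\beta$:

```python
import numpy as np
import pymc3 as pm

# Hypothetical data: noisy samples from a non-linear function of x
rng = np.random.default_rng(123)
x = np.linspace(-1, 1, 30)
y = 2 * x**2 - x + rng.normal(0, 0.2, size=len(x))

# φ(X): map the inputs onto a polynomial feature space
Phi = np.column_stack([np.ones_like(x), x, x**2])

with pm.Model() as weight_view:
    # Priors over the weights β and the noise scale
    beta = pm.Normal('beta', mu=0, sd=10, shape=Phi.shape[1])
    sigma = pm.HalfNormal('sigma', sd=1)
    # Linear in the feature space, hence non-linear in x
    mu = pm.math.dot(Phi, beta)
    y_obs = pm.Normal('y_obs', mu=mu, sd=sigma, observed=y)
    # The posterior over β is the "weight-view" of the fitted function
    trace = pm.sample(1000)
```

Each posterior sample of `beta` defines one candidate curve, so uncertainty about the weights translates directly into uncertainty about functions.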