In this final section, we will examine how to train an MLP for regression. When it comes to regression, there is little new to say about the MLP: the only thing that changes is the activation function on the final output nodes. Instead of producing a score for each class, these nodes allow a continuous range of outputs. All the other issues and hyperparameters are the same as in the classification case, although in the regression context you may end up making different choices than you would for classification.
So, let's now demonstrate regression using neural networks:
- We're going to be working with the Boston dataset. We're going to import MLPRegressor in order to do the regression, and we're still going to use the mean_squared_error metric to assess the quality of our fit, as in the sketch that follows.
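The following is a minimal sketch of that workflow; the hidden-layer size, iteration count, and train/test split are illustrative assumptions, as the text does not specify them. Note that `load_boston` was removed in scikit-learn 1.2, so on recent versions you would need to substitute another regression dataset (for example, `fetch_california_housing`).

```python
# Sketch only: hyperparameters below are assumed, not taken from the text.
from sklearn.datasets import load_boston  # removed in scikit-learn >= 1.2
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

# Load the Boston housing data and hold out a test set.
X, y = load_boston(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# MLPs are sensitive to feature scale, so standardize the inputs.
scaler = StandardScaler().fit(X_train)
X_train_s = scaler.transform(X_train)
X_test_s = scaler.transform(X_test)

# Fit the regressor; hidden_layer_sizes and max_iter are illustrative choices.
mlp = MLPRegressor(hidden_layer_sizes=(100,), max_iter=2000, random_state=0)
mlp.fit(X_train_s, y_train)

# Assess the fit with mean squared error on the held-out data.
print("Test MSE:", mean_squared_error(y_test, mlp.predict(X_test_s)))
```

The standardization step is worth keeping even in a quick experiment: because the network's weights are trained by gradient descent, features on very different scales can slow or destabilize convergence.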