In this chapter, we covered a great deal of material. We saw the structure of the classical, or dense, neural network; we learned about activations, nonlinearity, and softmax; and we set up training and testing data. We then learned how to construct the network with Dropout and Flatten, explored solvers, which are how machine learning actually learns, looked at hyperparameters, and finally fine-tuned our model with grid search.
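To tie those pieces together, here is a minimal recap sketch, assuming Keras with the TensorFlow backend and MNIST-style 28 x 28 inputs; the layer sizes and dropout rate are illustrative rather than the exact model built in the chapter:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Flatten

# Illustrative dense network: Flatten, Dense with a nonlinearity,
# Dropout for regularization, and a softmax output over 10 classes.
model = Sequential([
    Flatten(input_shape=(28, 28)),    # unroll each image into a flat vector
    Dense(128, activation='relu'),    # dense layer with a nonlinear activation
    Dropout(0.2),                     # randomly drop units to reduce overfitting
    Dense(10, activation='softmax'),  # softmax turns scores into class probabilities
])

# The solver (optimizer) is what actually drives learning; its settings are
# among the hyperparameters we could tune with grid search.
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.summary()
```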
In the next chapter, we'll take what we've learned and alter the structure of our network to build what is called a convolutional neural network (CNN).