As we saw in the first section of this chapter, a neural network learns the weights in each of its layers; there may be millions of them, and the network's job is to find good values for all of them. First, we do the forward pass, during which we generate the hypothesis. Then we compare the hypothesis with the real values in our data, and feed the error back to change the weights in such a way that the next forward pass will produce a better hypothesis. This feedback pass, or backpropagation pass, updates all the weights:
We repeat this process of the forward pass and the backpropagation pass until we're satisfied with the accuracy:
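To make this loop concrete, here is a minimal sketch of a training run with a tiny one-hidden-layer network, written in plain NumPy. Everything in it, including the synthetic data, the layer sizes, the learning rate, and the choice of sigmoid activations with a mean squared error loss, is an illustrative assumption rather than the exact setup used in this chapter:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 100 examples, 3 input features, 1 target value each (assumed for illustration).
X = rng.normal(size=(100, 3))
y = (X.sum(axis=1, keepdims=True) > 0).astype(float)

# The weights the network has to learn: millions in a real network, a handful here.
W1 = rng.normal(scale=0.1, size=(3, 8))
W2 = rng.normal(scale=0.1, size=(8, 1))
lr = 0.1  # learning rate (illustrative value)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(500):
    # Forward pass: generate the hypothesis (the network's current predictions).
    hidden = sigmoid(X @ W1)
    hypothesis = sigmoid(hidden @ W2)

    # Compare the hypothesis with the real values (mean squared error here).
    loss = np.mean((hypothesis - y) ** 2)

    # Backpropagation pass: push the error back through the layers to get
    # the gradient of the loss with respect to every weight.
    d_out = 2 * (hypothesis - y) / len(X) * hypothesis * (1 - hypothesis)
    grad_W2 = hidden.T @ d_out
    d_hidden = (d_out @ W2.T) * hidden * (1 - hidden)
    grad_W1 = X.T @ d_hidden

    # Update all the weights so the next forward pass produces a better hypothesis.
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

print(f"final loss: {loss:.4f}")

# Once training is done, the learned decimal values can be written to disk,
# which is the idea discussed next (file name is an assumption).
np.savez("weights.npz", W1=W1, W2=W2)
```

The loop body is exactly the forward pass, the comparison with the real values, and the backpropagation update described above, repeated until the accuracy (here, the loss) is good enough.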
Now, if we store all these decimal values of the weights on disk in some way, we are effectively storing all the knowledge this neural network has acquired to solve the problem at hand. This...