We've talked about backpropagation and gradient descent in the context of the example code in the first section of this chapter, but it can be hard to really grasp the concepts at play when Gorgonia is doing so much of the heavy lifting for us. So, we will now walk through the process itself.
Gradient descent and backpropagation
Gradient descent
Gradient descent is how we really train our model; it's the algorithm we use to minimize the prediction error by iteratively adjusting our model's weights. Backpropagation is the method by which we compute the gradients of that error with respect to each weight, which gradient descent then follows downhill.
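Before turning to the neural network example, the core idea can be sketched in plain Go, without Gorgonia. This is a minimal illustration (not the book's code): we fit a single weight w so that w*x matches the target function 0.5*x, repeatedly stepping w against the gradient of the squared error.

```go
package main

import "fmt"

func main() {
	// Training pairs sampled from the target function y = 0.5 * x.
	xs := []float64{1, 2, 3, 4}
	ys := []float64{0.5, 1.0, 1.5, 2.0}

	w := 0.0   // initial weight (arbitrary starting guess)
	lr := 0.01 // learning rate: size of each descent step

	for epoch := 0; epoch < 1000; epoch++ {
		// For squared error 0.5*(w*x - y)^2, the gradient with
		// respect to w is (w*x - y) * x; we average it over the data.
		grad := 0.0
		for i := range xs {
			grad += (w*xs[i] - ys[i]) * xs[i]
		}
		grad /= float64(len(xs))

		// Step in the direction of the negative gradient.
		w -= lr * grad
	}

	fmt.Printf("learned w = %.3f\n", w) // converges toward 0.5
}
```

In a real network there are many weights rather than one, and backpropagation is what produces the per-weight gradients, but the update rule is exactly this: subtract the learning rate times the gradient.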
Let's begin with a basic example: say we want to train a simple neural network to learn the following function, which multiplies its input by 0.5:
Input... |