Forward propagation in ANNs
In this section, we will see how an ANN learns, with neurons stacked up in layers. The number of layers in a network equals the number of hidden layers plus the number of output layers; we don't count the input layer when calculating the number of layers in a network. Consider a two-layer neural network with one input layer, x, one hidden layer, h, and one output layer, y, as shown in the following diagram:
Figure 7.8: Forward propagation in ANN
Let's say we have two inputs, x1 and x2, and we have to predict the output, ŷ. Since we have two inputs, the number of neurons in the input layer is two. We set the number of neurons in the hidden layer to four, and the number of neurons in the output layer to one. Now, the inputs are multiplied by weights, then a bias is added, and the resultant value is propagated to the hidden layer, where the activation function is applied.
Before that, we need to initialize...
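The forward pass described above can be sketched in a few lines of NumPy. This is a minimal illustration, assuming random weight initialization and a sigmoid activation (the text only says "the activation function is applied"); the layer sizes (2, 4, 1) are taken from the description above:

```python
import numpy as np

np.random.seed(0)

# Layer sizes from the text: 2 inputs, 4 hidden neurons, 1 output
n_input, n_hidden, n_output = 2, 4, 1

# Randomly initialize weights and biases (initialization is covered next)
W1 = np.random.randn(n_input, n_hidden)
b1 = np.zeros(n_hidden)
W2 = np.random.randn(n_hidden, n_output)
b2 = np.zeros(n_output)

def sigmoid(z):
    # Sigmoid is chosen here for illustration; any activation could be used
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    # Input -> hidden: multiply inputs by weights, add bias, apply activation
    h = sigmoid(x @ W1 + b1)
    # Hidden -> output: same pattern with the second set of weights
    y_hat = sigmoid(h @ W2 + b2)
    return y_hat

x = np.array([0.5, -1.2])  # two example inputs, x1 and x2
y_hat = forward(x)
print(y_hat)
```

Because the output neuron also uses a sigmoid in this sketch, the prediction ŷ is a single value between 0 and 1.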