Learning in neural networks
As we saw in Chapter 1, Neural Network and Artificial Intelligence Concepts, a neural network is a machine learning algorithm that can learn from data and make predictions using the model it builds. It is a universal function approximator; that is, the mapping from any set of inputs to outputs can be approximated by a mathematical function.
Forward propagation gives us an initial mathematical function that produces the output(s) from the inputs using randomly chosen weights. The difference between the actual and predicted values is called the error term. The learning in a feed-forward neural network actually happens during the backpropagation stage, where the weights are fine-tuned so that the error term is reduced in each iteration. Gradient descent is used during backpropagation to update the weights.
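To make these steps concrete, here is a minimal sketch in Python (not the book's code) of a single sigmoid neuron trained by gradient descent; the toy data, learning rate, and variable names are illustrative assumptions:

```python
# Minimal sketch: forward propagation, an error term, and a
# gradient-descent weight update for one sigmoid neuron.
# Toy data, learning rate, and names are illustrative assumptions.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy inputs (4 samples, 2 features) and targets
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [0.], [0.], [1.]])

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 1))      # random initial weights
b = np.zeros((1,))
lr = 0.5                         # learning rate (assumed)

for epoch in range(1000):
    # Forward propagation: predictions from the current weights
    y_hat = sigmoid(X @ W + b)

    # Error term: difference between predicted and actual values
    error = y_hat - y
    mse = np.mean(error ** 2)

    # Backpropagation: gradient of the error w.r.t. W and b (chain rule)
    grad_z = error * y_hat * (1.0 - y_hat)
    grad_W = X.T @ grad_z / len(X)
    grad_b = grad_z.mean(axis=0)

    # Gradient descent: move the weights against the gradient
    W -= lr * grad_W
    b -= lr * grad_b

print("final MSE:", mse)
print("predictions:", sigmoid(X @ W + b).ravel().round(2))
```

Each pass through the loop repeats the cycle described above: predict, measure the error, and adjust the weights so the error shrinks on the next iteration.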
Let us cover backpropagation in detail in this chapter, as it is an important aspect of machine learning for neural networks.