Summary
The theory that gave birth to neural networks was developed decades ago by Frank Rosenblatt. It began with the definition of the perceptron, a unit inspired by the biological neuron that takes data as input and applies a transformation to it. The idea behind the perceptron is to assign weights to the input data and compute a weighted sum, so that the end result falls into one class or the other depending on whether that sum crosses a threshold.
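As a minimal sketch of that idea (not Rosenblatt's original implementation), a perceptron of this kind can be written in a few lines of NumPy; the weights, bias, and inputs below are purely illustrative values:

```python
import numpy as np

def perceptron(inputs, weights, bias):
    """Weighted sum of the inputs followed by a hard threshold:
    the output is either one class (1) or the other (0)."""
    weighted_sum = np.dot(inputs, weights) + bias
    return 1 if weighted_sum > 0 else 0

# Illustrative values: two input features, arbitrary weights and bias.
x = np.array([0.5, -1.2])
w = np.array([0.8, 0.3])
print(perceptron(x, w, bias=0.1))  # prints 1 or 0 depending on the weighted sum
```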
The most widely known form of neural network is built from a succession of perceptrons stacked together in layers, where the output of one layer of perceptrons serves as the input to the following layer.
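The layer-by-layer flow can be sketched as follows; the layer sizes, random weights, and sigmoid activation are assumptions made for illustration rather than a prescription from the chapter:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dense_layer(inputs, weights, biases):
    """One layer of perceptron-like units: the output of this layer
    becomes the input of the next one."""
    return sigmoid(inputs @ weights + biases)

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))                      # 4 input features
w1, b1 = rng.normal(size=(4, 3)), np.zeros(3)    # hidden layer with 3 units
w2, b2 = rng.normal(size=(3, 1)), np.zeros(1)    # output layer with 1 unit

hidden = dense_layer(x, w1, b1)        # the first layer's output...
output = dense_layer(hidden, w2, b2)   # ...is the second layer's input
print(output)
```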
The typical learning process for a neural network was also explained. It involves three main steps: forward propagation, the calculation of the loss function, and backpropagation.
The end goal of this procedure is to minimize the loss function by updating the weights and biases that accompany each of the neurons in the network.
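Put together, one training step looks roughly like the sketch below: a forward pass, a loss calculation, backpropagation of gradients, and an update to the weights and bias. The network here is a single sigmoid unit trained with mean squared error and a fixed learning rate, and the toy data and hyperparameters are all chosen purely for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data and initial parameters (illustrative values only).
x = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([[1.0], [1.0], [0.0]])
w = np.zeros((2, 1))
b = 0.0
learning_rate = 0.5

for step in range(1000):
    # Forward propagation: compute the network's predictions.
    pred = sigmoid(x @ w + b)

    # Loss function: mean squared error between predictions and targets.
    loss = np.mean((pred - y) ** 2)

    # Backpropagation: gradients of the loss with respect to the weights
    # and bias, obtained via the chain rule through the MSE and the sigmoid.
    grad_pred = 2 * (pred - y) / len(y)
    grad_z = grad_pred * pred * (1 - pred)
    grad_w = x.T @ grad_z
    grad_b = grad_z.sum()

    # Update step: nudge the parameters in the direction that reduces the loss.
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(loss, pred.round(2))
```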