The inspiration for neural networks, or multilayer perceptrons, is the human brain and nervous system. At the heart of our nervous system is the neuron, pictured above its computer analog, the perceptron:
The neurons in our brain collect input, do something with it, and then spit out a response, much like their computer analog, the perceptron. A perceptron takes a set of inputs, sums them all up, and passes the result through an activation function. That activation function determines whether to send an output, and at what level to send it when activated. Let's take a closer look at the perceptron, as follows:
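Before we walk through the diagram, here is a minimal sketch of that idea in Python. The input values, weights, bias, and step activation are illustrative assumptions chosen for the example, not values taken from the diagram:

```python
# A minimal perceptron sketch: a weighted sum of the inputs plus a bias,
# passed through a step activation function.

def step_activation(z):
    # Fire (output 1) only if the weighted sum reaches the threshold of 0.
    return 1 if z >= 0 else 0

def perceptron(inputs, weights, bias):
    # Multiply each input by its weight and sum everything up, then add the bias.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # The activation function decides whether to send output, and at what level.
    return step_activation(z)

# Example with made-up inputs and weights.
print(perceptron([0.5, 1.0, 0.25], weights=[0.4, -0.2, 0.9], bias=0.1))  # -> 1
```

Here the weighted sum is 0.2 - 0.2 + 0.225 + 0.1 = 0.325, which clears the threshold, so the perceptron fires and outputs 1.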
On the left-hand side of the preceding diagram, you can see the set of inputs getting pushed in, plus a constant bias. We will get more into the bias later. Then the inputs are multiplied...