The perceptron
The perceptron is the name Frank Rosenblatt gave to the first neural model, introduced in 1957. A perceptron is a neural network with a single layer of input linear neurons, followed by an output unit based on the step function, which produces a binary output (alternatively, it's possible to consider a bipolar unit based on sign(x), whose output is -1 or 1). The architecture of a perceptron is shown in the following diagram:

Structure of a perceptron
Even though the diagram might appear quite complex, a perceptron can be summarized by the following equation:

$$
z = w^{T} x + b, \qquad y = \operatorname{step}(z) =
\begin{cases}
1 & \text{if } z > 0 \\
0 & \text{otherwise}
\end{cases}
$$
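As a concrete illustration, the following is a minimal NumPy sketch of this forward pass; the function name perceptron_output and the sample weights, bias, and input are illustrative assumptions, not values taken from the text.

```python
import numpy as np

def perceptron_output(x, w, b):
    """Perceptron forward pass: dot product plus bias, then a binary step."""
    z = np.dot(w, x) + b       # the dot product reduces the input vector to a scalar
    return 1 if z > 0 else 0   # step function: 1 when z > 0, 0 otherwise

# Illustrative values (not from the text)
x = np.array([0.5, -1.2])   # input vector
w = np.array([0.8, 0.3])    # weight vector
b = 0.1                     # bias
print(perceptron_output(x, w, b))  # -> 1, since z = 0.8*0.5 + 0.3*(-1.2) + 0.1 = 0.14 > 0
```

The same computation extends to a batch of inputs by stacking them as rows of a matrix and applying the step function element-wise to the resulting vector of z values.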
All the vectors are conventionally column-vectors; therefore, the dot product transforms the input into a scalar, then the bias is added, and the binary output is obtained using the step function, which outputs 1 when z > 0 and 0 otherwise. At this point, a reader could object that the step function is non-linear; however, a non-linearity applied to the output layer is only a filtering operation that has no effect on the actual...