Simple perceptron – a linear classifier for separable data
As we saw, a simple perceptron is a single-layer neural unit that acts as a linear classifier. It is a neuron capable of producing only two output patterns, which we can summarize as active or inactive. Its decision rule is a threshold behavior: if the sum of the activations of the individual neurons that make up the input layer, each multiplied by its connection weight, exceeds a certain threshold, then the output neuron adopts the active output pattern. Otherwise, the output neuron remains in the inactive state.
As mentioned, the output is obtained by applying a threshold function to the weighted sum of the inputs: the output is +1 if y > 0 and -1 if y <= 0, as shown in the following figure:
We can see the linear interaction here: the output y depends linearly on the inputs.
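The decision rule described above can be sketched in a few lines of Python. This is a minimal illustration, not a reference implementation; the weight, input, and bias values are arbitrary examples chosen for demonstration.

```python
import numpy as np

def perceptron_output(x, w, b):
    """Threshold activation: +1 if the weighted sum exceeds zero, -1 otherwise."""
    y = np.dot(w, x) + b
    return 1 if y > 0 else -1

# Illustrative values: two inputs with hand-picked weights and bias
x = np.array([0.5, -0.2])
w = np.array([0.8, 0.4])
b = -0.1
print(perceptron_output(x, w, b))  # 0.4 - 0.08 - 0.1 = 0.22 > 0, so prints 1
```

Note that the threshold itself is folded into the bias term b, a common convention: "weighted sum exceeds threshold" is equivalent to "weighted sum plus bias exceeds zero".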
As with most neural network models, a learning procedure based on the modification of the synaptic connection weights can be realized in perceptrons as well. At the...