The artificial neuron
Using our biological analogy, we can construct a model of a computational neuron. This model is known as the McCulloch-Pitts model of the neuron:

Note
Warren McCulloch and Walter Pitts proposed this model of a neural network as a computing machine in a paper titled A logical calculus of the ideas immanent in nervous activity, published in the Bulletin of Mathematical Biophysics in 1943.
This computational neuron is the simplest example of a neural network. We can construct the output function, y, of our neural network directly by following the diagram:
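The equation referenced here is not reproduced in the text; a standard form of the McCulloch-Pitts output function, consistent with the description that follows (m is assumed to be the number of inputs, and w0 the bias), is:

```latex
y = g\left( w_0 + \sum_{i=1}^{m} w_i x_i \right)
```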

The function g() in our neural network is the activation function. Here, the chosen activation function is the step function:
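The step function is not shown in the text; reconstructed from the description in the next paragraph, it can be written as:

```latex
g(z) =
\begin{cases}
1 & \text{if } z > 0 \\
-1 & \text{otherwise}
\end{cases}
```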

When the linear weighted sum of inputs exceeds zero, the step function outputs 1; otherwise, it outputs -1. It is customary to create a dummy input feature x0, which is always set to 1, in order to merge the bias or threshold w0 into the main sum as follows:
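With the dummy feature x0 = 1, the bias joins the sum, so the output becomes y = g(sum of wi * xi), with the sum now running from i = 0. The whole neuron can then be sketched in a few lines of Python (the weight values below are illustrative, not from the text):

```python
def step(z):
    """Step activation: 1 when the weighted sum exceeds zero, -1 otherwise."""
    return 1 if z > 0 else -1


def neuron_output(weights, inputs):
    """McCulloch-Pitts neuron: y = g(sum of w_i * x_i).

    A dummy input x0 = 1 is prepended so that the bias w0
    (the first entry of `weights`) is folded into the sum.
    """
    augmented = [1.0] + list(inputs)  # x0 = 1 merges the bias into the sum
    z = sum(w * x for w, x in zip(weights, augmented))
    return step(z)


# weights[0] is the bias w0; the rest weight the real inputs x1, x2
weights = [-0.5, 1.0, 1.0]  # illustrative values
print(neuron_output(weights, [1, 1]))  # sum = -0.5 + 1 + 1 = 1.5 > 0, prints 1
print(neuron_output(weights, [0, 0]))  # sum = -0.5, prints -1
```

With these illustrative weights the neuron behaves like a logical AND over its two binary inputs: only when both inputs are 1 does the weighted sum exceed zero.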