By analogy with the biological neuron, ANNs define the neuron as a central processing unit that performs a mathematical operation to generate one output from a set of inputs. The output of a neuron is a function of the weighted sum of the inputs plus the bias. Each neuron performs a very simple operation: it activates if the total signal received exceeds an activation threshold, as shown in the following figure:
![](https://static.packt-cdn.com/products/9781788397872/graphics/assets/6b9ef2f1-dd0a-4a10-a6aa-fd4e976a9371.png)
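The neuron described above can be sketched in a few lines of Python. This is a minimal illustration, not library code: the function name `neuron`, the example weights, and the threshold of zero are all assumptions chosen for the sketch.

```python
import numpy as np

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs plus the bias
    z = np.dot(weights, inputs) + bias
    # Step activation: fire (output 1) only if the signal
    # exceeds the activation threshold (here, 0)
    return 1.0 if z > 0 else 0.0

# Example with two inputs (illustrative values)
output = neuron(np.array([0.5, 0.8]), np.array([0.4, 0.6]), -0.3)
print(output)  # → 1.0, since 0.5*0.4 + 0.8*0.6 - 0.3 = 0.38 > 0
```

Modern networks usually replace the hard threshold with a smooth activation function, which the terminology list below introduces.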
The function of the entire neural network is simply the computation of the outputs of all the neurons, which is an entirely deterministic calculation. Essentially, an ANN is a set of mathematical function approximations. We will now introduce the terminology associated with ANNs:
- Input layer
- Hidden layer
- Output layer
- Weights
- Bias
- Activation functions
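The deterministic layer-by-layer computation described above can be sketched as a simple forward pass. The layer sizes, the random weights, and the choice of a sigmoid activation are assumptions made for illustration only.

```python
import numpy as np

def sigmoid(z):
    # A common smooth activation function
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, layers):
    # layers: list of (weights, bias) pairs, one per layer.
    # The whole network is just this deterministic chain of
    # weighted sums, biases, and activations.
    a = x
    for W, b in layers:
        a = sigmoid(W @ a + b)
    return a

rng = np.random.default_rng(0)
# Input layer of 3 units -> hidden layer of 4 -> output layer of 2
layers = [
    (rng.standard_normal((4, 3)), rng.standard_normal(4)),
    (rng.standard_normal((2, 4)), rng.standard_normal(2)),
]
out = forward(np.array([1.0, 0.5, -0.2]), layers)
print(out.shape)  # → (2,)
```

Because the weights here are random, the outputs are meaningless; training (covered later) is what adjusts the weights and biases so the network approximates the desired function.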