Activation Functions
As seen previously, a single neuron needs to perform a transformation by applying an activation function. Different activation functions can be used in neural networks. Without these functions, a neural network would simply be a linear model that could easily be described using matrix multiplication.
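To see why, consider the following minimal sketch (using NumPy, with arbitrary layer sizes and random weights chosen purely for illustration): two stacked layers with no activation function collapse into a single matrix multiplication.

import numpy as np

rng = np.random.default_rng(0)

# Two "layers" defined only by their weight matrices (no activation function).
W1 = rng.normal(size=(4, 3))  # first layer: 3 inputs -> 4 units
W2 = rng.normal(size=(2, 4))  # second layer: 4 units -> 2 outputs
x = rng.normal(size=(3,))     # an arbitrary input vector

# Passing the input through both layers in sequence...
two_layers = W2 @ (W1 @ x)

# ...gives exactly the same result as one combined linear transformation.
one_matrix = (W2 @ W1) @ x

print(np.allclose(two_layers, one_matrix))  # True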
The activation function provides non-linearity to a neural network, which allows it to model more complex patterns. Two very common activation functions are sigmoid and tanh (the hyperbolic tangent function).
Sigmoid
The formula of the sigmoid function is as follows:

\[ \text{sigmoid}(x) = \frac{1}{1 + e^{-x}} \]

The output values of the sigmoid function range from 0 to 1. This activation function is usually used in the last layer of a neural network for a binary classification problem.
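As a quick illustration (a minimal NumPy sketch, independent of any particular deep learning framework; the sample inputs are arbitrary), the sigmoid function can be implemented directly from the formula above, and its outputs always fall strictly between 0 and 1:

import numpy as np

def sigmoid(x):
    # Maps any real-valued input into the interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

z = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(sigmoid(z))  # approximately [0.0000454, 0.269, 0.5, 0.731, 0.99995]

Because each output can be read as a probability of the positive class, a single sigmoid unit in the last layer is a natural fit for binary classification.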
Tanh
The formula of the hyperbolic tangent is as follows:

\[ \tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}} \]
The tanh activation function is very similar to the sigmoid function...
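To make the comparison concrete, here is a short NumPy sketch (illustrative only; the identity tanh(x) = 2 * sigmoid(2x) - 1 is a standard algebraic fact rather than something introduced in this chapter) that evaluates tanh directly from its formula and shows how it relates to sigmoid:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh_from_formula(x):
    # Direct implementation of the hyperbolic tangent formula above.
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

z = np.array([-2.0, 0.0, 2.0])

# Matches NumPy's built-in tanh.
print(np.allclose(tanh_from_formula(z), np.tanh(z)))          # True

# tanh is a rescaled, shifted sigmoid: 2 * sigmoid(2x) - 1,
# so its outputs span (-1, 1) instead of (0, 1).
print(np.allclose(np.tanh(z), 2.0 * sigmoid(2.0 * z) - 1.0))  # True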