Activation functions
Activation functions are the mechanism by which an artificial neuron processes its input and passes the result on through the network. The activation function takes a single number and applies a fixed mathematical mapping to it. There are many types of activation functions; the most popular are the following:
- Sigmoid
- Tanh
- ReLU
- Linear
Sigmoid function: Sigmoid has the mathematical form σ(x) = 1 / (1 + e^(−x)). It takes a real-valued number and squashes it into the range (0, 1). Sigmoid has historically been a popular choice because its derivative is easy to compute, σ′(x) = σ(x)(1 − σ(x)), and its output is easy to interpret.
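The formula and its derivative can be sketched in plain Python (the function names here are ours, chosen for illustration):

```python
import math

def sigmoid(x):
    # Squash a real number into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    # The derivative has the convenient closed form sigma(x) * (1 - sigma(x)).
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid(0.0))             # 0.5 -- the midpoint of the range
print(sigmoid_derivative(0.0))  # 0.25 -- the derivative's maximum
```

Note that large positive inputs map close to 1 and large negative inputs close to 0, which is the "squashing" behavior described above.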
Tanh function: Tanh squashes a real-valued number into the range [−1, 1]. Its output is zero-centered, so in practice the tanh non-linearity is generally preferred to the sigmoid non-linearity. It can also be shown that tanh is a scaled and shifted sigmoid: tanh(x) = 2σ(2x) − 1.
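The scaled-sigmoid identity is easy to check numerically; a minimal sketch:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def tanh_via_sigmoid(x):
    # tanh expressed as a scaled, shifted sigmoid: tanh(x) = 2*sigmoid(2x) - 1
    return 2.0 * sigmoid(2.0 * x) - 1.0

# The identity holds for any real input, e.g.:
for x in (-2.0, 0.0, 1.5):
    assert abs(tanh_via_sigmoid(x) - math.tanh(x)) < 1e-12
```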
Rectified Linear Unit (ReLU) function: ReLU has become very popular in the last few years. It computes the function f(x) = max(0, x): the output equals the input for positive values and is zero otherwise.
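A one-line sketch of this thresholding behavior:

```python
def relu(x):
    # Pass positive inputs through unchanged; clamp negatives to zero.
    return max(0.0, x)

print([relu(x) for x in (-2.0, -0.5, 0.0, 3.0)])  # [0.0, 0.0, 0.0, 3.0]
```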