Rectified linear unit (ReLU)
The output of the first two activation functions presented in this chapter was binary: they take a set of input variables and convert them into binary outputs. ReLU, in contrast, is an activation function that takes a set of input variables and converts them into a single continuous output. ReLU is the most popular activation function in neural networks and is usually used in the hidden layers, where we do not want to convert continuous variables into categorical variables. The following diagram summarizes the ReLU activation function:
Note that when x ≤ 0, y = 0. This means that any input signal that is zero or negative is translated into a zero output:
As soon as x becomes greater than zero, the output is simply x. The ReLU function is one of the most widely used activation functions in neural networks. It can...
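In other words, ReLU computes max(0, x) for each input. The following minimal NumPy sketch (the relu function name and the sample values are illustrative, not taken from the text) shows how negative inputs are clipped to zero while positive inputs pass through unchanged:

import numpy as np

def relu(x):
    # Element-wise ReLU: max(0, x). Negative inputs become 0,
    # positive inputs pass through unchanged.
    return np.maximum(0, x)

# Sample inputs covering both sides of zero
inputs = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
print(relu(inputs))  # expected output: [0.  0.  0.  0.5 3. ]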