The Rectified Linear Unit (ReLU) is one of the most popular activation functions in the field of ANNs. If the input is less than or equal to 0, the output is set to 0; otherwise, the output equals the input, increasing linearly as the input grows. In other words, it computes f(x) = max(0, x). We can observe this in the following diagram:
Fig 2.15: The rectifier function
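To make this concrete, here is a minimal sketch of the rectifier function in Python using NumPy (the library choice is an assumption; the book's own examples may use a different stack):

import numpy as np

def relu(x):
    # Rectified Linear Unit: element-wise max(0, x)
    return np.maximum(0, x)

# Negative inputs are clipped to 0; positive inputs pass through unchanged.
inputs = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(relu(inputs))  # [0. 0. 0. 1. 3.]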
In the next section, we will learn about the hyperbolic tangent activation function.