In Keras, you can use a number of different activation functions. Some of these have been discussed in previous chapters, but others have not yet been covered. We can begin by listing the ones we have already covered, with a quick note on each (a short Keras usage sketch follows the list):
- Linear: Also known as the identity function; it returns the value of x unchanged, f(x) = x.
- Sigmoid: Computes 1 / (1 + e^-x), squashing any input into the range from 0 to 1.
- Hyperbolic tangent (tanh): Computes (e^x - e^-x) / (e^x + e^-x). This has the same S-shape as the sigmoid function; however, its range along the y-axis goes from -1 to 1 instead of from 0 to 1.
- Rectified Linear Units (ReLU): Returns the value of x if x is greater than 0; otherwise, it returns 0.
- Leaky ReLU: Uses the same formula as ReLU for positive values of x; for negative values, instead of returning 0, it returns x multiplied by a small slope (for example, 0.01), so a small gradient still flows when the unit is inactive.
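To show how these activations are selected in practice, here is a minimal sketch, assuming TensorFlow's bundled Keras; the layer sizes and input shape are arbitrary and chosen only for illustration. Most activations can be passed to a layer by name, while Leaky ReLU is applied as its own layer:

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(10,)),                # 10 input features (arbitrary choice)
    layers.Dense(32, activation="relu"),     # ReLU: max(0, x)
    layers.Dense(32, activation="tanh"),     # tanh: output in (-1, 1)
    layers.Dense(32),                        # linear by default; Leaky ReLU follows
    layers.LeakyReLU(),                      # Leaky ReLU applied as its own layer
    layers.Dense(16, activation="sigmoid"),  # sigmoid: output in (0, 1)
    layers.Dense(1, activation="linear"),    # linear/identity output
])

model.summary()
```

Passing the activation by name, as above, is the most common convention; the equivalent callables in keras.activations can be passed instead when an activation needs to be configured.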