Creating custom activations for tabular data
Compared to DNNs working on images and text, backpropagating errors in DNNs working on tabular data is more difficult because the data is sparse. While the ReLU activation function is widely used, newer activation functions have been found to work better in such cases and can improve network performance. Among these activation functions are SeLU, GeLU, and Mish. Since SeLU is already present in Keras and TensorFlow (see https://www.tensorflow.org/api_docs/python/tf/keras/activations/selu and https://www.tensorflow.org/api_docs/python/tf/nn/selu), in this recipe we'll use the GeLU and Mish activation functions.
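As a quick illustration of the built-in SeLU support mentioned above (not part of the recipe's code, and assuming keras has been imported as shown in the next section), SeLU can simply be requested by name in any Keras layer:
keras.layers.Dense(64, activation='selu')  # SeLU ships with Keras; the layer size here is just an example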
Getting ready
You need the usual imports:
from tensorflow import keras as keras
import numpy as np
import matplotlib.pyplot as plt
We've added matplotlib so we can plot how these new activation functions behave and get an idea of why they are so effective.
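As a rough sketch of the kind of plot we have in mind (using the built-in SeLU for illustration; the recipe's own plotting code may differ), you could do something like the following with the imports above:
x = np.linspace(-5.0, 5.0, 200).astype(np.float32)  # evaluation grid for the activation
y = keras.activations.selu(x).numpy()               # built-in SeLU, computed eagerly
plt.plot(x, y)                                       # visualize the activation's shape
plt.title('SeLU activation')
plt.xlabel('input')
plt.ylabel('activation output')
plt.show()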
How to do it…
GeLU and Mish are defined by their mathematical formulations...
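For reference, here is a minimal sketch of the commonly used formulations: the tanh approximation of GeLU, and Mish defined as x * tanh(softplus(x)). The registration pattern via get_custom_objects and the layer size are illustrative assumptions rather than the recipe's exact code, and recent TensorFlow versions also ship tf.keras.activations.gelu out of the box.
import numpy as np
import tensorflow as tf
from tensorflow import keras as keras

def gelu(x):
    # Tanh approximation of GeLU: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x**3)))
    return 0.5 * x * (1.0 + tf.math.tanh(
        np.sqrt(2.0 / np.pi) * (x + 0.044715 * tf.math.pow(x, 3))))

def mish(x):
    # Mish: x * tanh(softplus(x))
    return x * tf.math.tanh(tf.math.softplus(x))

# Register the functions so they can be referenced by name in Keras layers
keras.utils.get_custom_objects().update(
    {'gelu': keras.layers.Activation(gelu),
     'mish': keras.layers.Activation(mish)})

# Example usage (the layer size is arbitrary)
hidden = keras.layers.Dense(64, activation='mish')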