To understand classification with DNNs, we first need to understand the exponential linear unit activation function and the elements of the model.
Classification with DNNs
Exponential linear unit activation function
The Exponential Linear Unit (ELU) function is a relatively recent modification of the ReLU function. It looks similar to the ReLU function, but it has very different mathematical properties. The following figure shows the ELU function:
The preceding figure shows that the ELU function has no corner at 0, whereas the ReLU function does. For negative inputs, instead of clamping the value to 0 as ReLU does, the ELU function decays smoothly toward −α, where alpha is a parameter of the function.
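The behavior described above can be sketched as a small NumPy function (a minimal illustration, not tied to any particular deep learning library; the name `elu` and the default `alpha=1.0` are choices made here for the example):

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU(x) = x                    for x > 0
    #        = alpha * (e^x - 1)    for x <= 0
    # For large negative x, e^x -> 0, so the output approaches -alpha.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(elu(x))
```

Note that with `alpha=1.0` the slope of the negative branch at 0 is `alpha * e^0 = 1`, matching the slope of the positive branch, which is why the curve has no corner at the origin.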