Cost function and errors
Given the class probabilities predicted by the model, the cost function is the mean negative log-probability of the true class:
cost = -T.mean(T.log(model)[T.arange(y.shape[0]), y])
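As a minimal, self-contained sketch, this cost can be compiled and evaluated on a toy batch; note that model is declared here as a plain probability matrix rather than the output of a network, and the names probs and labels are chosen only for this illustration:

import numpy as np
import theano
import theano.tensor as T

model = T.matrix('model')   # predicted class probabilities, one row per example
y = T.ivector('y')          # true class indices for the batch
cost = -T.mean(T.log(model)[T.arange(y.shape[0]), y])

compute_cost = theano.function([model, y], cost)
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]], dtype=theano.config.floatX)
labels = np.array([0, 1], dtype='int32')
print(compute_cost(probs, labels))   # -(log 0.7 + log 0.8) / 2, roughly 0.29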
The error is the number of predictions that differ from the true class, divided by the total number of predictions, which can be written as a mean:
error = T.mean(T.neq(y_pred, y))
Conversely, accuracy is the number of correct predictions divided by the total number of predictions. The sum of error and accuracy is one.
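As a sanity check, here is a small sketch that evaluates both quantities on a toy batch and confirms that they sum to one; it assumes y_pred is obtained as the most probable class with T.argmax and, as above, treats model as a plain probability matrix:

import numpy as np
import theano
import theano.tensor as T

model = T.matrix('model')          # predicted class probabilities
y = T.ivector('y')                 # true class indices
y_pred = T.argmax(model, axis=1)   # predicted class = most probable class
error = T.mean(T.neq(y_pred, y))
accuracy = T.mean(T.eq(y_pred, y))
check = theano.function([model, y], [error, accuracy])

probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.3, 0.3, 0.4]], dtype=theano.config.floatX)
labels = np.array([0, 2, 2], dtype='int32')
print(check(probs, labels))        # [0.333..., 0.666...]: error + accuracy = 1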
For other types of problems, here are a few other loss functions and implementations:
| Loss function | Description | Implementation |
| --- | --- | --- |
| Categorical cross entropy | An equivalent implementation of ours | T.nnet.categorical_crossentropy(model, y_true).mean() |
| Binary cross entropy | For the case when the output can take only two values, {0,1}; typically used after a sigmoid activation predicting the probability p | T.nnet.binary_crossentropy(model, y_true).mean() |
| Mean squared error | L2 norm for regression problems | T.sqr(model - y_true).mean() |
| Mean absolute error | L1 norm for regression problems | T.abs_(model - y_true).mean() |
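For example, the built-in categorical cross-entropy from the first row can be compared against the hand-written cost above; the sketch below, again feeding a plain probability matrix, only illustrates that the two expressions return the same value:

import numpy as np
import theano
import theano.tensor as T

model = T.matrix('model')   # predicted class probabilities
y = T.ivector('y')          # true class indices
manual = -T.mean(T.log(model)[T.arange(y.shape[0]), y])
builtin = T.nnet.categorical_crossentropy(model, y).mean()
compare = theano.function([model, y], [manual, builtin])

probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]], dtype=theano.config.floatX)
labels = np.array([0, 1], dtype='int32')
print(compare(probs, labels))   # both values agree, about 0.29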