Questions
- Mean false error and mean squared false error:
Wang et al. [16] argued that regular loss functions poorly capture the errors from minority classes under high data imbalance, because the large number of negative samples dominates the loss. They therefore proposed a new loss function whose main idea is to split the training error into four quantities:
- False Positive Error (FPE) = (1 / number_of_negative_samples) * (total error on negative samples)
- False Negative Error (FNE) = (1 / number_of_positive_samples) * (total error on positive samples)
- Mean False Error (MFE) = FPE + FNE
- Mean Squared False Error (MSFE) = FPE^2 + FNE^2
The error here could be computed using the usual cross-entropy loss or any other loss used for classification. Implement the MFE and MSFE loss functions for both the imbalanced MNIST and CIFAR10-LT datasets, and see whether the model performance improves over the baseline of cross-entropy loss.
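As a starting point, the definitions above can be sketched in PyTorch for the binary case. This is a minimal sketch, not the paper's reference implementation: the function names `mfe_loss` and `msfe_loss` are my own, the per-sample error is taken to be binary cross-entropy, and extending to the 10-class CIFAR10-LT setting would require averaging the per-class errors analogously.

```python
import torch
import torch.nn.functional as F


def _class_errors(logits, targets):
    """Return (FPE, FNE): the mean per-sample error on negative and
    positive samples, respectively. Assumes both classes are present
    in the batch."""
    # Per-sample binary cross-entropy (the "usual" loss, unreduced).
    per_sample = F.binary_cross_entropy_with_logits(
        logits, targets.float(), reduction="none"
    )
    fpe = per_sample[targets == 0].mean()  # error averaged over negatives
    fne = per_sample[targets == 1].mean()  # error averaged over positives
    return fpe, fne


def mfe_loss(logits, targets):
    """Mean False Error: FPE + FNE. Each class contributes its own
    mean error, so the minority class is not swamped by the majority."""
    fpe, fne = _class_errors(logits, targets)
    return fpe + fne


def msfe_loss(logits, targets):
    """Mean Squared False Error: FPE^2 + FNE^2. Squaring penalizes
    an imbalance between the two class-wise errors more strongly."""
    fpe, fne = _class_errors(logits, targets)
    return fpe ** 2 + fne ** 2
```

Both functions are built from differentiable tensor ops, so they can be dropped in wherever `F.binary_cross_entropy_with_logits` would otherwise be used; note that a batch containing only one class yields a NaN from the empty-class mean, so in practice a stratified sampler or a guard for the empty case is needed.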
- In this chapter, while implementing the CDT loss...