Summary
In this chapter, we explored various loss functions as remedies to class imbalance. We started with the class-weighting technique and deferred re-weighting, both designed to penalize errors on minority-class samples more heavily. We then moved to focal loss, which shifts from class-centric to sample-centric weighting by focusing on the difficulty of individual samples. Despite its merits, we learned that focal loss may still be biased toward the majority class, since it assigns weights to challenging samples across all classes without regard to class frequency. Finally, we discussed class-balanced loss, CDT loss, and class-wise difficulty-balanced loss, each introducing its own strategy for dynamically adjusting weights or modulating the model's focus between easy and challenging samples, with the aim of improving performance on imbalanced datasets.
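As a quick recap of the central idea, the following is a minimal PyTorch sketch of focal loss, not the exact implementation used in this chapter; the function name focal_loss and the alpha and gamma parameter names are illustrative, and gamma = 2 follows the commonly cited default from the original focal loss paper. The (1 - p_t)^gamma factor implements the sample-centric weighting described above, while the optional alpha term recovers class-centric weighting:

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0, alpha=None):
    """Multi-class focal loss sketch.

    logits:  (N, C) raw model outputs
    targets: (N,) integer class labels
    gamma:   focusing parameter; larger values down-weight easy samples more
    alpha:   optional (C,) per-class weight tensor (class-weighting term)
    """
    log_probs = F.log_softmax(logits, dim=-1)
    probs = log_probs.exp()
    # p_t and log(p_t) for the true class of each sample
    pt = probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    log_pt = log_probs.gather(1, targets.unsqueeze(1)).squeeze(1)
    # Sample-centric weighting: easy samples (high p_t) contribute less
    loss = -((1.0 - pt) ** gamma) * log_pt
    if alpha is not None:
        # Optional class-centric weighting on top of the difficulty term
        loss = alpha[targets] * loss
    return loss.mean()

# Example usage with random data
logits = torch.randn(8, 3)
targets = torch.randint(0, 3, (8,))
print(focal_loss(logits, targets))
```

Note that with gamma = 0 and no alpha, this reduces to standard cross-entropy, which makes the role of each term easy to see.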
To summarize, algorithm-level techniques usually modify the loss function used by the model in some way to account for imbalances in the dataset. They...