Random oversampling
The simplest strategy for addressing class imbalance in a dataset is to randomly choose samples from the minority class and duplicate them. This is also called random oversampling with replacement.
To increase the number of minority class observations, we can replicate them enough times to balance the two classes. Does this sound too trivial? Yes, but it works. By increasing the number of minority class samples, random oversampling reduces the bias toward the majority class. This helps the model learn the patterns and characteristics of the minority class more effectively.
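Before turning to the library, here is a minimal NumPy sketch of the idea: draw minority-class indices with replacement until both classes have the same number of samples. The dataset here is a small made-up example for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced dataset: 8 majority (class 0) and 2 minority (class 1) samples
X = np.arange(20).reshape(10, 2)
y = np.array([0] * 8 + [1] * 2)

# Randomly draw minority indices with replacement to make up the difference
minority_idx = np.flatnonzero(y == 1)
n_needed = int(np.sum(y == 0)) - len(minority_idx)  # 6 extra copies needed
extra = rng.choice(minority_idx, size=n_needed, replace=True)

# Append the duplicated minority samples to the original data
X_bal = np.vstack([X, X[extra]])
y_bal = np.concatenate([y, y[extra]])
print(np.bincount(y_bal))  # [8 8]
```

Because the duplicates are exact copies, this adds no new information; it only changes how often the model sees each minority sample during training.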
We will use random oversampling from the imbalanced-learn library. The fit_resample API of the RandomOverSampler class resamples the original dataset and balances it. The sampling_strategy parameter specifies the desired ratio between the classes; for example, sampling_strategy=1.0 produces an equal number of samples in the two classes.
There are...