So far, we've focused only on imbalances in the class labels. In some situations, an imbalance in a particular feature can be just as problematic. Say that, historically, the vast majority of the engineers in your company were men. If you now build an algorithm to filter new applicants based on your existing data, will it discriminate against female candidates?
The equal opportunity score tries to evaluate how dependent a model is on a certain feature. Simply put, a model is considered to give equal opportunity to the different values of a feature if the relationship between the model's predictions and the actual targets is the same, regardless of the value of that feature. Formally, this means that the conditional probability of the predicted target, given the actual target and the applicant's gender, should be the same for every gender. These conditional probabilities are shown below.
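As a sketch of that condition in the hiring example, using ŷ for the predicted target and y for the actual target (the notation here is assumed, not taken from the original text), and conditioning on the qualified applicants (y = 1), as is conventional for equal opportunity:

$$P(\hat{y} = 1 \mid y = 1, \text{gender} = \text{woman}) = P(\hat{y} = 1 \mid y = 1, \text{gender} = \text{man})$$

In other words, among applicants who actually deserve a positive outcome, the model should predict that outcome at the same rate for both genders. To make this concrete, here is a minimal Python sketch that measures how far a model is from satisfying the condition by comparing true-positive rates across groups. The function name `equal_opportunity_difference` and the assumption of binary labels and predictions are illustrative choices, not part of the original text:

```python
import numpy as np

def equal_opportunity_difference(y_true, y_pred, group):
    """Gap between the highest and lowest true-positive rate across groups.

    A value of 0.0 means every group's qualified members are predicted
    positive at the same rate, i.e. the model gives equal opportunity.
    """
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    group = np.asarray(group)

    tprs = []
    for g in np.unique(group):
        # Qualified applicants (actual positives) belonging to group g.
        qualified = (group == g) & (y_true == 1)
        # Share of them that the model predicts as positive (the TPR).
        tprs.append(y_pred[qualified].mean())
    return max(tprs) - min(tprs)

# Toy usage: women's TPR is 0.5, men's is 1.0, so the gap is 0.5.
y_true = [1, 1, 0, 1, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0]
group  = ["woman", "woman", "woman", "man", "man", "man"]
print(equal_opportunity_difference(y_true, y_pred, group))  # 0.5
```

The difference of rates is one common way to summarize the gap; a ratio of the smallest to the largest TPR works just as well and is used by some libraries, so treat this as one reasonable choice rather than the canonical formula.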