Chapter 7 – Decreasing Bias and Achieving Fairness
- No. Even if sensitive attributes are not used directly as features, there might be proxies for them in our data, and therefore in our models.
- Salary and income (in some countries), occupation, and a history of a felony charge.
- Not necessarily. A model that satisfies fairness according to demographic parity is not guaranteed to be fair according to equalized odds.
- Demographic parity is a group fairness definition that requires a model's predictions to be independent of a given sensitive attribute, such as ethnicity or sex. Equalized odds is satisfied when a prediction is independent of the sensitive attribute conditional on the real output; for binary classification, this amounts to equal true positive and false positive rates across groups.
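The following is a minimal sketch on toy data (the arrays and group labels are illustrative assumptions, not examples from the chapter) of how the two definitions translate into group-conditional prediction rates:

```python
import numpy as np

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])  # real outcomes
y_pred = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # model predictions
group = np.array(["a", "a", "a", "a", "b", "b", "b", "b"])  # sensitive attribute

# Demographic parity: P(y_pred = 1 | group) should be equal across groups.
for g in np.unique(group):
    print(f"P(y_pred=1 | group={g}) = {y_pred[group == g].mean():.2f}")

# Equalized odds: P(y_pred = 1 | group, y_true) should be equal across
# groups for each real outcome, i.e., equal TPR and FPR per group.
for g in np.unique(group):
    for y in (0, 1):
        mask = (group == g) & (y_true == y)
        print(f"P(y_pred=1 | group={g}, y_true={y}) = {y_pred[mask].mean():.2f}")
```

Libraries such as Fairlearn package these checks as ready-made metrics (for example, demographic_parity_difference and equalized_odds_difference).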
- Not necessarily. For example, there could be feature proxies for 'sex' among the top contributors to model predictions.
- We can use explainability techniques to identify potential biases in our models and then plan to improve them toward fairness (see the sketch after this list). For example, we can identify...