Unfairness as complex system failure
In this chapter, you have been equipped with an arsenal of technical tools to make machine learning models fairer. However, a model does not operate in a vacuum: models are embedded in complex socio-technical systems. Humans develop and monitor the model, source the data, and create the rules for what to do with the model's output. Other machines produce the model's input data or consume its outputs. Different players might try to game the system in different ways.
Unfairness is equally complex. We've already discussed the two general definitions of unfairness: disparate impact and disparate treatment. Disparate treatment can occur against any combination of features (age, gender, race, nationality, income, and so on), often in complex and non-linear ways. This section examines Richard Cook's 1998 paper, How Complex Systems Fail (available at https://web.mit.edu/2.75/resources/random/How%20Complex%20Systems%20Fail.pdf), and considers what its observations about complex system failure tell us about unfairness in machine learning systems.
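Before turning to Cook's observations, it helps to see how disparate impact is typically measured in practice. The sketch below is a minimal illustration, not code from this chapter: it computes the widely used disparate impact ratio (the basis of the "four-fifths rule" heuristic, under which a ratio below 0.8 is commonly treated as a red flag) on a toy pandas DataFrame with hypothetical column names.

```python
import pandas as pd

def disparate_impact_ratio(df, outcome_col, group_col,
                           privileged, unprivileged):
    """Ratio of favorable-outcome rates: unprivileged over privileged.

    Values below 0.8 are often flagged as potential disparate impact
    under the four-fifths rule of thumb.
    """
    rate_priv = df.loc[df[group_col] == privileged, outcome_col].mean()
    rate_unpriv = df.loc[df[group_col] == unprivileged, outcome_col].mean()
    return rate_unpriv / rate_priv

# Toy loan-approval data (hypothetical columns and groups).
data = pd.DataFrame({
    "approved": [1, 1, 0, 1, 0, 0, 1, 0],
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
})

ratio = disparate_impact_ratio(data, "approved", "group",
                               privileged="A", unprivileged="B")
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.33 here; below 0.8
```

Note that a single ratio like this only captures one group split at a time; as the text points out, unfair treatment can arise from combinations of features interacting non-linearly, which is exactly why unfairness behaves like a complex system failure rather than a single measurable defect.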