Chapter 6, AI Fairness with Google's What-If Tool (WIT)
- The developer of an AI system decides what is ethical or not. (True|False)
True and False. True, in the sense that the developer is accountable for the ethical design of an AI system. False, in the sense that each country has legal obligations and guidelines that constrain what the developer alone can decide.
- A DNN is the only estimator for the COMPAS dataset. (True|False)
False. Other estimators would produce good results as well, such as decision trees or linear regression models.
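As an illustration, here is a minimal scikit-learn sketch that fits two alternative estimators to a COMPAS-style table. The file path, the `recidivism` label column, and the assumption that the features are already numeric are placeholders rather than the chapter's actual preprocessing, and logistic regression stands in as the linear model because the label is binary.

```python
# A minimal sketch, assuming a COMPAS-style CSV with a binary target column
# named "recidivism" and numeric feature columns (placeholder schema).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("compas_data.csv")    # placeholder path, not the chapter's file

X = df.drop(columns=["recidivism"])    # feature columns
y = df["recidivism"]                   # binary label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Decision tree estimator
tree = DecisionTreeClassifier(max_depth=5).fit(X_train, y_train)
print("Decision tree accuracy:", tree.score(X_test, y_test))

# Linear estimator (logistic regression, the linear model for a binary label)
linear = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Logistic regression accuracy:", linear.score(X_test, y_test))
```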
- Shapley values determine the marginal contribution of each feature. (True|False)
True. The values reveal, for example, whether certain features are making unwanted contributions to a prediction; the sketch after the next answer shows how to compute and plot them.
- We can detect the biased output of a model with a SHAP plot. (True|False)
True. A SHAP plot visualizes each feature's contribution to the output, so a sensitive feature with an outsized contribution points to biased data or a biased model.
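A minimal sketch of both points with the `shap` library, reusing the hypothetical `linear` model and `X_test` split from the previous sketch; it is an illustration of the technique, not the chapter's own notebook.

```python
# A minimal sketch with the shap library; `linear` and `X_test` are the
# fitted logistic regression model and held-out features from the previous
# sketch (placeholder names).
import shap

# Shapley values: the marginal contribution of each feature to each prediction.
explainer = shap.Explainer(linear, X_test)
shap_values = explainer(X_test)

# Global beeswarm plot: if a sensitive attribute (e.g., race or sex)
# dominates the contributions, the plot exposes a likely source of bias.
shap.plots.beeswarm(shap_values)
```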
- WIT's primary quality is "people-centered." (True|False)
True. People-centered AI systems, which keep humans involved in designing, monitoring, and correcting the model, will outperform AI systems that have no human involvement.
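For context, this is roughly how WIT is launched inside a notebook with the `witwidget` package so that a person can probe the model interactively. The conversion helper, the prediction function, and the `recidivism` label name are illustrative assumptions rather than the chapter's own code.

```python
# A minimal sketch of launching WIT in a Jupyter notebook; it assumes the
# `linear` model and the `X_test`/`y_test` split from the earlier sketches
# and wires them in through a custom prediction function (placeholder names).
import tensorflow as tf
from witwidget.notebook.visualization import WitConfigBuilder, WitWidget

def df_to_examples(features, labels):
    """Convert each DataFrame row into a tf.train.Example proto."""
    examples = []
    for (_, row), label in zip(features.iterrows(), labels):
        feature = {
            name: tf.train.Feature(
                float_list=tf.train.FloatList(value=[float(value)])
            )
            for name, value in row.items()
        }
        feature["recidivism"] = tf.train.Feature(
            float_list=tf.train.FloatList(value=[float(label)])
        )
        examples.append(tf.train.Example(features=tf.train.Features(feature=feature)))
    return examples

def predict_fn(examples):
    """Turn tf.Example protos back into rows and return class probabilities."""
    rows = [
        [ex.features.feature[name].float_list.value[0] for name in X_test.columns]
        for ex in examples
    ]
    return linear.predict_proba(rows)

test_examples = df_to_examples(X_test, y_test)

# People-centered exploration: inspect counterfactuals, slice by sensitive
# features, and compare fairness metrics interactively in the widget.
config_builder = (
    WitConfigBuilder(test_examples)
    .set_custom_predict_fn(predict_fn)
    .set_target_feature("recidivism")
    .set_label_vocab(["no recidivism", "recidivism"])
)
WitWidget(config_builder, height=720)
```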