Chapter 10, Contrastive XAI
- Contrastive explanations focus on the features with the highest values that lead to a prediction. (True|False)
False. CEM focuses on missing features: the pertinent negatives whose absence helps justify a prediction.
- General practitioners never use contrastive reasoning to evaluate a patient. (True|False)
False. General practitioners often reason contrastively, ruling out absent symptoms when assessing a patient's condition.
- Humans reason with contrastive methods. (True|False)
True.
- An image cannot be explained with CEM. (True|False)
False. CEM can use the output of a CNN and an autoencoder to produce explanations of an image; see the second sketch after this list.
- You can explain a tripod using a missing feature of a table. (True|False)
True. It is the example given by the IBM Research team.
- A CNN generates good results on the MNIST dataset. (True|False)
True. Even a small CNN reaches around 99% test accuracy on MNIST; see the first sketch after this list.
- A pertinent negative explains how a model makes a prediction with a missing feature. (True|False)
True. A pertinent negative identifies features that are absent from the input and whose absence supports the prediction (for example, the absence of fever can support a diagnosis of a cold rather than the flu); the second sketch after this list computes one.
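The first sketch below shows the kind of CNN that performs well on MNIST, as referenced in the answers above. The architecture, epochs, and other training settings are illustrative assumptions, not the chapter's exact model.

```python
# A minimal CNN sketch for MNIST; layer sizes, epochs, and other
# hyperparameters are illustrative assumptions, not the chapter's exact model.
from tensorflow import keras

# Load MNIST and scale the pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 28, 28, 1).astype("float32") / 255.0
x_test = x_test.reshape(-1, 28, 28, 1).astype("float32") / 255.0

# Small convolutional classifier
cnn = keras.Sequential([
    keras.layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    keras.layers.MaxPooling2D((2, 2)),
    keras.layers.Conv2D(64, (3, 3), activation="relu"),
    keras.layers.MaxPooling2D((2, 2)),
    keras.layers.Flatten(),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
cnn.compile(optimizer="adam",
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])
cnn.fit(x_train, y_train, epochs=3, batch_size=128, validation_split=0.1)
print(cnn.evaluate(x_test, y_test))  # typically close to 99% test accuracy
```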
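The second sketch shows how a pertinent negative could be produced for an MNIST image, assuming the open source Alibi library's alibi.explainers.CEM and the cnn, x_train, and x_test objects from the previous sketch. The autoencoder, loss weights, and other parameter values are illustrative, and the exact CEM signature varies between Alibi versions; depending on the TensorFlow version, eager execution may need to be disabled before the models are built.

```python
# A sketch of computing a pertinent negative (PN) for one MNIST image with the
# open source Alibi library. It reuses `cnn`, `x_train`, and `x_test` from the
# previous sketch; every parameter value below is illustrative and the exact
# CEM signature may differ between Alibi versions.
import tensorflow as tf
from tensorflow import keras
from alibi.explainers import CEM

# Alibi's CEM builds a TF1-style graph; with TensorFlow 2.x this call is
# typically required (ideally before the models above are built).
tf.compat.v1.disable_eager_execution()

# A small convolutional autoencoder keeps the PN perturbations close to the
# data distribution.
inp = keras.layers.Input(shape=(28, 28, 1))
h = keras.layers.Conv2D(16, (3, 3), activation="relu", padding="same")(inp)
h = keras.layers.Conv2D(16, (3, 3), activation="relu", padding="same")(h)
out = keras.layers.Conv2D(1, (3, 3), activation="sigmoid", padding="same")(h)
ae = keras.Model(inp, out)
ae.compile(optimizer="adam", loss="mse")
ae.fit(x_train, x_train, epochs=3, batch_size=128, validation_split=0.1)

X = x_test[0:1]                    # the image to explain, shape (1, 28, 28, 1)
shape = (1,) + x_train.shape[1:]

# mode='PN' searches for a minimal set of absent features that, if added,
# would change the predicted class; their absence explains the prediction.
cem = CEM(cnn, mode='PN', shape=shape,
          kappa=0., beta=0.1, gamma=100.,   # loss weights (illustrative)
          ae_model=ae,                      # autoencoder reconstruction term
          feature_range=(x_train.min(), x_train.max()),
          max_iterations=1000, c_init=1., c_steps=10,
          clip=(-1000., 1000.), no_info_val=0.)  # 0. = background pixel value

explanation = cem.explain(X)
print("Pertinent negative class:", explanation.PN_pred)
# explanation.PN holds the perturbed image; the strokes it adds to X are the
# missing features whose absence explains the model's original prediction.
```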