Demystifying prediction explanation techniques
Prediction explanation is a technique that attempts to explain the logic behind a model’s decision on a given input. Some machine learning models are transparent and explainable out of the box. One example is a decision tree, which is built from the ground up out of explicit conditioning rules that split the data into partitions, each leading to a specific decision; a prediction can therefore be explained by the exact rules applied to the data sample. Models such as neural networks, by contrast, are treated as black boxes, with no straightforward way to retrieve the reasons behind a decision directly.
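To make the decision tree case concrete, here is a minimal sketch of how a tree's prediction explains itself through its rule path. The tree structure, feature names, and thresholds below are hypothetical toy values, not taken from any real library or dataset:

```python
# Hypothetical toy decision tree: each internal node tests one feature
# against a threshold; leaf nodes hold the final decision.
TREE = {
    "feature": "petal_length", "threshold": 2.5,
    "left": {"label": "setosa"},                      # petal_length <= 2.5
    "right": {
        "feature": "petal_width", "threshold": 1.8,
        "left": {"label": "versicolor"},              # petal_width <= 1.8
        "right": {"label": "virginica"},
    },
}

def predict_with_explanation(node, sample):
    """Walk the tree, recording each rule that fired, so the final
    decision is explained by the exact conditions the sample satisfied."""
    rules = []
    while "label" not in node:          # descend until we reach a leaf
        feat, thr = node["feature"], node["threshold"]
        if sample[feat] <= thr:
            rules.append(f"{feat} <= {thr}")
            node = node["left"]
        else:
            rules.append(f"{feat} > {thr}")
            node = node["right"]
    return node["label"], rules

label, rules = predict_with_explanation(
    TREE, {"petal_length": 4.1, "petal_width": 1.3}
)
print(label)   # versicolor
print(rules)   # ['petal_length > 2.5', 'petal_width <= 1.8']
```

The returned rule list is itself the explanation: the prediction is fully determined by the conditions it lists, which is exactly the transparency that black-box models lack.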
The logic of a model’s decision on a data sample can be explained and presented in a variety of ways, so long as it contains information on how the final decision was made. Additionally, predictions made by a machine learning model can be explained in either a model-agnostic...