Assessing feature importance with model-agnostic methods
Model-agnostic methods do not rely on a model's intrinsic parameters to compute feature importance. Instead, they treat the model as a black box, with only the inputs and outputs visible. So, how can we determine which inputs made a difference?
What if we altered the inputs randomly? Indeed, one of the most effective methods for evaluating feature importance is through simulations designed to measure a feature’s impact or lack thereof. In other words, let’s remove a random player from the game and observe the outcome! In this section, we will discuss two ways to achieve this: permutation feature importance and SHAP.
Permutation feature importance
Once we have a trained model, we cannot simply drop a feature to assess the impact of not using it: the model expects the same set of inputs it was trained on. However, we can:
- Replace the feature with a static value, such as the mean or median, rendering it devoid of useful information. ...
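As a minimal sketch of this first option, we can replace one column at a time with its training-set mean and watch how much the model's score drops. The dataset, model, and scores here are illustrative choices, not taken from the text:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Toy data: with shuffle=False, only the first two columns are informative.
X, y = make_regression(n_samples=500, n_features=5, n_informative=2,
                       shuffle=False, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(random_state=0).fit(X_train, y_train)
baseline = model.score(X_test, y_test)  # R^2 with all features intact

# Overwrite each feature in turn with its training mean, so it carries
# no useful information, and record the resulting drop in test score.
drops = []
for i in range(X.shape[1]):
    X_mod = X_test.copy()
    X_mod[:, i] = X_train[:, i].mean()
    drops.append(baseline - model.score(X_mod, y_test))
    print(f"feature {i}: score drop = {drops[-1]:.3f}")
```

A large drop means the model leaned heavily on that feature; a drop near zero means the feature was largely ignored. Using the mean preserves the column's scale, but note that it can place samples in unrealistic regions of the feature space if the feature is correlated with others.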