Generating model explanations
Another key capability of DataRobot is that it automatically generates instance-level explanations for each prediction. These explanations help you understand why a particular prediction turned out the way it did, which matters not only for understanding the model but, in many cases, for compliance purposes as well. For example, lenders are often required to provide reasons when an applicant is denied credit. Generating such explanations is not straightforward and can be very time-consuming to do by hand. Let's first look at the explanations generated for the XGBoost model, as shown in the following screenshot:
Since we selected the SHAP option for this project, the model explanations are based on the SHapley Additive exPlanations (SHAP) algorithm. On the left, you can see the overall distribution of predictions; most of the dataset lies in the range of 0
to 10000...
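To make the idea behind SHAP concrete, here is a minimal, self-contained sketch of exact Shapley value computation for a tiny hypothetical model with three features. The model, feature names, baseline values, and instance are all invented for illustration; DataRobot's implementation uses optimized tree-based SHAP internally, not this brute-force enumeration. The key property the sketch demonstrates is additivity: the baseline prediction plus the per-feature contributions equals the model's actual prediction for the instance.

```python
from itertools import combinations
from math import factorial

# Toy model: a hypothetical scoring function of three features
# (purely illustrative; not DataRobot's model).
def model(income, debt, age):
    return 0.5 * income - 0.3 * debt + 0.1 * age

# Baseline values (e.g. dataset averages) used when a feature is "absent".
baseline = {"income": 50.0, "debt": 20.0, "age": 40.0}
# The instance whose prediction we want to explain.
instance = {"income": 80.0, "debt": 35.0, "age": 30.0}
features = list(baseline)

def eval_coalition(present):
    # Features in `present` take the instance value; all others the baseline.
    args = {f: (instance[f] if f in present else baseline[f]) for f in features}
    return model(**args)

def shapley_value(target):
    # Exact Shapley value: weighted average of the target feature's marginal
    # contribution over all coalitions of the remaining features.
    n = len(features)
    others = [f for f in features if f != target]
    phi = 0.0
    for k in range(n):
        for subset in combinations(others, k):
            weight = factorial(k) * factorial(n - k - 1) / factorial(n)
            phi += weight * (eval_coalition(set(subset) | {target})
                             - eval_coalition(set(subset)))
    return phi

contributions = {f: shapley_value(f) for f in features}
base_value = eval_coalition(set())          # prediction with all features at baseline
full_value = eval_coalition(set(features))  # actual prediction for the instance
# Additivity: base_value + sum(contributions.values()) == full_value
```

This brute force enumerates all 2^n coalitions, so it is only practical for a handful of features; production SHAP implementations exploit model structure (for example, tree paths) to compute the same values efficiently.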