Implementing model constraints
Next, we will discuss how to implement constraints, first with XGBoost (and, for that matter, all popular tree ensembles, because the parameters are named the same; see Figure 12.12), and then with TensorFlow Lattice. But before we move forward with any of that, let's remove race
from the data, as follows:
X_train_con = X_train.drop(['race'], axis=1).copy()
X_test_con = X_test.drop(['race'], axis=1).copy()
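Since those same-named parameters will come up shortly, here is a minimal sketch of what a monotonic constraint looks like in XGBoost's scikit-learn API. The choice of priors_count as the constrained feature is an illustrative assumption only, and y_train is assumed to come from the earlier train/test split. The constraint tuple must contain one value per column of X_train_con, where 1 enforces an increasing relationship with the target, -1 a decreasing one, and 0 leaves the feature unconstrained:

import xgboost as xgb

# One constraint value per column, in column order: here we assume,
# purely for illustration, that priors_count should relate
# monotonically (increasing) to predicted risk; all other features
# are left unconstrained (0)
monotone = tuple(
    1 if col == 'priors_count' else 0 for col in X_train_con.columns
)

# monotone_constraints is the same-named parameter across XGBoost,
# LightGBM, and CatBoost
xgb_con = xgb.XGBClassifier(monotone_constraints=monotone)
xgb_con.fit(X_train_con, y_train)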
Now, with race
out of the picture, the model, left to its own devices, may still exhibit some bias. However, the feature engineering we performed and the constraints we will place can help steer the model away from such biases, given the double standards we found in Chapter 7, Anchor and Counterfactual Explanations. That being said, the resulting model might perform worse against the test data. There are two reasons for this, outlined here:
- Loss of information: Race, especially through interaction with other features...