As we mentioned in the Automatic hyperparameter tuning section, SageMaker provides a library for smart parameter tuning that is driven by Bayesian optimization. In this section, we will demonstrate how to further tune the model we created in Chapter 4, Predicting User Behavior with Tree-Based Methods. Recall that in that chapter we posed a binary classification problem: predicting whether a user would click on an advertisement. We used an xgboost model, but at that point we hadn't performed any parameter tuning.
We will start by creating the SageMaker session and choosing xgboost:
import boto3
import sagemaker
from sagemaker import get_execution_role

# Create a SageMaker session and obtain the IAM role attached to
# this notebook instance.
sess = sagemaker.Session()
role = get_execution_role()

# Look up the URI of the built-in xgboost container image
# (adjust the region to match the one you are working in).
container = sagemaker.amazon.amazon_estimator.get_image_uri('us-east-1', "xgboost", "latest")
s3_validation_data...
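Before wiring up the SageMaker tuner itself, it may help to see the loop it automates. The toy sketch below (pure Python, no AWS calls; the scoring function, parameter names, and the nearest-neighbor surrogate are all illustrative stand-ins, not SageMaker internals) mimics the Bayesian-optimization idea: fit a cheap surrogate to the trials run so far, then pick the next hyperparameter candidate by balancing the surrogate's predicted score against its uncertainty.

```python
import math
import random

random.seed(0)

def validation_auc(eta, max_depth):
    # Stand-in for launching a real training job and reading its
    # validation metric; the "best" region is purely made up.
    return 1.0 - (eta - 0.1) ** 2 - 0.001 * (max_depth - 6) ** 2

def suggest(trials, candidates, kappa=1.0):
    # Crude surrogate: average the scores of the two nearest past
    # trials, plus an uncertainty bonus that grows with distance
    # from observed points (a stand-in for the Gaussian-process
    # surrogate and acquisition function a real optimizer uses).
    def surrogate(c):
        nearest = sorted((math.dist(c, p), s) for p, s in trials)[:2]
        mean = sum(s for _, s in nearest) / len(nearest)
        bonus = kappa * nearest[0][0]  # farther from data = less certain
        return mean + bonus
    return max(candidates, key=surrogate)

# A discrete candidate grid over two xgboost-style hyperparameters.
space = [(round(random.uniform(0.01, 0.5), 3), random.randint(3, 10))
         for _ in range(200)]

# Seed with two random trials, then let the surrogate pick the rest.
trials = [(p, validation_auc(*p)) for p in random.sample(space, 2)]
for _ in range(10):
    p = suggest(trials, space)
    trials.append((p, validation_auc(*p)))

best_params, best_score = max(trials, key=lambda t: t[1])
print(best_params, round(best_score, 4))
```

SageMaker's tuner performs this select-train-observe loop for you, except each "trial" is a full training job and the surrogate is maintained by the service.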