Performing Automatic Model Tuning with the SageMaker XGBoost built-in algorithm
Hyperparameters are properties of a machine learning algorithm that influence how the algorithm works and behaves. Unlike parameters, they are not learned or modified by the algorithm during the training step, and this key characteristic is what distinguishes the two: hyperparameters must be specified before a training job starts, while a model's parameters are learned from the training data as the training step runs. Hyperparameter optimization is the process of searching for the combination of hyperparameter values that produces the best model.
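To make the distinction concrete, here is a minimal toy example (not SageMaker-specific): fitting a one-weight linear model with gradient descent. The learning rate and number of epochs are hyperparameters fixed before training starts; the weight `w` is a parameter learned from the data.

```python
# Toy sketch: hyperparameters vs. parameters.
# learning_rate and epochs are hyperparameters: chosen before training begins.
# w is a parameter: learned from the training data during training.

def train(xs, ys, learning_rate, epochs):
    w = 0.0  # parameter, updated by the algorithm
    for _ in range(epochs):
        # Gradient of mean squared error of y_hat = w * x with respect to w
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= learning_rate * grad
    return w

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # true relationship: y = 2x
w = train(xs, ys, learning_rate=0.05, epochs=200)
print(round(w, 3))  # → 2.0, the learned parameter value
```

Changing the hyperparameters (for example, a much larger learning rate) changes how training behaves, even though the algorithm and data stay the same.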
In a nutshell, Automatic Model Tuning runs multiple training jobs with different hyperparameter configurations to find the "best" version of a model.
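The search loop behind this idea can be sketched, in miniature, as a plain grid search in Python. This is only an illustration of the concept, not how SageMaker implements it (its tuner supports strategies such as Bayesian and random search); `run_training_job` and its score formula are hypothetical stand-ins for a real train-and-evaluate cycle, and the `eta`/`max_depth` values are illustrative.

```python
import itertools

# Hypothetical stand-in for a training job: returns a validation score for a
# given hyperparameter configuration. The formula below is made up purely to
# illustrate the loop; a real job would train and evaluate an actual model.
def run_training_job(eta, max_depth):
    return 1.0 - (eta - 0.3) ** 2 - 0.01 * (max_depth - 6) ** 2

search_space = {
    "eta": [0.1, 0.2, 0.3, 0.4],  # illustrative XGBoost hyperparameter values
    "max_depth": [4, 6, 8],
}

best_config, best_score = None, float("-inf")
for eta, max_depth in itertools.product(search_space["eta"],
                                        search_space["max_depth"]):
    score = run_training_job(eta, max_depth)  # one "training job" per config
    if score > best_score:                    # keep the best objective metric
        best_config, best_score = (eta, max_depth), score

print(best_config)  # → (0.3, 6)
```

Each configuration corresponds to one training job, and the configuration yielding the best objective metric wins.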
Note
In this case, the best model is the one that yields the best objective metric value. This objective metric depends on the problem being solved...
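With the SageMaker Python SDK, this is typically expressed through a `HyperparameterTuner` configuration. The following is a non-runnable sketch under stated assumptions: it requires an AWS environment, and `estimator`, `train_input`, and `validation_input` are hypothetical objects assumed to have been created earlier for an XGBoost built-in algorithm training setup.

```python
# Sketch only: assumes an AWS session plus a previously configured XGBoost
# estimator and input channels (estimator, train_input, validation_input).
from sagemaker.tuner import (HyperparameterTuner,
                             ContinuousParameter,
                             IntegerParameter)

tuner = HyperparameterTuner(
    estimator=estimator,                     # hypothetical XGBoost estimator
    objective_metric_name="validation:auc",  # the objective metric to optimize
    objective_type="Maximize",
    hyperparameter_ranges={
        "eta": ContinuousParameter(0.01, 0.5),
        "max_depth": IntegerParameter(3, 10),
    },
    max_jobs=10,           # total training jobs to run
    max_parallel_jobs=2,   # training jobs running at the same time
)

tuner.fit({"train": train_input, "validation": validation_input})
```

Here the tuner launches up to `max_jobs` training jobs and reports the one whose `validation:auc` value is highest as the best model.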