Submitting tuning jobs in a local environment
Since the hyperparameter tuning process is inherently time-consuming, it is more practical to run it from a script rather than in a notebook environment. Although a tuning run is essentially a series of model training jobs, the tuner API and search workflow require the code to be refactored in a particular way. The most obvious change is that the model structure must be wrapped in a function (in our example, a function named model_builder) that accepts a hyperparameter object, and the hyperparameter values declared on that object are then referenced inside the model structure.
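The following sketch illustrates this pattern with Keras Tuner. The layer sizes, hyperparameter names, and input shape are illustrative only and are not the repository's actual values:

```python
import tensorflow as tf
import keras_tuner as kt

def model_builder(hp):
    """Builds a compiled Keras model from a HyperParameters object."""
    # Hyperparameters are declared on `hp` and referenced directly
    # inside the model structure (values here are illustrative).
    units = hp.Int('units', min_value=32, max_value=512, step=32)
    learning_rate = hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(units=units, activation='relu'),
        tf.keras.layers.Dense(10)
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=learning_rate),
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=['accuracy'])
    return model

# The function is passed to a tuner, which calls it once per trial, e.g.:
# tuner = kt.Hyperband(model_builder, objective='val_accuracy', max_epochs=10)
```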
You may find the code and instructions in the GitHub repository: https://github.com/PacktPublishing/learn-tensorflow-enterprise/blob/master/chapter_06/localtuningwork
With the help of the following code, we will set up user inputs, or flags, and assign default values to these flags where necessary. Let's take a quick look at how user inputs may be handled.
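As a rough illustration of what such flag handling can look like in a training script, here is a minimal sketch using absl.flags; the flag names and defaults are hypothetical, and the repository's actual script may define its inputs differently:

```python
# Minimal sketch of command-line flag handling for a local tuning script.
# Flag names and defaults are illustrative, not the repository's actual values.
from absl import app, flags

FLAGS = flags.FLAGS

flags.DEFINE_string('model_dir', '/tmp/tuning', 'Directory for checkpoints and tuner results.')
flags.DEFINE_integer('max_trials', 10, 'Maximum number of hyperparameter trials to run.')
flags.DEFINE_integer('epochs', 5, 'Training epochs per trial.')

def main(argv):
    del argv  # Unused positional arguments.
    print('Writing tuning results to', FLAGS.model_dir)
    # ... build the tuner here and call tuner.search() with FLAGS.epochs ...

if __name__ == '__main__':
    app.run(main)
```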