Tuning hyperparameters
As you must have observed by now, the performance of a neural network depends a great deal on its hyperparameters, so it is important to understand how these parameters affect the network. Common examples of hyperparameters are the learning rate, the regularizer and its regularization coefficient, the dimensions of the hidden layers, the initial weight values, and even the optimizer chosen to update the weights and biases.
How to do it...
Here is how we proceed with the recipe:
- The first step in tuning hyperparameters is to build the model. Build the model in TensorFlow exactly as we have been doing.
- Add a way to save the model in model_file. In TensorFlow, this can be done with a tf.train.Saver object: call its save() method on the session to write the variables to disk.
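The two steps above can be sketched as follows. This is a minimal illustration, not the book's exact model: the toy linear layer, its shapes, and the checkpoint name model_file are assumptions, and the tf.compat.v1 import is used so the TensorFlow 1.x-style code runs under TensorFlow 2.

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Step 1: build the model as usual -- here, an illustrative linear layer.
x = tf.placeholder(tf.float32, shape=[None, 4], name="x")
w = tf.Variable(tf.random_normal([4, 1]), name="w")
b = tf.Variable(tf.zeros([1]), name="b")
y_hat = tf.matmul(x, w) + b

# Step 2: create a Saver so the variables can be written to model_file.
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # ... training would happen here ...
    save_path = saver.save(sess, "./model_file")  # writes checkpoint files
    print(save_path)
```

The checkpoint written here can later be restored with saver.restore(), so each hyperparameter trial can be saved and compared.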