Freezing layers
Sometimes (for example, when using pretrained networks), it is desirable to freeze some of the layers. We can do this when we're sure that certain layers (most of the time the first couple of layers, also known as the bottom of the network) have proven to be of value as feature extractors. In the following recipe, we will demonstrate how to freeze a part of the network after training and only train the remaining subset of the network.
How to do it...
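Before turning to TensorFlow, the idea of freezing can be illustrated in plain Python: during an update step, we simply skip the gradient update for frozen parameters, so they keep their trained values while the rest of the network continues to learn. The toy two-weight model below is purely illustrative (the names `w1`, `w2`, and `train_step` are hypothetical, not part of the recipe's code):

```python
# Toy model y = w2 * (w1 * x): w1 plays the role of a frozen bottom
# layer, w2 the trainable top layer. Loss is 0.5 * (y - y_true)**2.
def train_step(w1, w2, x, y_true, lr=0.1, freeze_w1=True):
    # Forward pass
    h = w1 * x
    y = w2 * h
    err = y - y_true
    # Gradients of the squared-error loss with respect to each weight
    grad_w2 = err * h
    grad_w1 = err * w2 * x
    # Freezing a layer simply means skipping its parameter update
    if not freeze_w1:
        w1 = w1 - lr * grad_w1
    w2 = w2 - lr * grad_w2
    return w1, w2

w1, w2 = 0.5, 0.5
new_w1, new_w2 = train_step(w1, w2, x=1.0, y_true=1.0)
print(new_w1 == w1)  # True: the frozen weight is untouched
print(new_w2 != w2)  # True: the trainable weight was updated
```

In TensorFlow this same effect is achieved by passing only the non-frozen variables to the optimizer, as the rest of the recipe shows.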
- First, we load all libraries as follows:
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data
- In TensorFlow, it's straightforward to load the MNIST dataset:
mnist = input_data.read_data_sets('Data/mnist', one_hot=True)
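The `one_hot` argument controls how labels are encoded: instead of an integer class index, each label becomes a vector with a single 1.0 at the position of the class. A minimal sketch of what this encoding does (the `one_hot` helper below is a hypothetical re-implementation for illustration, not part of the TensorFlow API):

```python
# MNIST has 10 classes (digits 0-9); a one-hot label is a length-10
# vector that is 1.0 at the class index and 0.0 everywhere else.
def one_hot(label, num_classes=10):
    vec = [0.0] * num_classes
    vec[label] = 1.0
    return vec

print(one_hot(3))
# [0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```

This encoding matches the shape expected by a softmax output layer with one unit per class.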