Implementing frozen layers

We might want to restrict a training instance to certain layers. In other words, some layers can be kept frozen during that training instance, so that training focuses on optimizing the remaining layers while the frozen layers keep their parameters unchanged. We saw two ways of implementing frozen layers earlier: using the regular transfer learning builder and using the transfer learning helper. In this recipe, we will implement frozen layers for our transfer learning model.

How to do it...
- Define frozen layers by calling setFeatureExtractor(). The specified layer and all layers below it are frozen, so their parameters stay constant during training:

MultiLayerNetwork newModel = new TransferLearning.Builder(oldModel)
    .setFeatureExtractor(featurizeExtractionLayer)
    .build();
- Call fit() with the training data to start the training instance. Here, trainIterator is assumed to be the DataSetIterator that supplies the training data:

newModel.fit(trainIterator, numOfEpochs); // train the unfrozen layers for numOfEpochs epochs
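
Putting the steps together, the following is a minimal sketch of the frozen-layer workflow, together with the TransferLearningHelper alternative mentioned in the introduction. Names such as oldModel, featurizeExtractionLayer, trainIterator, frozenTill, and numOfEpochs are placeholders assumed to be defined elsewhere in the chapter, and the fine-tune configuration (an Adam updater with a 1e-4 learning rate) is only an illustrative choice, not a prescribed setup:

import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.deeplearning4j.nn.transferlearning.TransferLearningHelper;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;
import org.nd4j.linalg.learning.config.Adam;

public class FrozenLayersSketch {

    // Builder approach: freeze layers 0..featurizeExtractionLayer and fine-tune the rest
    public static MultiLayerNetwork buildFrozenModel(MultiLayerNetwork oldModel,
                                                     int featurizeExtractionLayer) {
        FineTuneConfiguration fineTuneConf = new FineTuneConfiguration.Builder()
                .updater(new Adam(1e-4)) // illustrative updater/learning rate (assumption)
                .build();

        return new TransferLearning.Builder(oldModel)
                .fineTuneConfiguration(fineTuneConf)
                .setFeatureExtractor(featurizeExtractionLayer) // layers up to this index stay frozen
                .build();
    }

    // Train only the unfrozen layers of the rebuilt model
    public static void train(MultiLayerNetwork newModel,
                             DataSetIterator trainIterator,
                             int numOfEpochs) {
        newModel.fit(trainIterator, numOfEpochs);
    }

    // Helper approach mentioned in the introduction: TransferLearningHelper freezes
    // layers 0..frozenTill and trains only the layers above them on featurized data
    public static MultiLayerNetwork trainWithHelper(MultiLayerNetwork oldModel,
                                                    int frozenTill,
                                                    DataSetIterator trainIterator) {
        TransferLearningHelper helper = new TransferLearningHelper(oldModel, frozenTill);
        while (trainIterator.hasNext()) {
            DataSet featurized = helper.featurize(trainIterator.next()); // forward pass through the frozen layers
            helper.fitFeaturized(featurized);                            // update only the unfrozen layers
        }
        return helper.unfrozenMLN(); // the trainable portion of the network
    }
}

With the builder, the frozen layers keep their original parameters inside the rebuilt model and are skipped during backpropagation; with the helper, the frozen part of the network is run once per batch via featurize(), and only the unfrozen layers are trained by fitFeaturized().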