In TensorFlow, the most versatile method for freezing layers consists of removing their tf.Variable instances from the list of variables passed to the optimizer:
# For instance, we want to freeze the model's layers with "conv" in their name,
# so we keep only the variables whose name does not contain "conv":
vars_to_train = model.trainable_variables
vars_to_train = [v for v in vars_to_train if "conv" not in v.name]
# Applying the optimizer to the remaining model's variables
# (the gradients being computed w.r.t. this same variable list):
optimizer.apply_gradients(zip(gradients, vars_to_train))
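To make the idea concrete, here is a minimal, self-contained sketch of a full training step built around this selective update; the model architecture, layer names, and loss below are illustrative choices, not fixed by the method:

```python
import tensorflow as tf

# A small illustrative model with named "conv" and "dense" layers:
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8, 8, 1)),
    tf.keras.layers.Conv2D(4, 3, name="conv_1"),
    tf.keras.layers.Flatten(name="flatten"),
    tf.keras.layers.Dense(2, name="dense_out"),
])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

# Keeping only the variables NOT belonging to "conv" layers:
vars_to_train = [v for v in model.trainable_variables
                 if "conv" not in v.name]

@tf.function
def train_step(images, labels):
    with tf.GradientTape() as tape:
        logits = model(images, training=True)
        loss = tf.reduce_mean(tf.keras.losses.sparse_categorical_crossentropy(
            labels, logits, from_logits=True))
    # Gradients are computed w.r.t. the filtered variable list only,
    # so the "conv" layers stay frozen during the update:
    gradients = tape.gradient(loss, vars_to_train)
    optimizer.apply_gradients(zip(gradients, vars_to_train))
    return loss
```

Because the frozen variables never appear in the `tape.gradient(...)` call nor in `apply_gradients(...)`, their values are untouched by training, while the rest of the model is optimized normally.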
In Keras, layers have a .trainable attribute, which can simply be set to False in order to freeze them:
for layer in feature_extractor_model.layers:
    layer.trainable = False  # freezing the complete extractor
Again, for complete transfer learning examples, we invite you to go through the Jupyter notebooks.