There are two ways of storing a trained neural network for future use and restoring it later. We will see both of them in action in the convolutional neural network example.
The first one is the Saver class, which lives in tf.train. It is created with the following statement:
saver = tf.train.Saver(max_to_keep=10)
And then each training step can be saved with:
saver.save(sess, './classifier', global_step=step)
Here the full graph is saved, although it is possible to save only part of it. In this case we save everything, keep only the last 10 checkpoints, and suffix each checkpoint name with the step we are at.
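Putting these pieces together, a minimal sketch of a training loop with periodic checkpoints could look as follows (the toy model, its variable name, and the step count are only placeholders for illustration):

import tensorflow as tf

# Toy model: a single variable pushed towards 3.0 by gradient descent
x = tf.Variable(0.0, name='x')
loss = tf.square(x - 3.0)
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

saver = tf.train.Saver(max_to_keep=10)  # keep only the 10 most recent checkpoints

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(100):
        sess.run(train_op)
        if step % 10 == 0:
            # Writes classifier-<step>.meta, .index and .data files
            saver.save(sess, './classifier', global_step=step)
    # Save the final state under a fixed name
    saver.save(sess, './classifier-final')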
Let's say that we saved the final training state with saver.save(sess, './classifier-final'). To restore it, we first have to recreate the graph structure with:
new_saver = tf.train.import_meta_graph("classifier-final.meta")
This didn't restore the variable...
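Indeed, import_meta_graph only rebuilds the graph structure; the variable values still have to be loaded from a checkpoint with restore(). A minimal sketch of one way to complete the restore, continuing the toy example above (the variable name 'x' and the checkpoint directory are assumptions):

import tensorflow as tf

with tf.Session() as sess:
    # Rebuild the graph structure from the saved meta graph
    new_saver = tf.train.import_meta_graph("classifier-final.meta")
    # Load the variable values from the most recent checkpoint in this directory
    new_saver.restore(sess, tf.train.latest_checkpoint('./'))
    # The restored variables can now be fetched from the default graph
    x = tf.get_default_graph().get_tensor_by_name('x:0')
    print(sess.run(x))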