Running Local Serving
A prerequisite to serving a model is serializing the model structure and its assets, such as the weight and bias matrices. A trained TensorFlow model is typically saved in the SavedModel format, which contains the complete TensorFlow program, including the weights, biases, and computation ops. Saving is done through the low-level tf.saved_model API.
Typically, when you execute a model training process using the fit() method, you end up with something like this:
mdl.fit(
    train_dataset,
    epochs=5,
    steps_per_epoch=steps_per_epoch,
    validation_data=valid_dataset,
    validation_steps=validation_steps)
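For context, mdl, train_dataset, valid_dataset, steps_per_epoch, and validation_steps might be defined along the following lines. This is only a minimal sketch; the model architecture and the synthetic data pipeline here are assumptions, not part of the original example:

import tensorflow as tf

# Hypothetical regression model standing in for mdl.
mdl = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(10,)),
    tf.keras.layers.Dense(1)
])
mdl.compile(optimizer='adam', loss='mse')

# Synthetic tf.data pipelines standing in for the real datasets.
train_dataset = tf.data.Dataset.from_tensor_slices(
    (tf.random.normal([320, 10]), tf.random.normal([320, 1]))).batch(32).repeat()
valid_dataset = tf.data.Dataset.from_tensor_slices(
    (tf.random.normal([64, 10]), tf.random.normal([64, 1]))).batch(32).repeat()

steps_per_epoch = 10
validation_steps = 2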
After you've executed the preceding code, you have a model object, mdl, that can be saved via the following syntax:
saved_model_path = ''
tf.saved_model.save(mdl, saved_model_path)
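Once the save completes, a quick way to verify the artifact locally, before standing up a full serving stack, is to load it back with tf.saved_model.load and invoke the default serving signature. The input shape below is an assumption and should match whatever your model expects:

# Sketch of a local sanity check on the exported SavedModel.
loaded = tf.saved_model.load(saved_model_path)
print(list(loaded.signatures.keys()))   # usually includes 'serving_default'

infer = loaded.signatures['serving_default']
example_input = tf.random.normal([1, 10])   # hypothetical input shape
print(infer(example_input))                 # returns a dict of output tensors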
If you take a look at the current directory, you will find a saved_model...