The model generated by tfjs-converter can finally be loaded by TensorFlow.js, which provides dedicated APIs for each model format: loadGraphModel and loadLayersModel. If the model file was created from a SavedModel, use loadGraphModel; if the original model is a Keras model, use loadLayersModel. Both APIs can load a model over HTTP as well as from the local filesystem:
import * as tf from '@tensorflow/tfjs';

const MODEL_URL = 'https://path/to/model.json';
const model = await tf.loadGraphModel(MODEL_URL);
// Or, from the local filesystem (in Node.js, where @tensorflow/tfjs-node registers the file:// handler)
const MODEL_PATH = 'file://path/to/model.json';
const model = await tf.loadGraphModel(MODEL_PATH);
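Loading a model converted from Keras follows the same pattern with loadLayersModel; the snippet below is a minimal sketch, and the URL is only a placeholder:

import * as tf from '@tensorflow/tfjs';

// Placeholder URL pointing at the model.json produced by tfjs-converter
// for a Keras model.
const KERAS_MODEL_URL = 'https://path/to/keras-model/model.json';
const kerasModel = await tf.loadLayersModel(KERAS_MODEL_URL);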
Model loading is performed asynchronously so that loading a large model does not block the main thread. While browsers usually support...