Evaluating trained models
In this section, we will take the model trained in the previous section and launch a batch inference job on the test data. The first step is to load the test data into a Jupyter Notebook:
from io import BytesIO

import numpy as np
from tensorflow.python.lib.io import file_io

# Cloud Storage location where the test splits were saved earlier
dest = 'gs://data-bucket-417812395597/'

# Read the serialized NumPy arrays from GCS and deserialize them
test_x = np.load(BytesIO(file_io.read_file_to_string(dest + 'test_x', binary_mode=True)))
test_y = np.load(BytesIO(file_io.read_file_to_string(dest + 'test_y', binary_mode=True)))

print(test_x.shape, test_y.shape)
The next step is to create a JSON Lines (JSONL) file of instances from our test data and save it to a Cloud Storage location. The batch inference job can then read these instances and perform inference on them:
import json

BATCH_PREDICTION_INSTANCES_FILE = "batch_prediction_instances.jsonl"
BATCH_PREDICTION_GCS_SOURCE = (
    ...
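The snippet above is truncated in the original. As a rough guide only, here is a minimal sketch of how this step might continue, assuming the same gs://data-bucket-417812395597/ bucket from the earlier cell, the google-cloud-storage client library for the upload, and the test_x array loaded above; the exact per-line schema of each instance depends on the model's serving signature:

import json
from google.cloud import storage  # assumed dependency, not shown in the original

BATCH_PREDICTION_INSTANCES_FILE = "batch_prediction_instances.jsonl"
BUCKET_NAME = "data-bucket-417812395597"  # bucket from the earlier cell

# Hypothetical completion of the truncated assignment above
BATCH_PREDICTION_GCS_SOURCE = (
    f"gs://{BUCKET_NAME}/{BATCH_PREDICTION_INSTANCES_FILE}"
)

# Write one JSON object per line (JSONL). Here each instance is the raw
# feature array; adjust this to match the model's expected input format.
with open(BATCH_PREDICTION_INSTANCES_FILE, "w") as f:
    for instance in test_x:
        f.write(json.dumps(instance.tolist()) + "\n")

# Upload the JSONL file to Cloud Storage so the batch job can read it
blob = storage.Client().bucket(BUCKET_NAME).blob(BATCH_PREDICTION_INSTANCES_FILE)
blob.upload_from_filename(BATCH_PREDICTION_INSTANCES_FILE)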