As mentioned earlier, when calling a function decorated with tf.function for the first time, TensorFlow will create a graph corresponding to the function's operations. TensorFlow will then cache the graph so that the next time the function is called, graph creation will not be necessary.
To illustrate this, let's create a simple identity function:
@tf.function
def identity(x):
    print('Creating graph !')
    return x
This function will print a message every time TensorFlow creates a graph corresponding to its operations. In this case, since TensorFlow caches the graph, the message is printed only the first time the function is called:
x1 = tf.random.uniform((10, 10))
x2 = tf.random.uniform((10, 10))
result1 = identity(x1) # Prints 'Creating graph !'
result2 = identity(x2) # Nothing is printed
However, note that if we call the function with an input of a different type, TensorFlow will create a new graph:
x3 = tf.random.uniform((10, 10), dtype=tf.float16)
result3 = identity(x3) # Prints 'Creating graph !'
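If you want to confirm how many graphs have been traced, recent TensorFlow 2 releases expose a tracing counter on tf.function-decorated callables. The following is a small sketch assuming this method, experimental_get_tracing_count(), is available in your TensorFlow version:
# Sketch: experimental_get_tracing_count() exists on tf.function objects in
# recent TensorFlow 2 releases; it may be absent in older versions.
print(identity.experimental_get_tracing_count())  # 2: one trace for float32, one for float16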