In TensorFlow 1, variables were created globally. Each variable had a unique name, and the recommended way to create one was with tf1.get_variable():
weights = tf1.get_variable(name='W', initializer=[3])  # registers a graph variable named 'W', initialized to [3]
Here, we created a global variable named W. Deleting the Python weights variable (with del weights, for instance) would have no effect on the variable stored by TensorFlow. In fact, if we then tried to create a variable with the same name, we would get the following error:
Variable W already exists, disallowed. Did you mean to set reuse=True or reuse=tf.AUTO_REUSE in VarScope?
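For illustration, this behavior can be reproduced with the following minimal sketch (assuming tf1 is the usual alias for the TensorFlow 1 API, for instance import tensorflow.compat.v1 as tf1):
del weights  # removes the Python reference only; the variable 'W' still exists in the graph
try:
    # A variable named 'W' is already registered, so this raises a ValueError
    weights = tf1.get_variable(name='W', initializer=[3])
except ValueError as error:
    print(error)  # Variable W already exists, disallowed. ...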
While tf1.get_variable() allows you to reuse variables, its default behavior is to throw an error if a variable with the chosen name already exists, preventing you from accidentally overwriting variables. To reuse a variable explicitly, we wrap its creation in a tf1.variable_scope(...) block and set the reuse argument:
with tf1.variable_scope("conv1", reuse=True):
    # With reuse=True, get_variable() returns the existing variable instead of creating a new one
    weights = tf1.get_variable(name='W')
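As the error message suggests, reuse=tf.AUTO_REUSE is another option: tf1.get_variable() then creates the variable if it does not exist yet and returns the existing one otherwise. A minimal sketch of this variant (using the same illustrative scope and variable names as above):
with tf1.variable_scope("conv1", reuse=tf1.AUTO_REUSE):
    # Created on the first call, fetched again on any subsequent call
    weights = tf1.get_variable(name='W', initializer=[3])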