Why is the value of a variable inside a TensorFlow graph not frozen?

I'm trying to save and load the graph of a TensorFlow module that contains a tf.Variable as an internal variable.

Here is the code:

import tensorflow as tf

class MyModule(tf.Module):
    def __init__(self, v):
        self.v = v

    @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.int32), tf.TensorSpec(shape=None, dtype=tf.int32)])
    def __call__(self, x, v):
        self.v.assign( x * v, read_value=False )
        return self.v

x = tf.constant( tf.random.uniform(shape=[2,1], maxval=3, dtype=tf.int32) )
v = tf.Variable([[1], [2]])
module = MyModule(v)


#############################################
x = tf.constant( tf.random.uniform(shape=[3,1], maxval=3, dtype=tf.int32) )
v = tf.Variable([[1], [2], [3]])
module = MyModule(v)

tf.saved_model.save(module, "module")

imported = tf.saved_model.load("module")

x = tf.constant([80,0,20,24,321])
v = tf.Variable(3*tf.ones_like(x), trainable=False)

result = imported(x,v)
print(result)

The output is: tf.Tensor([240 0 60 72 963], shape=(5,), dtype=int32)

My question is the following: given that the graph has been saved, why can the value of the variable self.v still be changed? Isn't it supposed to be frozen?
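For what it's worth, the behavior can be observed even more directly by inspecting the restored variable before and after calling the loaded module. This is a minimal sketch (the Scaler class, the saved_scaler directory, and the single-argument signature are assumptions for illustration, not the code above): a SavedModel freezes the traced graph structure, but the variables it captures are restored as mutable resource variables, so an assign op inside the graph still updates them.

```python
import tensorflow as tf

# Hypothetical minimal module: the assign op is part of the traced graph.
class Scaler(tf.Module):
    def __init__(self):
        self.v = tf.Variable([1, 2, 3])

    @tf.function(input_signature=[tf.TensorSpec(shape=None, dtype=tf.int32)])
    def __call__(self, x):
        # This assign is baked into the graph, but it mutates the
        # restored variable's *state*, which is not frozen.
        self.v.assign(x * self.v, read_value=False)
        return self.v

m = Scaler()
tf.saved_model.save(m, "saved_scaler")

imported = tf.saved_model.load("saved_scaler")
before = imported.v.numpy().tolist()   # value restored from the checkpoint
imported(tf.constant([2, 2, 2]))       # runs the graph, including the assign
after = imported.v.numpy().tolist()    # the restored variable has changed
```

Here before is [1, 2, 3] and after is [2, 4, 6]: saving serializes the graph and a snapshot of the variable values, not an immutable constant.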



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
