TensorFlow: What is the effect of calling tf.random.set_seed() twice, where the second function call is passed a hard-coded value?

I'm using someone else's code base, and in one spot (early on in execution) the TensorFlow seed is set via tf.random.set_seed(seed), where seed is provided via a command-line argument. But then a bit later in execution, they set it again with tf.random.set_seed(0).
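
For context, the pattern looks roughly like this (the flag name and surrounding structure are my own illustration; only the two tf.random.set_seed calls mirror what the code base actually does):

import argparse

import tensorflow as tf

parser = argparse.ArgumentParser()
parser.add_argument("--seed", type=int, default=42)
args = parser.parse_args()

# Early in execution: the global seed comes from the command line.
tf.random.set_seed(args.seed)

# ... model and data setup ...

# A bit later: the seed is set again with a hard-coded constant.
tf.random.set_seed(0)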

What is the effect of setting the seed a second time with a hard-coded constant?

Does this mean that everything that happens after the second call will be identical across runs, even when different seeds are passed on the command line?



Solution 1:[1]

I realized that checking this myself would be faster than waiting for an answer. For anyone else wondering: yes, it does.

import tensorflow as tf

for seed in range(3):
    print(seed)
    # Global seed taken from the loop variable: this first tensor
    # differs on each iteration.
    tf.random.set_seed(seed)
    print(tf.random.uniform(shape=(3, 2)))
    # Global seed reset to a hard-coded constant: this second tensor
    # is identical on every iteration, regardless of the loop seed.
    tf.random.set_seed(0)
    print(tf.random.uniform(shape=(3, 2)))

The second printed tensor is identical on every iteration, regardless of the loop seed, because tf.random.set_seed(0) resets the global random state immediately before it is generated.
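
If it helps, a quick way to confirm this is to compare the post-reset tensors across iterations. This is just a sketch assuming eager execution (the TF2 default), with NumPy used only for the comparison:

import numpy as np
import tensorflow as tf

baseline = None
for seed in range(3):
    tf.random.set_seed(seed)
    _ = tf.random.uniform(shape=(3, 2))   # varies with `seed`
    tf.random.set_seed(0)
    t = tf.random.uniform(shape=(3, 2))   # should be identical every time
    if baseline is None:
        baseline = t.numpy()
    else:
        # Raises if the post-reset tensor ever differs from the first one.
        np.testing.assert_array_equal(baseline, t.numpy())
print("Tensors drawn after tf.random.set_seed(0) are identical across seeds.")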

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
[1] Solution 1: Rylan Schaeffer