Why does a tensor object get a new memory location after repeating an already executed operation?

I'm performing a simple addition operation on two tensors.

Image-1

Image-2

Image-3

The screenshots above show that even after performing the same operation twice, a different memory location is allocated to the (identical) result each time. However, this is not the case with plain Python code:

Image-4
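Since the screenshots are not reproduced here, here is a minimal sketch of the plain-Python side of the comparison (the TensorFlow screenshots presumably call `id()` on the result of adding two tensors; that detail is an assumption):

```python
# Plain Python: CPython caches small integers (roughly -5..256), so the
# "same" arithmetic result is literally the same object each time.
x = 10
y = 20

r1 = x + y
r2 = x + y

print(id(r1) == id(r2))  # True: both names refer to the cached int 30
```

Note that this identity is a CPython implementation detail for small integers, not a general guarantee for Python objects.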

So, is this happening because TensorFlow generates a computational graph for every operation and does not persist the results, so that even when the same operation is executed twice it gives rise to two computational graphs, which eventually results in two different memory locations?
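A graph is not actually needed to explain the differing ids: in eager mode, each TensorFlow operation constructs and returns a new tensor object, much like any Python class whose `__add__` builds a fresh result. A plain-Python analogy (the `Tensorish` class below is a hypothetical stand-in, not TensorFlow API):

```python
class Tensorish:
    """Toy stand-in for a tensor: __add__ always builds a new object,
    analogous to an eager op returning a new result tensor each call."""
    def __init__(self, value):
        self.value = value

    def __add__(self, other):
        return Tensorish(self.value + other.value)

a = Tensorish(10)
b = Tensorish(20)

r1 = a + b
r2 = a + b

print(r1.value == r2.value)  # True: the numeric results are equal
print(id(r1) == id(r2))      # False: each call built a distinct object
```

In other words, differing `id()` values only show that each operation produced a new result object; object identity says nothing about whether a graph was built or whether results were cached.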



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
