How to create a constant node in TensorFlow such that it is "frozen" during training?
My model is given by a computational graph in which a 2D tensor of features and a 1D tensor of targets are represented by placeholders, and the model parameters are represented by nodes created with get_variable.
Running an optimizer on my data set updates these model parameters.
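For context, here is a rough sketch of the setup I have in mind (TF 1.x style; shapes, names, and the loss are just illustrative):

```python
import tensorflow as tf

# Placeholders: 2D features and 1D targets, as described above.
features = tf.placeholder(tf.float32, shape=[None, 10], name="features")
targets = tf.placeholder(tf.float32, shape=[None], name="targets")

# Trainable model parameters created with get_variable.
w = tf.get_variable("w", shape=[10], initializer=tf.zeros_initializer())
b = tf.get_variable("b", shape=[], initializer=tf.zeros_initializer())

# A simple linear model and loss; the optimizer updates w and b.
predictions = tf.tensordot(features, w, axes=1) + b
loss = tf.reduce_mean(tf.square(predictions - targets))
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(loss)
```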
Now I need to specify a computational graph (a function) in which some constants are not model parameters, meaning that they should not be changed during training. What is the proper way to do this?
Should I declare those nodes as placeholders and then explicitly supply their values in feed_dict during training? Or should I use get_variable and somehow tell TensorFlow that the created variable must stay frozen during training? A sketch of the options I am considering follows below.
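To make the question concrete, here are the three approaches I can think of (values and names are made up):

```python
import tensorflow as tf

# Option 1: a true constant baked into the graph; it can never change.
scale = tf.constant(2.5, dtype=tf.float32, name="scale")

# Option 2: a variable excluded from training via trainable=False.
# Optimizer.minimize() defaults to the TRAINABLE_VARIABLES collection,
# so it never touches this node, but its value can still be assigned later.
offset = tf.get_variable("offset", shape=[], trainable=False,
                         initializer=tf.constant_initializer(1.0))

# Option 3: a placeholder, which must be fed on every session.run call.
factor = tf.placeholder(tf.float32, shape=[], name="factor")
```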
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
