'"ValueError: No gradients provided for any variable: ['Variable:0']." in Tensorflow2.0
I want update a variable parameter "mus" by defining a loss "loss_mu" and use optimizer.adam to optimize it, I encountered a issue : "ValueError: No gradients provided for any variable: ['Variable:0']."
```python
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

accs = []
max_acc = 0.9
loss_mu = 0
M = 6
sigma = 0.25
optim_mus = tf.keras.optimizers.Adam(lr=0.05)
mus = tf.Variable(tf.convert_to_tensor(np.concatenate([np.ones([6, ]), np.zeros([6, ])])),
                  dtype=tf.float64)
dist = tfp.distributions.MultivariateNormalDiag(mus, tf.cast(np.ones(2 * M) * sigma, dtype=tf.float64))
thetas = dist.sample((4,))
for i in range(4):
    # dict_m and self.B are defined elsewhere in my code
    max_acc = dict_m['Max_acc{}'.format(i)]
    acc = dict_m['acc{}'.format(i)]
    accs += acc
    loss_mu -= dist.log_prob(thetas[i]) * (max_acc - np.mean(accs)) / (np.std(accs) + np.finfo(np.float32).eps.item())
loss_mu = loss_mu / self.B
with tf.GradientTape() as Tape:
    grad = Tape.gradient(loss_mu, [mus])
optim_mus.apply_gradients(zip(grad, [mus]))
```
When I print `grad`, it is `[None]`. I am new to TensorFlow 2.0 and don't know how to fix this.
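The gradient comes back as `[None]` because the loss is computed before the `GradientTape` context is entered, so the tape never records any operation involving `mus`; by the time `Tape.gradient` runs, `loss_mu` is just a constant with no recorded dependence on the variable. Below is a minimal sketch of the usual fix, which moves the `log_prob` calls inside the tape. `rewards` and `B` here are hypothetical stand-ins for the `dict_m`-based terms and `self.B`, which are not shown in the question:

```python
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

M = 6
sigma = 0.25
B = 4                        # hypothetical stand-in for self.B
rewards = np.random.rand(4)  # hypothetical stand-ins for the dict_m-based reward terms

optim_mus = tf.keras.optimizers.Adam(learning_rate=0.05)
mus = tf.Variable(np.concatenate([np.ones(M), np.zeros(M)]), dtype=tf.float64)
scale = tf.cast(np.ones(2 * M) * sigma, dtype=tf.float64)

dist = tfp.distributions.MultivariateNormalDiag(mus, scale)
thetas = dist.sample((4,))   # sampled once, outside the tape: treated as constants

with tf.GradientTape() as tape:
    loss_mu = tf.constant(0.0, dtype=tf.float64)
    for i in range(4):
        # log_prob runs inside the tape, so its dependence on mus is recorded
        loss_mu -= dist.log_prob(thetas[i]) * rewards[i]
    loss_mu = loss_mu / B

grad = tape.gradient(loss_mu, [mus])   # now a real tensor, not [None]
optim_mus.apply_gradients(zip(grad, [mus]))
```

The key point is that `tf.GradientTape` only records TensorFlow operations executed while the tape is open; anything computed beforehand is a constant as far as the tape is concerned. The NumPy baseline terms (`np.mean`, `np.std`) are fine as constants, but every operation on the path from `mus` to the loss must be a TF op run inside the tape. Sampling `thetas` outside the tape is deliberate in this kind of score-function (REINFORCE-style) estimator: the samples are held fixed and the gradient flows only through `log_prob`.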
Sources
Source: Stack Overflow, licensed under CC BY-SA 3.0 per Stack Overflow's attribution requirements.
