ValueError: No gradients provided for any variable in semi/self-supervised loss function

I am training a neural network for clustering applications in a semi/self-supervised way: instead of using ground-truth labels, I define the loss function as the similarity among the data points assigned to the same cluster, like:

import numpy as np
import tensorflow as tf
from scipy.spatial.distance import pdist

def loss_function(self, y_true, y_pred):
    # y_true is unused: there is no ground truth in this setting.
    def get_loss(x_input, y_input):
        similarity = 0.0
        # Sum pairwise distances among points assigned to the same cluster.
        for label in np.unique(y_input):
            similarity += pdist(x_input[y_input == label]).sum()
        # Cast to float32 to match the dtype declared to tf.numpy_function.
        return np.float32(similarity)
    score = tf.numpy_function(get_loss, [self.x_input, y_pred], tf.float32)
    return score

In calculating the loss, I don't use y_true; instead, I use self.x_input, which holds the original data points.
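For reference, here is a minimal sketch of the same intra-cluster pairwise-distance sum expressed in pure TensorFlow ops rather than NumPy (this is an illustration of the intended computation, assuming the labels are hard integer cluster assignments; the function name and signature are hypothetical):

```python
import tensorflow as tf

def intra_cluster_distance_sum(x, labels):
    """Sum of pairwise Euclidean distances within each cluster.

    x: (n, d) float tensor of data points.
    labels: (n,) integer tensor of hard cluster assignments.
    """
    # Pairwise squared Euclidean distances between all rows of x.
    sq = tf.reduce_sum(tf.square(x[:, None, :] - x[None, :, :]), axis=-1)
    dist = tf.sqrt(sq + 1e-12)  # small epsilon keeps the gradient finite at 0
    # Mask is 1 where two points share a cluster label, 0 otherwise.
    same = tf.cast(labels[:, None] == labels[None, :], x.dtype)
    # The matrix counts every unordered pair twice (and the zero
    # self-distances once each), so halve the masked sum.
    return tf.reduce_sum(dist * same) / 2.0
```

Because this version uses only TensorFlow ops on tensors, it is differentiable with respect to x, unlike a computation routed through tf.numpy_function.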

I'm getting the following error while running my code:

raise ValueError(f"No gradients provided for any variable: {variable}. "
ValueError: No gradients provided for any variable

So my question is: is it possible to train a neural-network model this way (without ground truth)? If so, what is causing the error above?



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
