Performing differentiation wrt input within a Keras model for use in a loss

Is there any layer in Keras which calculates the derivative wrt the input? For example, if x is the input and the first layer is f(x), then the next layer's output should be f'(x). There are multiple questions here about this topic, but all of them involve computing the derivative outside the model. In essence, I want to create a neural network whose loss function involves both the Jacobian and the Hessian wrt the inputs.
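Conceptually, the kind of layer I have in mind would look roughly like the sketch below. GradientLayer is just a name I made up for illustration; as far as I know there is no built-in Keras layer that does this.

import tensorflow as tf
from keras.layers import Layer

# Hypothetical layer: wraps an inner layer and outputs d(inner(x))/dx instead of inner(x)
class GradientLayer(Layer):
    def __init__(self, inner_layer, **kwargs):
        super().__init__(**kwargs)
        self.inner_layer = inner_layer

    def call(self, x):
        with tf.GradientTape() as tape:
            tape.watch(x)              # x is the layer input, not a Variable
            y = self.inner_layer(x)    # f(x)
        return tape.gradient(y, x)     # f'(x), same shape as x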

I've tried the following

import numpy as np
import tensorflow as tf
import keras
import keras.backend as K
from keras.layers import Dense

def create_model():

    x = keras.Input(shape = (10,))
    layer = Dense(1, activation = "sigmoid")
    output = layer(x)

    jac = K.gradients(output, x)   # symbolic gradient of the output wrt the input
    
    model = keras.Model(inputs=x, outputs=jac)
    
    return model

model = create_model()
X = np.random.uniform(size = (3, 10))

This gives the error "tf.gradients is not supported when eager execution is enabled. Use tf.GradientTape instead."
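For what it's worth, the tape-based approach does work for me on concrete tensors outside of a model, along these lines (the shapes and numbers are just for illustration):

import numpy as np
import tensorflow as tf
from keras.layers import Dense

layer = Dense(1, activation="sigmoid")
x = tf.constant(np.random.uniform(size=(3, 10)), dtype=tf.float32)

with tf.GradientTape() as tape:
    tape.watch(x)                   # x is a plain tensor, not a Variable, so watch it explicitly
    output = layer(x)               # shape (3, 1)

jac = tape.gradient(output, x)      # shape (3, 10), gradient of the output wrt each input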

So I tried using a GradientTape inside the model-building function as well:

def create_model2():
    with tf.GradientTape() as tape:
        x = keras.Input(shape = (10,))
        layer = Dense(1, activation = "sigmoid")
        output = layer(x)

    jac = tape.gradient(output, x)   # this is the line that raises the error below
    
    model = keras.Model(inputs=x, outputs=jac)
    
    return model

model = create_model2()
X = np.random.uniform(size = (3, 10))

but this tells me 'KerasTensor' object has no attribute '_id'.

Both of these methods work fine outside the model (as in the standalone snippet above). My end goal is to use the Jacobian and the Hessian in the loss function, so alternative approaches would also be appreciated.
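For concreteness, the sort of thing I am ultimately after looks roughly like the following sketch, where the gradient wrt the input is computed at training time with a tape nested inside the training tape and added to the loss as a penalty (the 0.1 weight is an arbitrary placeholder, and I assume a second nested tape would be needed for a Hessian term):

import numpy as np
import tensorflow as tf
import keras
from keras.layers import Dense

inp = keras.Input(shape=(10,))
out = Dense(1, activation="sigmoid")(inp)
model = keras.Model(inputs=inp, outputs=out)

optimizer = keras.optimizers.Adam()
bce = keras.losses.BinaryCrossentropy()

X = tf.constant(np.random.uniform(size=(32, 10)), dtype=tf.float32)
y = tf.constant(np.random.randint(0, 2, size=(32, 1)), dtype=tf.float32)

with tf.GradientTape() as outer_tape:
    with tf.GradientTape() as inner_tape:
        inner_tape.watch(X)
        y_pred = model(X, training=True)
    jac = inner_tape.gradient(y_pred, X)                         # d(output)/d(input), shape (32, 10)
    loss = bce(y, y_pred) + 0.1 * tf.reduce_mean(tf.square(jac)) # placeholder gradient penalty

grads = outer_tape.gradient(loss, model.trainable_variables)     # includes the penalty term
optimizer.apply_gradients(zip(grads, model.trainable_variables))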



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
