How to compute the Hessian in TensorFlow 2.0?

https://www.tensorflow.org/api_docs/python/tf/hessians

The typical approach would be something like:

with tf.GradientTape() as tape_:
    with tf.GradientTape() as tape:
        loss = ...
    g = tape.gradient(loss, [vars])
gg = tape_.gradient(g, [tf.transpose(vars)])

But of course the transpose does not work like that through the tape.
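In fact, even without the transpose, nesting tape.gradient only ever gives back a vector: the gradient of a non-scalar target is implicitly summed over its components, so the outer call returns the Hessian's row sums rather than the full matrix. A minimal sketch with a toy loss of my own (just for illustration):

import tensorflow as tf

x = tf.Variable([1.0, 2.0])

with tf.GradientTape() as outer:
    with tf.GradientTape() as inner:
        loss = tf.reduce_sum(x ** 3)   # toy loss, assumed for illustration
    g = inner.gradient(loss, x)        # dloss/dx = 3*x**2 -> [3., 12.]
# the outer gradient of the vector g is summed over its components,
# i.e. sum_j d g_j / d x_i (Hessian row sums), not the full Hessian matrix
row_sums = outer.gradient(g, x)        # -> [6., 12.]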

tf.hessians has no example in the docs. I think it might be left over from TF 1.x.

UPDATE: use tape.jacobian (GradientTape.jacobian)

with tf.GradientTape() as tape_:
    with tf.GradientTape() as tape:
        loss = J(a_orig, r)
    dJda = tape.jacobian(loss, [a_orig])[0]
s = tape_.jacobian(dJda, [a_orig])[0]


Solution 1:[1]

As listed in the update.

Use tape.jacobian instead of tape.gradient when the same variable is used to differentiate again.

with tf.GradientTape() as tape_:
    with tf.GradientTape() as tape:
        loss = J(a_orig, r)
    dJda = tape.jacobian(loss, [a_orig])[0]
s = tape_.jacobian(dJda, [a_orig])[0]
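
Since J, a_orig and r are not defined above, here is a self-contained sketch of the same pattern with an assumed toy loss (the names and values below are illustrative only):

import tensorflow as tf

a_orig = tf.Variable([1.0, 2.0])
r = tf.constant(0.5)

def J(a, r):
    # hypothetical loss, just to make the example runnable
    return r * tf.reduce_sum(a ** 3)

with tf.GradientTape() as tape_:
    with tf.GradientTape() as tape:
        loss = J(a_orig, r)
    dJda = tape.jacobian(loss, [a_orig])[0]   # gradient, shape (2,)
s = tape_.jacobian(dJda, [a_orig])[0]         # full Hessian, shape (2, 2)
# for this toy loss the Hessian is diag(3 * a_orig) = [[3., 0.], [0., 6.]]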

Solution 2:[2]

You can even make a nice wrapper for it:

import tensorflow as tf

def hessian_wrapper(fn):
  def inner(*args, **kwargs):
    with tf.GradientTape() as tape1:
      with tf.GradientTape() as tape2:
        x, y = fn(*args, **kwargs)      # fn returns (variable, scalar loss)
      d1 = tape2.jacobian(y, [x])[0]    # first derivative (gradient vector)
    s = tape1.jacobian(d1, [x])[0]      # second derivative (Hessian matrix)
    return s
  return inner

def fn(x):
  return x * x

@hessian_wrapper
def foo(x, y_):
  y = fn(x)
  loss = tf.nn.l2_loss(y_ - y)
  return x, loss

x = tf.Variable([2, 2], dtype=tf.float32)
y_ = tf.Variable([9, 9], dtype=tf.float32)

foo(x, y_)
'''
Output

<tf.Tensor: shape=(2, 2), dtype=float32, numpy=
array([[6., 0.],
       [0., 6.]], dtype=float32)>
'''
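
As a quick sanity check (not part of the original answer): with loss = tf.nn.l2_loss(y_ - x*x) = 0.5 * sum((y_ - x**2)**2), the analytic Hessian is diag(6*x**2 - 2*y_), which at x = [2, 2], y_ = [9, 9] gives 6*4 - 2*9 = 6 on the diagonal and zeros off the diagonal, matching the output above.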

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution sources:
Solution 1: mathtick
Solution 2: Souradeep Nanda