How to calculate subgradients
cost = np.maximum(x, 0)
RuntimeError: You should not call `__bool__` / `__nonzero__` on `Formula`. If you are trying to make a map with `Variable`, `Expression`, or `Polynomial` as keys (and then access the map in Python), please use pydrake.common.containers.EqualToDict`.
I got the above error when trying to do: Evaluate(cost.Jacobian(vars), dic)
Are subgradients supported? If so, how do I compute them?
EDIT:
Evaluate(cost_fnx.Jacobian(x_var), dic)
*** RuntimeError: The following environment does not have an entry for the variable x(0)
x(1) -> 2
x(0) -> 2
When I loop over the scalars of my vector x and use symbolic.max, I get a runtime error - I assume because it is not supposed to work that way?
When I try the AutoDiff method, I get:
(Pdb) cost_fnx(InitializeAutoDiff(x_sample))
*** TypeError: 'pydrake.symbolic.Expression' object is not callable
My cost function is a very complicated function built from prog.NewContinuousVariables variables, not just the max. My guess is that maybe I can use the AutoDiff max and somehow combine it with the rest of my cost function over those variables?
Solution 1:[1]
I assume x is a symbolic variable. Then instead of using cost = np.maximum(x, 0), you can call
cost = pydrake.symbolic.max(x, 0)
This will create a proper symbolic expression.
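For concreteness, here is a minimal sketch of that symbolic route (the variable name x and the sample values below are illustrative, not part of the original answer); max(x, 0) becomes an ordinary Expression that you can evaluate with an environment mapping each Variable to a number:

import pydrake.symbolic as sym

x = sym.Variable("x")
cost = sym.max(x, 0)             # hinge cost max(x, 0) as a symbolic Expression
print(cost.Evaluate({x: -2.0}))  # prints 0.0
print(cost.Evaluate({x: 1.5}))   # prints 1.5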
BTW, I would recommend not using symbolic evaluation in the cost function. A better way is to define a function as below; you can then compute the gradient using AutoDiffScalar (an implementation of automatic differentiation):
import numpy as np
from pydrake.autodiffutils import InitializeAutoDiff, ExtractGradient

def cost(x):
    # x arrives as a float array for plain numeric evaluation, or as an
    # array of AutoDiffXd scalars (dtype=object) when gradients are needed.
    if x.dtype == float:
        return np.maximum(x, 0)
    elif x.dtype == object:
        # Written for a scalar (size-1) x; the comparison picks the active branch.
        return x if x >= 0 else 0 * x

# Compute the gradient of cost at x = -2; max(x, 0) is flat there, so this prints 0.
print(ExtractGradient(cost(InitializeAutoDiff(np.array([-2.])))))
# Compute the gradient of cost at x = 1; this prints 1.
print(ExtractGradient(cost(InitializeAutoDiff(np.array([1.])))))
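Since the question builds its cost from prog.NewContinuousVariables, here is a hedged sketch of how a dual-dtype callback like cost can be attached to a MathematicalProgram (the import path and variable names are assumptions and may differ across Drake versions; the callable passed to AddCost must return a scalar, hence the [0] indexing):

from pydrake.solvers import MathematicalProgram, Solve

prog = MathematicalProgram()
x = prog.NewContinuousVariables(1, "x")
# Drake evaluates the Python callable with AutoDiffXd inputs when it needs
# gradients, which exercises the dtype == object branch of cost() above.
prog.AddCost(lambda z: cost(z)[0], vars=x)
result = Solve(prog)
print(result.GetSolution(x))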
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Hongkai Dai |
