Is it possible to differentiate a black box model in Python?

Is it possible to find the gradient of a black-box model? For instance, given the following trained random forest model:

from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# define dataset
X, y = make_regression(n_samples=1000, n_features=5, n_informative=3, noise=0.1, random_state=2)

model = RandomForestRegressor()
model.fit(X, y)

I would like to find the derivatives with respect to all the variables of this function:

def fun(x1, x2, x3, x4, x5):
    # predict expects a 2-D array, so wrap the point in a nested list
    return model.predict([[x1, x2, x3, x4, x5]])[0]

I was wondering if it is possible to do something similar to the following with (for instance) autograd:

from autograd import grad, jacobian

grad(fun, 0)(-0.89,-1.06,-0.25,-0.53,0.21)
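For reference, autograd can only trace functions built from its own wrapped numpy operations, so it cannot differentiate through sklearn's predict. A common workaround for black-box functions is a finite-difference approximation; below is a minimal sketch using scipy.optimize.approx_fprime (the step size 1e-2 is an assumption chosen because a random forest's prediction is piecewise constant, so a very small step would return zero everywhere):

```python
import numpy as np
from scipy.optimize import approx_fprime
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# train the same model as in the question
X, y = make_regression(n_samples=1000, n_features=5, n_informative=3,
                       noise=0.1, random_state=2)
model = RandomForestRegressor(random_state=0)
model.fit(X, y)

def fun(x):
    # predict expects a 2-D array; x is a 1-D point
    return model.predict(x.reshape(1, -1))[0]

x0 = np.array([-0.89, -1.06, -0.25, -0.53, 0.21])
# forward finite-difference gradient with step epsilon
g = approx_fprime(x0, fun, epsilon=1e-2)
print(g)  # approximate gradient, one entry per feature
```

Note that this is only an approximation of a local slope: because the forest's output is a step function, the result depends strongly on the chosen step size.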


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
