SHAP global feature importance using random forest regression

I am not sure why the mean(|SHAP|) values differ between the two bar plots below. I was expecting the same numbers from both, since (I thought) both plots show the mean absolute SHAP value per feature. Any suggestions are appreciated.

import shap
import matplotlib.pyplot as plt

explainer = shap.TreeExplainer(modelRF)
explainer.expected_value = explainer.expected_value[0]  # collapse the single-output array to a scalar
shap_values = explainer.shap_values(X_test)
cmap = plt.get_cmap("tab10")
shap.summary_plot(shap_values, X_test, plot_type="bar", color=cmap.colors[0])

[Image: bar plot of mean(|SHAP|) values from shap.summary_plot]
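To see the exact numbers behind this first bar plot, I print the per-feature mean absolute SHAP values directly (a minimal sketch, assuming X_test is a pandas DataFrame; mean(|SHAP|) per feature is the quantity summary_plot with plot_type="bar" displays):

import numpy as np

# mean(|SHAP|) per feature, sorted descending to match the bar plot order
mean_abs = np.abs(shap_values).mean(axis=0)
for name, val in sorted(zip(X_test.columns, mean_abs), key=lambda t: -t[1]):
    print(f"{name}: {val:.4f}")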

explainer_rf2CV = shap.Explainer(modelCV, algorithm='tree')
shap_values_rf2CV = explainer_rf2CV(X_test)
shap.plots.bar(shap_values_rf2CV, max_display=10)  # show the top 10 features

[Image: bar plot of mean(|SHAP|) values from shap.plots.bar]
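As a sanity check, I also tried a minimal self-contained comparison of the two APIs on a single fitted model (a sketch only: the California housing data, the model hyperparameters, and the split here are stand-ins, not my original setup). If the two APIs agree in this controlled case, the discrepancy above presumably comes from my own setup, e.g. modelRF and modelCV not being the identical fitted model:

import numpy as np
import shap
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Stand-in data and model (assumptions, not the original setup)
X, y = fetch_california_housing(return_X_y=True, as_frame=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Old-style API, as in the first snippet
old_vals = shap.TreeExplainer(model).shap_values(X_te)

# New-style API, as in the second snippet
new_vals = shap.Explainer(model, algorithm="tree")(X_te).values

# On the same model and data, the per-feature mean(|SHAP|) should be identical
print(np.allclose(np.abs(old_vals).mean(axis=0),
                  np.abs(new_vals).mean(axis=0)))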


