Restrict feature weights in GradientBoostingClassifier
I am trying to build a classifier from a dataset of ~30 features using sklearn's GradientBoostingClassifier, with 100 estimators and max_depth=3.
When I look at the feature importances, I see that two features capture ~90% of the total importance.
Is there a way to restrict the maximal importance of a single feature? I understand this would come at the cost of an increased loss.
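For reference, here is a minimal sketch of the setup described above, using synthetic data (`make_classification` is a stand-in for the real ~30-feature dataset). As far as I know, sklearn exposes no parameter that directly caps a feature's importance; the `max_features` argument shown at the end is only a possible workaround, since it limits how many features each split may consider and therefore tends to dilute dominant features, at some cost in loss:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic stand-in for the real ~30-feature dataset
X, y = make_classification(n_samples=1000, n_features=30,
                           n_informative=5, random_state=0)

clf = GradientBoostingClassifier(n_estimators=100, max_depth=3,
                                 random_state=0)
clf.fit(X, y)

# Inspect how total importance is distributed across features;
# typically a few features dominate, as described in the question
for i, imp in sorted(enumerate(clf.feature_importances_),
                     key=lambda t: -t[1])[:5]:
    print(f"feature {i}: importance {imp:.3f}")

# Hedged workaround: max_features restricts the candidate features
# considered at each split. It does not cap importance directly, but
# often spreads importance more evenly across features.
clf_restricted = GradientBoostingClassifier(n_estimators=100, max_depth=3,
                                            max_features=5, random_state=0)
clf_restricted.fit(X, y)
```

Whether the `max_features` variant is acceptable depends on how much loss increase is tolerable, which the question already anticipates.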
Sources
Source: Stack Overflow, licensed under CC BY-SA 3.0.
