How can I make feature importance reproducible (not random anymore) in Python?
I am doing feature selection using the feature importances of a tree-based classifier.
#Feature importance (source: https://towardsdatascience.com/feature-selection-techniques-in-machine-learning-with-python-f24e7da3f36e)
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.ensemble import ExtraTreesClassifier

X = df.iloc[:, 1:31]  # independent columns
y = df.iloc[:, 0]     # target column = diagnosis
model = ExtraTreesClassifier()
model.fit(X, y)
print(model.feature_importances_)  # built-in feature_importances_ attribute of tree-based classifiers
feat_importances = pd.Series(model.feature_importances_, index=X.columns)
feat_importances.nlargest(10).plot(kind='barh', color="lightskyblue")
plt.show()
But the results change every time I run it; presumably a random number generator is involved somewhere. Is there a way to set it once and keep it fixed so that the results don't change every time the kernel is restarted?
Thank you.
UPDATE: I think I managed it using model = ExtraTreesClassifier(random_state=1). Is that the right way to do it?
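To double-check it myself, here is a quick sketch (assuming X and y are defined as in my snippet above): fitting twice with the same seed gives identical importances.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

m1 = ExtraTreesClassifier(random_state=1).fit(X, y)  # same seed both times
m2 = ExtraTreesClassifier(random_state=1).fit(X, y)
print(np.allclose(m1.feature_importances_, m2.feature_importances_))  # True -> reproducible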
Solution 1 [1]:
You can set the random_state parameter of ExtraTreesClassifier to some fixed value. In that case you will get reproducible results.
For example:
model = ExtraTreesClassifier(random_state=0)
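The run-to-run changes come from the randomized split thresholds that the extra-trees algorithm draws; fixing random_state pins that draw. If you additionally want the importance values themselves to be less noisy across seeds (a complementary option, not part of the original answer), you can increase n_estimators so the per-tree randomness averages out. A sketch, assuming X and y as in the question:
# More trees -> lower variance in the averaged importances
# (n_estimators=500 is an illustrative value, not a recommendation).
model = ExtraTreesClassifier(n_estimators=500, random_state=0)
model.fit(X, y)
print(model.feature_importances_)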
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | CrafterKolyan |