Is XGBoost effective for variable selection?
Update: I now understand how XGBoost is used; in hindsight, this was a beginner question.
Can XGBoost be used for variable elimination and selection, the way LASSO is? Or do we need to run LASSO first to eliminate variables and then use XGBoost on the reduced set for the final prediction?
Solution 1:[1]
XGBoost is quite effective for prediction in the presence of redundant variables (features), since the underlying gradient boosting algorithm is itself robust to multicollinearity.
Even so, it is highly recommended to remove (or engineer away) redundant features from the training dataset, whichever algorithm you choose (LASSO or XGBoost).
Additionally, you can combine the two methods using ensemble learning.
Solution 2:[2]
XGBoost has built-in regularization (like LASSO's L1 penalty) that is applied during training.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Artem |
| Solution 2 | ping George |
