XGBoost best max_depth=1

I use XGBoost to train a classification model. GridSearchCV reports the best max_depth=1, which means each of my hundreds of trees makes only a single split (a decision stump).
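
For context, a minimal sketch of this kind of tuning setup; the data and the parameter grid below are hypothetical placeholders, not the ones from my actual problem:

```python
# Sketch of the tuning workflow described above. X, y and the parameter
# grid are illustrative placeholders.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

param_grid = {"max_depth": [1, 2, 3, 4, 6], "n_estimators": [100, 300]}
grid = GridSearchCV(
    XGBClassifier(eval_metric="logloss"),
    param_grid,
    cv=5,
    scoring="accuracy",
)
grid.fit(X, y)
print(grid.best_params_)  # may report max_depth=1, as in the question
```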

Does this mean that the problem/dataset I am working with can be separated by simple models, and that I don't need a complicated model such as XGBoost?

In general, if all the trees have depth 1, does XGBoost still predict better than simple models such as an SVM or logistic regression?
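
One way to check this directly is a cross-validated comparison; a minimal sketch, again with hypothetical data and illustrative hyperparameters:

```python
# Cross-validated comparison of depth-1 XGBoost against logistic
# regression on hypothetical data. Scaling is included because
# logistic regression benefits from it.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

xgb_stumps = XGBClassifier(max_depth=1, n_estimators=300, eval_metric="logloss")
logreg = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

print("xgboost, max_depth=1:", cross_val_score(xgb_stumps, X, y, cv=5).mean())
print("logistic regression :", cross_val_score(logreg, X, y, cv=5).mean())
```

Note that boosted depth-1 trees are additive in individual features but still nonlinear in each one, so max_depth=1 suggests that feature interactions are not needed, not necessarily that the data is linearly separable.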

Thanks!



Solution 1:[1]

With max_depth=1 you have essentially arrived at AdaBoost-style boosting, not at anything resembling a regular full-grown decision tree.

AdaBoost: each tree has a depth of 1 (a weak learner, or decision stump).

It is about your features rather than the model itself: if you have only a few features, it does not make sense to grow each tree very deep. See the comparison sketch after this answer.
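
For reference, a minimal sketch of that comparison, assuming scikit-learn's AdaBoostClassifier (whose default weak learner is already a depth-1 stump) and hypothetical data:

```python
# Sketch comparing XGBoost restricted to stumps with classic AdaBoost.
# Data and hyperparameters are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

xgb_stumps = XGBClassifier(max_depth=1, n_estimators=200, eval_metric="logloss")
ada_stumps = AdaBoostClassifier(n_estimators=200)  # default base: depth-1 tree

print("xgboost stumps :", cross_val_score(xgb_stumps, X, y, cv=5).mean())
print("adaboost stumps:", cross_val_score(ada_stumps, X, y, cv=5).mean())
```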

Solution 2:[2]

If the best depth is 1, each boosted tree reduces to a single split, so you have effectively arrived back at very simple trees and lost much of the advantage of XGBoost.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: CheeseBurger
Solution 2: Sapan Gupta