Logit model selection: combination of backwards elimination and the Karlson-Holm-Breen method?
I have a question about whether it makes sense to combine backwards elimination with the Karlson-Holm-Breen (KHB) method. I am working in RStudio.

I have a logistic regression model with many variables and would like to reduce it to one with fewer (the most influential) variables. I have read a bit about logit models and the problem of comparing coefficients across nested models. One solution I found online is the Karlson-Holm-Breen method, but I could not find an example of how to get from the full model to the reduced one.

My idea was to use backwards elimination until I reach the model with the highest explanatory power (lowest AIC or highest pseudo R²) and then compare the full and reduced models with the KHB method to check the results. Is that the usual way to do it, or is there a different method for arriving at the reduced model?

Thanks so much for your help/advice, Franca
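To make my plan concrete, here is a minimal sketch of the elimination step I had in mind, assuming a data frame `df` with a binary outcome `y` (both names are placeholders). The KHB comparison of the full and reduced models would come afterwards, and that is exactly the part I am unsure how to set up in R:

```r
library(MASS)  # for stepAIC()

# Full logistic model with all candidate predictors
# (df and y are hypothetical names for my data)
full <- glm(y ~ ., data = df, family = binomial)

# Backwards elimination: drop variables step by step to minimise AIC
reduced <- stepAIC(full, direction = "backward", trace = FALSE)

# Compare the fit of the two models
AIC(full)
AIC(reduced)

# McFadden's pseudo R² relative to the intercept-only model
null_model <- glm(y ~ 1, data = df, family = binomial)
mcfadden <- function(m) as.numeric(1 - logLik(m) / logLik(null_model))
mcfadden(full)
mcfadden(reduced)
```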