How to output XGBoost predictions in log(odds) form in Python

I have a simple XGBClassifier

from xgboost import XGBClassifier

model = XGBClassifier()

which I use to fit a model (X are the predictive features, Y is the binary target):

model.fit(X, Y)

To calculate the predicted probabilities from the XGBClassifier model I have just trained, I use this code:

y_pred_proba = []
for i in range(len(X)):
    # probability of the positive class for row i
    y_pred_proba.append(model.predict_proba(X.iloc[[i]]).ravel()[1])
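
(I believe the same thing can be done in one vectorized call; a sketch below, using the same model and X as above, just to show what I mean.)

y_pred_proba_vec = model.predict_proba(X)[:, 1]  # positive-class probability for every row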

But how do I get the log(odds)? If I applied the following formula to the predicted probabilities:

ln(odds) = ln(probability / (1 - probability))

would that give me the log(odds) back? I suspect converting the probabilities is not quite that simple, since the probabilities come out of a sigmoid function, right?
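
To make the conversion I have in mind concrete, here is a rough sketch of what I would try manually (using numpy on the probabilities computed above; I am not sure this is the right approach):

import numpy as np

p = np.asarray(y_pred_proba)           # predicted probabilities of the positive class
log_odds_manual = np.log(p / (1 - p))  # ln(p / (1 - p)), my attempt at the log(odds)
print(log_odds_manual[:5])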

I understand that the default XGBClassifier objective is binary:logistic. Is there a command to output the log(odds) from the XGBClassifier directly?
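
From skimming the XGBoost documentation, the predict method of the sklearn wrapper seems to have an output_margin argument that returns the raw, untransformed scores; I am not sure whether those raw scores are the log(odds) I am after, but something like this is what I have in mind:

raw_scores = model.predict(X, output_margin=True)  # raw margin scores, before the sigmoid is applied
print(raw_scores[:5])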

If I had fit a logistic regression like this:

from sklearn.linear_model import LogisticRegression

model_adult = LogisticRegression(max_iter=10000)
model_adult.fit(X, Y)

Then I could have generated the log(odds) output through this code:

print(model_adult.predict_log_proba(X))

Is there anything similar with XGBClassifier?


