Difference between r2_score and score() in linear regression
I found that the results of score() in LinearRegression differ from r2_score(). I expected them to return the same result. The code is below:
r2_train = np.empty(shape=[10, 0])
r2_train_n = np.empty(shape=[10, 0])
for set_degree in range(0, 10):
    pf = PolynomialFeatures(degree=set_degree)
    X_train_tf = pf.fit_transform(X_train.reshape(11, 1))
    X_test_tf = pf.transform(X_test.reshape(4, 1))
    lr = LinearRegression().fit(X_train_tf, y_train)
    r2_train = np.append(r2_train, r2_score(lr.predict(X_train_tf), y_train))
    r2_train_n = np.append(r2_train_n, lr.score(X_train_tf, y_train))
Solution 1:
In your call to r2_score, you wrote:
r2_score(lr.predict(X_train_tf), y_train)
According to the documentation, the first argument should be the true values, i.e. it should be:
r2_score(y_train, lr.predict(X_train_tf))
This gives the same result as the score method of LinearRegression().
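A minimal sketch illustrating the point, using made-up data (the variable names and data below are hypothetical, not from the original question). R² is not symmetric in its arguments, because the total sum of squares is computed from the mean of the first argument (y_true), so swapping the arguments generally changes the result:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Hypothetical data: a noisy linear relationship
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(20, 1))
y = 3 * X.ravel() + rng.normal(scale=3, size=20)

lr = LinearRegression().fit(X, y)
y_pred = lr.predict(X)

score = lr.score(X, y)            # R^2 computed by the estimator
r2_correct = r2_score(y, y_pred)  # correct order: (y_true, y_pred)
r2_swapped = r2_score(y_pred, y)  # swapped order: generally different

print(score, r2_correct, r2_swapped)
```

With the arguments in the documented order, r2_score and lr.score agree to numerical precision; with them swapped, the denominator of R² is taken around the mean of the predictions instead of the true targets.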
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Ayenew Yihune |
