Scores of ExhaustiveFeatureSelector, SequentialFeatureSelector negative with SVR

I was trying feature selectors on the load_diabetes dataset from sklearn.datasets. I tried feature selection using three techniques: SequentialFeatureSelector (both forward and backward) and ExhaustiveFeatureSelector, with sklearn.svm.SVR as the estimator.
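For context, a minimal setup along these lines is assumed (the question does not show how df, y, and svr were created, so this sketch just reconstructs them from the description using load_diabetes and a default SVR):

```python
from sklearn.datasets import load_diabetes
from sklearn.svm import SVR

# Load the diabetes data as a DataFrame: 442 samples, 10 features
# (including 'bmi' and 's5', the two features selected below).
data = load_diabetes(as_frame=True)
df = data.data               # feature DataFrame
y = data.target.to_numpy()   # target as a NumPy array
svr = SVR()                  # default RBF-kernel support vector regressor
```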

But every time the best score is a huge negative value. Here df is the DataFrame of features, y is a NumPy array of the target, and svr is an instance of sklearn.svm.SVR().

Exhaustive feature selector

# Exhaustive feature selection
from mlxtend.feature_selection import ExhaustiveFeatureSelector

efs = ExhaustiveFeatureSelector(svr, min_features=1, max_features=7,
                                scoring='neg_mean_squared_error', cv=2, n_jobs=-1)
Xefs = efs.fit_transform(df, y)
print(efs.best_score_)
print(efs.best_feature_names_)

Output: Features: 967/967

-4735.324434228489

('bmi', 's5')

Forward SequentialFeatureSelector

# Forward sequential feature selection
from mlxtend.feature_selection import SequentialFeatureSelector

sfs = SequentialFeatureSelector(svr, k_features=(1, 7), forward=True,
                                scoring='neg_mean_squared_error', cv=2)
Xsfs = sfs.fit_transform(df, y)
print(sfs.k_feature_names_)
print(sfs.k_score_)

Output: ('bmi', 's5')

-4735.324434228489

Backward SequentialFeatureSelector

# Backward sequential feature selection
from mlxtend.feature_selection import SequentialFeatureSelector

sbs = SequentialFeatureSelector(svr, k_features=(1, 7), forward=False,
                                scoring='neg_mean_squared_error', cv=2)
Xsbs = sbs.fit_transform(df, y)
print(sbs.k_feature_names_)
print(sbs.k_score_)

Output: ('bmi', 's5')

-4735.324434228489

Can anyone please tell me what exactly I am doing wrong?



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
