fill_between returns ValueError: 'y1' is not 1-dimensional
I am programming a GPR (Gaussian Process Regression) and would like to visualize it. I imported data from an Excel file, and now I would like to fill the area above and below the graph in a certain interval.
This is the code I wrote:
import numpy as np
import matplotlib.pyplot as plt

X_ = np.linspace(X.min() - 5, X.max() + 15, 1000)[:, np.newaxis]
y_pred, y_std = gpr.predict(X_, return_std = True)
fig = plt.figure(figsize = (15,10))
plt.scatter(X, y, c = 'k', alpha = 0.55)
plt.plot(X_, y_pred)
plt.fill_between(X_[:,0], y_pred-y_std, y_pred+y_std, alpha = 0.5, color = 'k')
plt.xlim(X_.min(), X_.max())
plt.xlabel('Temperature [°C]')
plt.ylabel('fd [-]')
plt.title('fd depending on the Temperature')
plt.show()
Every time I execute the program, I get a ValueError ('y1' is not 1-dimensional) for this part of the code:
plt.fill_between(X_[:,0], y_pred-y_std, y_pred+y_std, alpha = 0.5, color = 'k')
There seems to be a problem with the y_pred values: when I replace y_pred with a plain number, it works just fine.
I would really appreciate any help I can get. Thank you in advance.
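A likely cause, assuming scikit-learn's GaussianProcessRegressor: predict returns a column vector of shape (n, 1) when the training targets y were 2-D, while y_std from return_std=True is 1-D with shape (n,). Then y_pred - y_std broadcasts (n, 1) against (n,) into an (n, n) matrix, which fill_between rejects as "not 1-dimensional". Flattening y_pred with ravel() restores elementwise arithmetic. The sketch below reproduces the shape problem with a stand-in for the predictions (no GPR needed):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs without a display
import numpy as np
import matplotlib.pyplot as plt

X_ = np.linspace(0, 10, 100)[:, np.newaxis]  # shape (100, 1), like the question's X_
y_pred = np.sin(X_)                          # stand-in for gpr.predict output, shape (100, 1)
y_std = np.full(100, 0.1)                    # stand-in for the returned std, shape (100,)

# (100, 1) - (100,) broadcasts to (100, 100) -- this is what fill_between rejects
assert (y_pred - y_std).shape == (100, 100)

# Flattening y_pred to 1-D gives the intended elementwise bands, shape (100,)
y_flat = y_pred.ravel()
assert (y_flat - y_std).shape == (100,)

plt.fill_between(X_[:, 0], y_flat - y_std, y_flat + y_std, alpha=0.5, color='k')
plt.close()
```

So in the original code, changing the offending line to use `y_pred.ravel()` (or `y_pred[:, 0]`) should resolve the error. Alternatively, passing a 1-D y to `gpr.fit` (e.g. `y.ravel()`) makes `predict` return a 1-D array in the first place.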
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow