Symbolic Tensor issue when trying to perform LSTM modeling
I'm trying to build an LSTM model for out-of-sample time series forecasting. Below is my code:
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense

# LSTM -> Dropout -> Dense(1) for one-step-ahead forecasting
model = Sequential()
model.add(LSTM(200, activation='relu', input_shape=(n_input, n_features)))
model.add(Dropout(0.15))
model.add(Dense(1))
optimizer = keras.optimizers.Adam(learning_rate=0.001)
model.compile(optimizer=optimizer, loss='mse')
history = model.fit_generator(generator, epochs=100, verbose=1)
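
For context, `generator`, `n_input`, and `n_features` are created earlier in the script. A minimal sketch of that setup, assuming the usual `TimeseriesGenerator` from Keras and a hypothetical synthetic `values` array in place of my real data, would be:

import numpy as np
from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator

n_input = 12      # number of past time steps fed into the LSTM
n_features = 1    # univariate series

# Placeholder data; in my script this is the actual series being forecast
values = np.sin(np.arange(200) / 10.0).reshape(-1, n_features).astype('float32')

# Each sample is a window of n_input steps; the target is the next observation
generator = TimeseriesGenerator(values, values, length=n_input, batch_size=16)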
However, I get the following error message when I execute the code:
Cannot convert a symbolic Tensor (lstm_5/strided_slice:0) to a numpy array. This error may indicate that you're trying to pass a Tensor to a NumPy call, which is not supported
I've read some other posts about this error, and some people suggest downgrading NumPy or Python, but neither of those is a good option for me. Is there any other way to solve this problem?
Thank you in advance!
Source: Stack Overflow, licensed under CC BY-SA 3.0.
