Weird predictions after MinMaxScaler
I am trying to do forecasting with an LSTM, but after applying MinMax normalization my predictions become terrible. When I check the autocorrelation, the data looks stationary both before and after MinMax normalization, yet the results are still not acceptable. (I have tried almost every activation function and several stacked LSTM layers, too.)
My LSTM architecture:

import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import LSTM, Dropout

EPOCHS = 1000
steps = int(np.ceil(x_train_multi.shape[0] / batch_size))
val_steps = int(np.ceil(x_val_multi.shape[0] / batch_size))

multi_step_model = tf.keras.models.Sequential()
multi_step_model.add(LSTM(128, activation="relu", return_sequences=False,
                          input_shape=x_train_multi.shape[-2:]))
multi_step_model.add(Dropout(0.4))
multi_step_model.add(tf.keras.layers.Dense(future_target))  # 72 outputs
multi_step_model.compile(optimizer=tf.keras.optimizers.Adam(), loss='mae')
multi_step_model.summary()

print(train_data_multi)

multi_step_history = multi_step_model.fit(train_data_multi,
                                          epochs=EPOCHS,
                                          steps_per_epoch=steps,
                                          validation_data=val_data_multi,
                                          validation_steps=val_steps)
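For context, a common cause of this symptom is fitting the scaler on each split separately, or forgetting to invert the scaling on the model's outputs before comparing them to the original series. The sketch below (a minimal, hypothetical illustration in plain NumPy; the arrays `train`, `val`, and `pred_scaled` are made-up stand-ins, not from the question) shows the usual workflow: compute min/max on the training split only, apply the same parameters everywhere, and invert predictions back to the original units.

import numpy as np

# Fit min/max on the TRAINING split only.
train = np.array([[10.0], [20.0], [30.0], [40.0]])
val = np.array([[25.0], [50.0]])  # 50 lies outside the training range

train_min = train.min(axis=0)
train_max = train.max(axis=0)

def scale(x):
    # Map into [0, 1] using the training-set parameters.
    return (x - train_min) / (train_max - train_min)

def inverse_scale(x):
    # Map scaled values back to the original units.
    return x * (train_max - train_min) + train_min

train_scaled = scale(train)  # exactly spans [0, 1]
val_scaled = scale(val)      # may fall outside [0, 1]; that is expected

# After model.predict(...) on scaled inputs, invert before evaluating:
pred_scaled = np.array([[0.5]])
pred = inverse_scale(pred_scaled)  # 25.0 in the original scale

If the model is trained on scaled targets but its raw outputs are plotted against the unscaled series, the forecast will look wildly wrong even when the fit is fine.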
Sources
Source: Stack Overflow, licensed under CC BY-SA 3.0 (attribution requirements apply).