Fluctuating accuracy/loss curves - LSTM Keras

I am creating an LSTM model for human activity recognition, and I keep getting training and validation accuracy/loss curves that fluctuate heavily even though they trend upward overall.

The following architecture produced these fluctuating accuracy and loss curves:

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense

model = Sequential()

model.add(LSTM(units=128, return_sequences=True, input_shape=(3500, 11)))
model.add(Dropout(0.5))

model.add(Dense(units=64, activation='relu'))

# input_shape is only needed on the first layer; Keras ignores it elsewhere
model.add(LSTM(units=128, return_sequences=False))
model.add(Dropout(0.5))

model.add(Dense(units=64, activation='relu'))

model.add(Dense(4, activation='softmax'))

adam = tf.keras.optimizers.Adam(learning_rate=0.0020, beta_1=0.9, beta_2=0.999,
                                epsilon=1e-07, amsgrad=False, clipnorm=1.)

model.compile(optimizer=adam, loss='categorical_crossentropy', metrics=['accuracy'])
history = model.fit(Gen, validation_data=val_Gen, epochs=30,
                    callbacks=[tensorboard_callback], verbose=1).history

I tried changing the model's architecture and various hyperparameters, but nothing improved.

I am using TimeseriesGenerator from Keras to generate the batches. Does anyone have a suggestion?
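For context, here is a minimal sketch of how generators like Gen might be built with TimeseriesGenerator for this model's input_shape=(3500, 11) and 4 softmax classes. The arrays X and y, the batch size, and the random data are assumptions for illustration, not from the question:

import numpy as np
from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator

# Hypothetical stand-in data: 11 features per timestep, one-hot labels for 4 classes
X = np.random.rand(4000, 11).astype("float32")
y = np.eye(4)[np.random.randint(0, 4, size=4000)]

# Each generated window has 3500 timesteps, matching the LSTM's input_shape
Gen = TimeseriesGenerator(X, y, length=3500, batch_size=8)

batch_x, batch_y = Gen[0]
print(batch_x.shape)  # (8, 3500, 11)
print(batch_y.shape)  # (8, 4)

Note that with long windows like length=3500 and a small batch_size, each gradient step sees few, heavily overlapping sequences, which is one common source of noisy curves.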



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
