Should I use an LSTM, Dense, or some other Keras layer for a time series input layer?

I have tabular data of shape [10000 rows x 35 columns]. Each row holds one day's observations from 5 departments, each contributing 7 features, for a total of 35 features per day, along with a single numeric growth output for each of the 10000 days.

These 35 features may have a linear or non-linear mapping to the daily response output. I want to model this with a neural network and came across the LSTM (for time series) and Dense Keras layers.

If I use an LSTM layer, the input shape requires timesteps and features as parameters and a 3D array. From the LSTM use cases I have seen, this means converting the data into sliding windows, e.g. a list of arrays of "the first 50 (or some other number of) timesteps of 35 features", advancing one timestep at a time. In other words, I provide 50 days of data to predict the result after the 50th day, then repeat the same for days 1-51, 2-52, 3-53, and so on over the whole dataset.
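The windowing described above can be sketched in NumPy. This is a minimal illustration, assuming the raw data has already been split into a `(n_days, 35)` feature array and an `(n_days,)` growth target (the function name and variables are placeholders, not from the original question):

```python
import numpy as np

def make_windows(features, target, window=50):
    """Slide a fixed-length window over daily rows to build LSTM input.

    features: (n_days, 35) array; target: (n_days,) array.
    Returns X of shape (n_days - window, window, 35) and aligned y.
    """
    X, y = [], []
    for start in range(len(features) - window):
        X.append(features[start:start + window])  # 50 consecutive days
        y.append(target[start + window])          # the following day's growth
    return np.array(X), np.array(y)

# demo with random stand-ins for the real data
rng = np.random.default_rng(0)
feats = rng.normal(size=(200, 35))
growth = rng.normal(size=200)
X, y = make_windows(feats, growth)
print(X.shape, y.shape)  # (150, 50, 35) (150,)
```

The resulting `X` matches the 3D `(samples, timesteps, features)` shape the LSTM layer expects with `input_shape=(50, 35)`.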

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(25, return_sequences=True, input_shape=(50, 35)))
model.add(LSTM(25, return_sequences=False))
model.add(Dense(10))
model.add(Dense(1))

Is this the best way to capture relationships in time-series data, or will simple layers like Dense work, or are there better options?
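For comparison, the Dense-only alternative mentioned above would treat each day's 35 features independently, with no time dimension at all. A minimal sketch (layer widths here are arbitrary choices, not from the original question):

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# baseline: map one day's 35 features directly to the growth value,
# ignoring any temporal context between days
model = Sequential([
    Input(shape=(35,)),           # a single day's feature vector
    Dense(64, activation="relu"),
    Dense(10, activation="relu"),
    Dense(1),                     # single numeric growth output
])
model.compile(optimizer="adam", loss="mse")
```

If this baseline performs comparably to the LSTM, the temporal structure may carry little signal; if the LSTM clearly wins, the windowed 3D input is worth the extra complexity.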



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow