How would you stagger input from multiple time series into a Keras neural network?
I'll start with an example with three time periods.
| time | feature0 | feature1 | label |
|---|---|---|---|
| 0 | a | b | 3 |
| 1 | c | d | 4 |
| 2 | e | f | 5 |
There would be many datasets like this.
The neural network would take fewer than t+1 time steps as input; in this example, two.
The first iteration would take input [[a, b], [z, z]] and try to predict 3, where z stands for an empty (padding) value. For t=1, the input would be [[c, d], [a, b]], predicting 4. For t=2, the input would be [[e, f], [c, d]], predicting 5. This process would then be repeated on the rest of the datasets.
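To make the windowing concrete, here is a minimal NumPy sketch of the staggering described above. The `stagger` helper, its `window` and `pad_value` parameters, and the numeric stand-ins for a–f are my own illustration, not a standard API:

```python
import numpy as np

def stagger(features, labels, window=2, pad_value=0.0):
    """Build staggered samples: each sample holds the current row plus the
    previous `window - 1` rows, padded with `pad_value` before t=0."""
    n, f = features.shape
    # Prepend (window - 1) rows of padding so early time steps have a "past"
    padded = np.vstack([np.full((window - 1, f), pad_value), features])
    # Reverse each slice so the current row comes first, matching
    # the [[current], [previous]] layout in the question
    X = np.stack([padded[t:t + window][::-1] for t in range(n)])
    return X, labels

# One dataset from the question: rows (a, b), (c, d), (e, f) with labels 3, 4, 5
feats = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
labs = np.array([3, 4, 5])
X, y = stagger(feats, labs, window=2)
# X[0] == [[1, 2], [0, 0]]  -- padding plays the role of z
# X[1] == [[3, 4], [1, 2]]
# X[2] == [[5, 6], [3, 4]]
```

For a plain dense network the `(window, features)` slices would then be flattened to vectors of length `window * features` before being fed to the first `Dense` layer.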
The closest thing I have found explaining this is this TensorFlow tutorial: https://www.tensorflow.org/tutorials/structured_data/time_series
Following the example in the link, I would like to feed data from multiple regions rather than just one.
I would like to do this with a plain densely connected neural network.
How is this possible?
It looks like I can use tf.data.experimental.make_csv_dataset and Google's WindowGenerator code to use all the datasets at once.
Edit: I cannot figure out how to load all the CSVs into memory (they will fit), nor how to turn the values from a tf.data.experimental.make_csv_dataset dataset into DataFrames for the Keras framework.
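For the in-memory loading part, here is a sketch of the kind of thing I mean using plain pandas instead of make_csv_dataset. The directory, filenames, and the added `region` column are assumptions for illustration (the snippet writes two small CSVs to a temp directory so it is self-contained):

```python
import glob
import os
import tempfile
import pandas as pd

# Stand-in for the real files: two per-region CSVs in a temp directory
# (paths and column names here are hypothetical).
tmpdir = tempfile.mkdtemp()
for i in range(2):
    pd.DataFrame({"time": [0, 1, 2],
                  "feature0": [1.0, 2.0, 3.0],
                  "feature1": [4.0, 5.0, 6.0],
                  "label": [3, 4, 5]}).to_csv(
        os.path.join(tmpdir, f"region{i}.csv"), index=False)

# Load every CSV into memory, tag each with its source region,
# then concatenate into a single DataFrame.
frames = []
for path in sorted(glob.glob(os.path.join(tmpdir, "*.csv"))):
    df = pd.read_csv(path)
    df["region"] = os.path.splitext(os.path.basename(path))[0]
    frames.append(df)
data = pd.concat(frames, ignore_index=True)
```

The `region` column keeps the datasets separable, so the windowing can be applied per region (e.g. via `data.groupby("region")`) without windows spanning two different regions.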
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
