Why is the Linear layer sized self.n_hidden*train_len, and how is the LSTM giving me 100 predictions?

Time series prediction of a sine wave using an LSTM

I am confused about this model: how is it giving 100 predictions, when an LSTM outputs only the next step?

I also cannot figure out why we use (self.n_hidden*train_len) as the input size of the Linear layer.

Here is my class:

import torch.nn as nn


class RNN(nn.Module):  # time series prediction RNN class
    def __init__(self, train_len=100, n_hidden=60, pre_len=100):
        super(RNN, self).__init__()
        self.n_hidden = n_hidden                                      # hidden size
        self.lstm1 = nn.LSTM(1, self.n_hidden, 1)                     # (input_size=1, hidden_size=60, num_layers=1)
        self.linear = nn.Linear(self.n_hidden * train_len, pre_len)   # (6000, 100)
        self.pre_len = pre_len
        self.train_len = train_len

    def forward(self, x_train):
        # x_train: tensor of size [1, 100, 1]
        # h: LSTM output of size [1, 100, 60]; c: the (h_n, c_n) state tuple
        h, c = self.lstm1(x_train)
        output = self.linear(h.view(-1, self.n_hidden * self.train_len))
        # print(output.shape)
        return output  # tensor of size [1, 100]
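For reference, here is a minimal standalone shape check with the same sizes as the class above (the variable names x, out, flat, pred are my own, not from the original code). It shows where the 6000 input features of the Linear layer come from and why 100 values come out in a single forward pass:

    import torch
    from torch import nn

    lstm = nn.LSTM(input_size=1, hidden_size=60, num_layers=1)
    linear = nn.Linear(60 * 100, 100)   # 6000 in-features -> 100 out-features

    x = torch.randn(1, 100, 1)          # same shape as x_train in the class
    out, (h_n, c_n) = lstm(x)           # out: [1, 100, 60], one 60-dim hidden vector per position
    flat = out.view(-1, 60 * 100)       # flat: [1, 6000], all hidden states concatenated into one row
    pred = linear(flat)                 # pred: [1, 100], the Linear layer emits 100 values at once

    print(out.shape, flat.shape, pred.shape)

So the LSTM itself still produces one hidden vector per position; the 100 predictions come from the Linear layer, which maps the flattened 6000-dimensional hidden sequence to pre_len = 100 outputs in one shot.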


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
