PyTorch: Difference in type between input type and weight type with ModuleList

I have a problem with a list of modules. I have written a module called BranchRoutingModule and would like to build a nested list of instances of it. I have the following code:

def _branch_routings(self):
    # structure = [nn.ModuleList([BranchRoutingModule(in_channels=self.in_channels) for j in range(int(pow(2, i)))]) for i in range(self.tree_height - 1)]  # [[None], [None, None]] for tree height = 3
    structure = [[None for j in range(int(pow(2, i)))] for i in range(self.tree_height - 1)]  # [[None], [None, None]] for tree height = 3

    cur = 0
    for i in range(self.tree_height - 1):
        for j in range(int(pow(2, i))):
            self.__setattr__('branch_routing_module' + str(cur), BranchRoutingModule(in_channels=self.in_channels))
            structure[i][j] = self.__getattr__('branch_routing_module' + str(cur))
            cur += 1
    return structure

I first tried using nn.ModuleList (commented out at the top), but I get the following error: “Input type (torch.cuda.FloatTensor) and weight type (torch.FloatTensor) should be the same”

However, if I use __setattr__ and __getattr__ instead, I get no errors and my model works fine.

Why is that? I don’t understand why __setattr__ and __getattr__ fix the problem. I am using CUDA.
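For reference, here is a minimal, self-contained sketch of what I suspect might be happening (this is an assumption on my part, not something I have confirmed): in the commented-out line, the outer comprehension is a plain Python list, and submodules held in a plain list are not registered with the parent nn.Module, so model.cuda() would not move their weights to the GPU. nn.ModuleList, by contrast, registers everything it contains:

```python
import torch.nn as nn

class Demo(nn.Module):
    def __init__(self):
        super().__init__()
        # Plain Python list: the Linear inside is NOT registered as a
        # submodule, so model.cuda() / model.to(device) will not move
        # its weights, and they stay as torch.FloatTensor on the CPU.
        self.hidden = [nn.Linear(4, 4)]
        # Nested nn.ModuleList: every submodule is registered, so its
        # parameters follow the model when it is moved to a device.
        self.registered = nn.ModuleList([nn.ModuleList([nn.Linear(4, 4)])])

model = Demo()
param_names = [name for name, _ in model.named_parameters()]
print(param_names)  # only 'registered.0.0.weight' and 'registered.0.0.bias'
```

If that is indeed the cause, wrapping the outer comprehension in nn.ModuleList as well (so the whole nested structure is registered) should behave the same as my __setattr__/__getattr__ version.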

Thank you and regards, Antoine



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
