PyTorch DataLoader prints the same features

Edit: Issue has been resolved

I created a custom Dataset in PyTorch, where each feature is an array with four elements and the label is a single number. For some reason, when I iterate over a DataLoader built from it, every batch contains the same features but the labels differ. Here is my code:

from torch.utils.data import Dataset, DataLoader

class CustomDataset(Dataset):
    def __init__(self, features, labels):
        self.features = features
        self.labels = labels

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        label = self.labels[idx]  # .loc[idx, 'Phase shift']
        feature = self.features[i][0], self.features[i][1], self.features[i][2], self.features[i][3]
        return feature, label  # datapoint

scatterDataset = CustomDataset(xTrain, yTrain)
train_dataloader = DataLoader(scatterDataset, batch_size=2, shuffle=False)
for x, y in train_dataloader:
    print(x, y)

This is the output when I print a single entry of the custom Dataset directly:

((90, 1.0, 5.6, 345.61), 38.99157642769459)

Below is the output I get from the DataLoader loop:

[tensor([90, 90]), tensor([1., 1.], dtype=torch.float64), tensor([5.6000, 5.6000], dtype=torch.float64), tensor([345.6100, 345.6100], dtype=torch.float64)] tensor([55.3332, 20.7054], dtype=torch.float64)
[tensor([90, 90]), tensor([1., 1.], dtype=torch.float64), tensor([5.6000, 5.6000], dtype=torch.float64), tensor([345.6100, 345.6100], dtype=torch.float64)] tensor([ 36.6615, -81.3742], dtype=torch.float64)
[tensor([90, 90]), tensor([1., 1.], dtype=torch.float64), tensor([5.6000, 5.6000], dtype=torch.float64), tensor([345.6100, 345.6100], dtype=torch.float64)] tensor([51.2065, 18.2320], dtype=torch.float64)
[tensor([90, 90]), tensor([1., 1.], dtype=torch.float64), tensor([5.6000, 5.6000], dtype=torch.float64), tensor([345.6100, 345.6100], dtype=torch.float64)] tensor([178.8603, -41.5669], dtype=torch.float64)
[tensor([90, 90]), tensor([1., 1.], dtype=torch.float64), tensor([5.6000, 5.6000], dtype=torch.float64), tensor([345.6100, 345.6100], dtype=torch.float64)] tensor([ 8.3370, 32.2717], dtype=torch.float64)
[tensor([90, 90]), tensor([1., 1.], dtype=torch.float64), tensor([5.6000, 5.6000], dtype=torch.float64), tensor([345.6100, 345.6100], dtype=torch.float64)] tensor([35.4542, 58.8170], dtype=torch.float64)
[tensor([90, 90]), tensor([1., 1.], dtype=torch.float64), tensor([5.6000, 5.6000], dtype=torch.float64), tensor([345.6100, 345.6100], dtype=torch.float64)] tensor([71.5183, 88.5670], dtype=torch.float64)
[tensor([90, 90]), tensor([1., 1.], dtype=torch.float64), tensor([5.6000, 5.6000], dtype=torch.float64), tensor([345.6100, 345.6100], dtype=torch.float64)] tensor([ 35.0192, -73.0656], dtype=torch.float64)
[tensor([90, 90]), tensor([1., 1.], dtype=torch.float64), tensor([5.6000, 5.6000], dtype=torch.float64), tensor([345.6100, 345.6100], dtype=torch.float64)] tensor([-57.7716, -17.6789], dtype=torch.float64)

I've scoured the web and can't find a solution. I'm not sure what is going on here, and I'm very new to PyTorch, so I don't know whether it's something in the way I wrote my code. Thanks in advance.
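The likely cause, judging from the snippet above, is the `__getitem__` body: it indexes `self.features[i]` while the parameter is named `idx`. Python then resolves `i` in the enclosing/global scope, so if a leftover loop variable `i` exists there, every call silently returns the same row (and it would raise a `NameError` otherwise). A minimal torch-free sketch of the scoping bug and the fix (the data values here are made up for illustration):

```python
features = [(90, 1.0, 5.6, 345.61), (45, 2.0, 7.1, 120.0)]

class Buggy:
    def __getitem__(self, idx):
        return features[i]    # `i` resolves to the global, not the idx parameter

class Fixed:
    def __getitem__(self, idx):
        return features[idx]  # correct: use the parameter

for i in range(len(features)):  # leaves i == 1 bound at module scope
    pass

buggy, fixed = Buggy(), Fixed()
print(buggy[0], buggy[1])  # both print features[1]: the same row twice
print(fixed[0], fixed[1])  # distinct rows
```

In the posted Dataset, the one-line fix is to replace `self.features[i][...]` with `self.features[idx][...]` (or simply `tuple(self.features[idx][:4])`). As an aside, the list-of-four-tensors shape in the output is unrelated to the bug: it is just the DataLoader's default collation, which transposes a batch of 4-tuples into four batched tensors.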



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow