Is this softmax activation function using torch.exp() correct?

I am trying to write a softmax activation function. It should take a 2D torch.Tensor and return a matrix of the same shape in which each row sums to 1.

Is this function correct?

import torch

def softmax(x):
    return torch.exp(x) / torch.sum(torch.exp(x), dim=1).view(-1, 1)


Solution 1:[1]

With dim=1, torch.sum reduces along dimension 1, i.e. it adds up the entries of each row (across its columns) and returns one sum per row. The .view(-1, 1) reshapes those row sums into a column vector, so broadcasting divides every element of a row by that row's own sum. As a result, each row of the output sums to 1, which is exactly what the question asks for with a 2D input.
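
A quick way to convince yourself is to compare the result against PyTorch's built-in softmax and to check the row sums directly. This is a minimal sketch, assuming the softmax function from the question is already defined:

import torch
import torch.nn.functional as F

x = torch.randn(3, 5)                    # a 2D input: 3 rows, 5 columns
out = softmax(x)                         # the function from the question

print(torch.allclose(out, F.softmax(x, dim=1)))  # expected: True
print(out.sum(dim=1))                            # each row sums to 1 (up to float error)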

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: A.Abdelhameed