What is the time complexity of the pseudo-inverse in PyTorch (i.e. torch.pinverse)?

Let's say I have a matrix X in PyTorch with n, m = X.shape. What is the time complexity of computing the pseudo-inverse with torch.pinverse?

In other words, what is the time complexity of

X_p = torch.pinverse(X)

?

Here is the documentation



Solution 1:[1]

The PyTorch documentation states that pinverse is calculated using SVD (singular value decomposition). The complexity of SVD is O(n m^2), where m is the larger dimension of the matrix and n the smaller. Thus this is the complexity.

For more info, check out the Wikipedia pages on singular value decomposition and the Moore–Penrose inverse.
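
To make the SVD connection concrete, here is a minimal sketch (an illustration of the same idea, not the library's actual implementation) that rebuilds the pseudo-inverse from torch.linalg.svd and compares it against torch.pinverse. The cutoff used here is an assumption, chosen only to skip near-zero singular values:

import torch

# Illustration only: reconstruct the pseudo-inverse from a reduced SVD.
X = torch.rand(200, 50)
U, S, Vh = torch.linalg.svd(X, full_matrices=False)

# Invert only singular values above a small cutoff (assumed tolerance,
# similar in spirit to the rcond handling of the real routine).
cutoff = S.max() * max(X.shape) * torch.finfo(S.dtype).eps
S_inv = torch.where(S > cutoff, 1.0 / S, torch.zeros_like(S))

X_p_manual = Vh.T @ torch.diag(S_inv) @ U.T
print(torch.allclose(X_p_manual, torch.pinverse(X), atol=1e-5))  # expected: True

The dominant cost in this sketch is the SVD itself, which is why the complexity of SVD carries over to pinverse.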

Solution 2:[2]

As the previous answer correctly mentioned, it is O(n m^2), but n is the larger dimension, not m (this is where the previous answer goes wrong).

TLDR: it is linear in the larger dimension and quadratic in the smaller dimension.

More details:

A simple piece of code like this shows that the runtime grows linearly with the larger dimension:

import time

import matplotlib.pyplot as plt  # needed for the plot below
import torch
from tqdm import trange          # trange assumed from tqdm (progress bar)

n = 200  # fixed smaller dimension
results = []
for m in trange(300, 50000, 500):  # sweep the larger dimension
    a = torch.rand(n, m)
    start_time = time.time()
    b = torch.pinverse(a)
    duration = time.time() - start_time
    results.append((m, duration))

# Plot duration against m.
res = list(zip(*results))
plt.plot(res[0], res[1])
plt.show()

[Plot: duration vs. m for fixed n = 200]
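
As a quick sanity check (not part of the original answer), one can fit a straight line to the timings measured above; this assumes numpy is available and that results and n are the list and value from the loop above:

import numpy as np

# Fit duration ≈ slope * m + intercept over the measured points.
ms, durations = zip(*results)
slope, intercept = np.polyfit(ms, durations, 1)
print(f"roughly {slope * 1e6:.2f} microseconds of extra runtime per additional column at n = {n}")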

While for values of m smaller than n we have:

n = 1100  # now the fixed dimension is the larger one
results = []
for m in trange(50, n, 20):  # sweep the smaller dimension
    a = torch.rand(n, m)
    start_time = time.time()
    b = torch.pinverse(a)
    duration = time.time() - start_time
    results.append((m, duration))

# Plot duration against m, as in the previous experiment.
res = list(zip(*results))
plt.plot(res[0], res[1])
plt.show()

[Plot: duration vs. m for fixed n = 1100]
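
Analogously, a hedged check for this second experiment is to fit a degree-2 polynomial; a quadratic term that dominates the linear one is consistent with quadratic scaling in the smaller dimension (again assuming numpy and the results list from the loop above):

import numpy as np

# Fit duration ≈ a*m^2 + b*m + c for the m < n sweep.
ms, durations = zip(*results)
a2, a1, a0 = np.polyfit(ms, durations, 2)
print(f"quadratic term: {a2:.3e}, linear term: {a1:.3e}")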

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: dennlinger
Solution 2: AliVar