np.linalg.inv not raising error for singular matrix

I have a simple 2x2 matrix:

import numpy as np

A = np.array([
    [1, 2],
    [5, 10]
])

This matrix is singular: the second column is a multiple of the first, and the determinant is zero (1 * 10 - 5 * 2 = 0). I expected np.linalg.inv to raise an error for A, but instead an inverse is computed without any complaint.

>>> np.linalg.inv(A)
array([[ 1.80143985e+16, -3.60287970e+15],
       [-9.00719925e+15,  1.80143985e+15]])

Furthermore, the determinant is also supposed to be zero, but shows up as:

>>> np.linalg.det(A)
5.551115123125802e-16

That is, of course, a very small number near zero rather than exactly zero. Regardless, A is a singular matrix, yet np.linalg.inv does not raise the "Singular matrix" error.
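(Side note: since the reported determinant is only approximately zero due to floating-point rounding in the underlying factorization, a check that does not rely on inv raising seems more robust. A rough sketch of what I mean; the 1e-12 tolerance is just an arbitrary example value, not anything official:)

import numpy as np

A = np.array([
    [1, 2],
    [5, 10]
])

# Rank-based check: a singular 2x2 matrix has rank < 2
print(np.linalg.matrix_rank(A))       # 1

# Condition number blows up for (near-)singular matrices
print(np.linalg.cond(A))              # very large, effectively infinite

# Or compare the determinant against a small tolerance instead of exact zero
print(abs(np.linalg.det(A)) < 1e-12)  # True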

There is no problem with a matrix like:

B = np.array([
    [9, 6],
    [12, 8]
])

This matrix is also singular, since det = (9 * 8 - 12 * 6) = 0, and here everything works as expected.

>>> np.linalg.inv(B)
LinAlgError: Singular matrix

>>> np.linalg.det(B)
0.0

What am I missing?


Update: This was most likely a machine or version problem. When I ran the same code on Google Colab, it raised the error as expected. Reinstalling numpy in a clean virtualenv afterwards fixed the issue on my machine.
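For anyone who can't control which BLAS/LAPACK build their NumPy is linked against, one workaround (not from the original post, just a sketch) is to avoid inv entirely and use the pseudo-inverse or a least-squares solve, both of which are well defined even when the matrix is singular:

import numpy as np

A = np.array([
    [1, 2],
    [5, 10]
])
b = np.array([1.0, 2.0])

# Pseudo-inverse is defined even for singular matrices
A_pinv = np.linalg.pinv(A)

# Least-squares solve returns a minimum-norm solution and reports the rank
x, residuals, rank, singular_values = np.linalg.lstsq(A, b, rcond=None)
print(rank)  # 1, i.e. the matrix is rank-deficient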



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
