LU decomposition of a matrix: U matrix has 1's on the diagonal
I am writing code to decompose a matrix A into the form LU. However, the U matrix I get has 1's all along the diagonal. My findings show that the variable summ1 is always 0, and that is why U ends up with 1's on the diagonal. The algorithm seems fine after cross-checking it against the mathematical formulas. Could someone advise what a fix for this could be? (I am assuming Lii = Aii.)
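For reference, the recurrences this loop appears to implement (a Crout-style scheme, in which U naturally ends up with a unit diagonal) can be written as follows; note the sums are 1-based, which matters when translating them into Python's 0-based range:

u_{ij} = ( a_{ij} - \sum_{k=1}^{i-1} l_{ik} u_{kj} ) / l_{ii},    for j >= i
l_{ij} = ( a_{ij} - \sum_{k=1}^{j-1} l_{ik} u_{kj} ) / u_{jj},    for i >= j

Taking j = i in the second formula gives l_{ii} = a_{ii} - \sum_{k=1}^{i-1} l_{ik} u_{ki}, so Lii = Aii only holds where that sum is empty, i.e. for the first row.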
Output:
A =
[1, 2, 3]
[4, 5, 6]
[7, 8, 9]
L =
[1, 0, 0]
[4.0, 5, 0]
[7.0, 8.0, 9]
U =
[1.0, 2.0, 3.0]
[0, 1.0, 1.2]
[0, 0, 1.0]
class Matrix:
    def __init__(self, row=0, col=0, data=[]):
        assert row * col == len(data), 'Dimensions do not match'
        self.row = row
        self.col = col
        self.data = data          # flat, row-major storage

    def __repr__(self):
        # Regroup the flat data into rows and print the matrix row by row.
        self.data_modified = [[self.data[i*self.col + j] for j in range(self.col)] for i in range(self.row)]
        for i in self.data_modified:
            print(i)
        return ""

    def __mul__(self, other):
        assert self.col == other.row, 'Dimensions not suitable for Matrix Multiplication'
        list3 = []
        for i in range(self.row):
            for j in range(other.col):
                summ = 0
                for k in range(0, self.col):
                    summ += self.data[i*self.col + k] * other.data[k*other.col + j]
                list3.append(summ)
        return Matrix(self.row, other.col, list3)   # was missing: return the product

A = [1, 2, 3, 4, 5, 6, 7, 8, 9]
A_row = 3
A_col = 3
L = [0 for i in range(A_row * A_col)]
U = [0 for i in range(A_row * A_col)]

# Assume L's diagonal equals A's diagonal (Lii = Aii).
for i in range(A_row):
    L[i*A_col + i] = A[i*A_col + i]

for i in range(A_row):
    summ1 = 0
    summ2 = 0
    # Row i of U (columns i .. n-1)
    for j in range(i, A_col):
        for k in range(0, i-1):
            summ1 += L[i*A_col + k] * U[k*A_col + j]
        U[i*A_col + j] = (A[i*A_col + j] - summ1) / L[i*A_col + i]
    # Row i of L (columns below the diagonal)
    for j in range(i):
        for k in range(0, j-1):
            summ2 += L[i*A_col + k] * U[k*A_col + j]
        L[i*A_col + j] = (A[i*A_col + j] - summ2) / U[j*A_col + j]
A1 = Matrix(3,3,A)
print("A=")
print(A1)
L1 = Matrix(3,3,L)
print("L =")
print(L1)
U1= Matrix(3,3,U)
print("U =")
print(U1)
print(L1*U1)
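For comparison, here is a minimal sketch of the same factorization done Doolittle-style (1's on L's diagonal instead of U's), with no pivoting; the function name lu_doolittle and the flat-list layout are illustrative choices, not part of the original code. The two details that differ from the loops above: each partial sum is rebuilt for every (i, j) pair, and the 1-based sums over k = 1 .. i-1 become range(i) rather than range(0, i-1) once the indices are 0-based.

# Sketch only, not the original code: Doolittle-style LU (unit diagonal on L,
# pivots on U's diagonal), no pivoting, flat row-major lists as above.
def lu_doolittle(A, n):
    L = [0.0] * (n * n)
    U = [0.0] * (n * n)
    for i in range(n):
        # Row i of U: u_ij = a_ij - sum_{k<i} l_ik * u_kj, for j >= i
        for j in range(i, n):
            s = sum(L[i*n + k] * U[k*n + j] for k in range(i))  # rebuilt per (i, j)
            U[i*n + j] = A[i*n + j] - s
        # Column i of L: l_ji = (a_ji - sum_{k<i} l_jk * u_ki) / u_ii, for j > i
        L[i*n + i] = 1.0
        for j in range(i + 1, n):
            s = sum(L[j*n + k] * U[k*n + i] for k in range(i))
            L[j*n + i] = (A[j*n + i] - s) / U[i*n + i]
    return L, U

L2, U2 = lu_doolittle(A, 3)
print("Doolittle L =")
print(Matrix(3, 3, L2))
print("Doolittle U =")
print(Matrix(3, 3, U2))

Note that with the example matrix above the last pivot comes out as 0 (this particular A is singular), which does not break the factorization loop but would break any back-substitution that follows.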
Source: Stack Overflow, licensed under CC BY-SA 3.0.