Setting Matrix Values based on 'symsum' formula
I am currently working on a Doolittle LU decomposition, which is useful for solving a system of linear algebraic equations such as:
3x-0.1y-0.2z = 7.85
0.1x+7y-0.3z = -19.3
0.3x-0.2y+10z = 71.4
A = [3 -0.1 -0.2; 0.1 7 -0.3; 0.3 -0.2 10]
B = [7.85; -19.3; 71.4]
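(For reference, the system above can be sanity-checked directly with MATLAB's backslash operator before attempting the decomposition; substituting the result back into the three equations confirms it.)

```matlab
% Solve A*x = B directly as a reference solution
A = [3 -0.1 -0.2; 0.1 7 -0.3; 0.3 -0.2 10];
B = [7.85; -19.3; 71.4];
x = A \ B   % approximately [3; -2.5; 7]
```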
We can decompose matrix A into L and U matrices.
We can use the following general formulae (the standard Doolittle recurrences, with L having a unit diagonal) to achieve this quickly for any NxN (square) matrix:
U(i,j) = a(i,j) - sum over r = 1..i-1 of L(i,r)*U(r,j),                for i <= j
L(i,j) = ( a(i,j) - sum over r = 1..j-1 of L(i,r)*U(r,j) ) / U(j,j),  for i > j
I've tried implementing this into a sample code as follows:
a = [3 -0.1 -0.2; 0.1 7 -0.3; 0.3 -0.2 10]
b = [7.85; -19.3; 71.4]
U = zeros(height(a))
L = eye(height(a))
L(i,j) = a(i,j) - symsum(L(i,r)*U(r,j),r,1,j-1)
%I have only included the L matrix for simplicity's sake
However, this outputs the following error:
Unrecognized function or variable 'r'.
Declaring r as a symbolic variable instead also yields the following error:
Error using sym/subsindex (line 857)
Invalid indexing or function definition. Indexing must follow MATLAB indexing. Function arguments must be symbolic
variables, and function body must be sym expression.
I do not understand the issue. Could someone please explain the error or suggest a possible solution?
Just in case: documentation for symsum
Thank you in advance for any help.
Swetha Polemoni on 15 Apr 2021
Hi Michael Jacobson
You cannot use a symbolic variable as an index into a matrix. However, the following code snippet can help you get the summation you want.
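The original snippet from this answer was not preserved here; a minimal sketch of the same idea, replacing symsum with ordinary for-loops and sum over numeric indices, might look like this (not the answerer's exact code):

```matlab
a = [3 -0.1 -0.2; 0.1 7 -0.3; 0.3 -0.2 10];
n = height(a);
U = zeros(n);
L = eye(n);   % Doolittle: L has a unit diagonal
for i = 1:n
    % Row i of U; for i = 1 the index range 1:0 is empty and sum() returns 0
    for j = i:n
        U(i,j) = a(i,j) - sum(L(i,1:i-1) .* U(1:i-1,j)');
    end
    % Column i of L, below the diagonal
    for j = i+1:n
        L(j,i) = (a(j,i) - sum(L(j,1:i-1) .* U(1:i-1,i)')) / U(i,i);
    end
end
% Check: L*U should reproduce a up to round-off
norm(L*U - a)
```

The key point is that with numeric loop indices the sums are just ordinary `sum` calls over slices of L and U, so no symbolic machinery is needed at all.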
Hope this helps