Setting Matrix Values based on 'symsum' formula

I am working on a Doolittle LU decomposition at the moment, which is useful for solving a system of linear algebraic equations.
For example:
3x - 0.1y - 0.2z = 7.85
0.1x + 7y - 0.3z = -19.3
0.3x - 0.2y + 10z = 71.4
A = [3 -0.1 -0.2; 0.1 7 -0.3; 0.3 -0.2 10]
B = [7.85; -19.3; 71.4]
We can decompose matrix A into L and U matrices.
We can use the following general formulae to achieve this quickly for any N-by-N (square) matrix:
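In Doolittle's method, L has a unit diagonal and the remaining entries follow the standard recurrences
u(i,j) = a(i,j) - sum over r = 1 to i-1 of l(i,r)*u(r,j),   for j >= i
l(i,j) = ( a(i,j) - sum over r = 1 to j-1 of l(i,r)*u(r,j) ) / u(j,j),   for i > j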
I've tried implementing this into a sample code as follows:
a = [3 -0.1 -0.2; 0.1 7 -0.3; 0.3 -0.2 10]
b = [7.85; -19.3; 71.4]
U = zeros(height(a))
L = eye(height(a))
i = 0;
j = 0;
while(1)
    i = i + 1;
    j = j + 1;
    L(i,j) = a(i,j) - symsum(L(i,r)*U(r,j), r, 1, j-1)
    % I have only included the L matrix for simplicity's sake
    if i == height(a)
        break
    end
end
However, this outputs the following error:
Unrecognized function or variable 'r'.
Declaring r as a symbolic variable with:
syms r
also yields the following error:
Error using sym/subsindex (line 857)
Invalid indexing or function definition. Indexing must follow MATLAB indexing. Function arguments must be symbolic variables, and function body must be sym expression.
I do not understand the issue. Could someone please explain the error or suggest a possible solution?
Thank you in advance for any help.

Answers (1)

Swetha Polemoni on 15 Apr 2021
Hi Michael Jacobson
You cannot use a symbolic variable as an index into a matrix. However, the following code snippet can help you get the summation you want, using ordinary numeric indices in place of the symbolic r.
L = rand(10,10);                  % example data in place of the actual factors
U = rand(10,10);
summation = zeros(1,10);
for i = 1:10
    j = i;
    r = 1:j-1;                    % plain numeric index vector (empty when j == 1)
    % the vector product replaces symsum(L(i,r)*U(r,j), r, 1, j-1)
    summation(i) = L(i,r)*U(r,j);
end
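For the original 3-by-3 example, a minimal, purely numeric Doolittle loop along these lines (a sketch only, no symsum or Symbolic Math Toolbox needed) could look like this; the inner sums become ordinary vector products over numeric index ranges:
a = [3 -0.1 -0.2; 0.1 7 -0.3; 0.3 -0.2 10];
b = [7.85; -19.3; 71.4];
n = size(a,1);
L = eye(n);
U = zeros(n);
for i = 1:n
    % Row i of U: U(i,j) = a(i,j) - sum_{r=1}^{i-1} L(i,r)*U(r,j)
    for j = i:n
        r = 1:i-1;                      % empty when i == 1, so the sum is 0
        U(i,j) = a(i,j) - L(i,r)*U(r,j);
    end
    % Column i of L below the diagonal:
    % L(j,i) = (a(j,i) - sum_{r=1}^{i-1} L(j,r)*U(r,i)) / U(i,i)
    for j = i+1:n
        r = 1:i-1;
        L(j,i) = (a(j,i) - L(j,r)*U(r,i)) / U(i,i);
    end
end
disp(L*U)      % should reproduce a (up to round-off)
y = L\b;       % forward substitution
x = U\y;       % back substitution
For this system, x comes out approximately [3; -2.5; 7].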
Hope this helps
