Need help debugging this code I found; it would help me understand how function calls work in MATLAB.

I am working on Markov chain calculations and came across this code in one of the questions asked on this website. It calculates quantities such as the mean first passage time, covariance, etc. I did, however, come across some errors, such as:
Line 27: The value assigned here to 'states' appears to be unused. Consider replacing it by ~.
Line 30: Use of brackets [] is unnecessary. Use parentheses to group, if needed.
Line 30: Instead of using transpose ('), consider using a different DIMENSION input argument to SUM.
Line 33: To improve performance, replace ISEMPTY(FIND(X)) with ISEMPTY(FIND(X, 1)).
Line 64: INV(A)*b can be slower and less accurate than A\b. Consider using A\b for INV(A)*b or b/A for b*INV(A).
Line 64: INV(A)*b can be slower and less accurate than A\b. Consider using A\b for INV(A)*b or b/A for b*INV(A).
Line 73: Invalid syntax at end of line. Possibly, a ), }, or ] is missing.
Line 75: Invalid syntax at end of line. Possibly, a ), }, or ] is missing.
I am planning to use a transition matrix as input and have this M-file return the values of those variables. I would appreciate any help fixing the code. It would help me understand how the function works, and I could then formulate my own function for Markov chain calculations using this code as a reference.
Thanks.
The code is as follows:
1 function [alpha,beta,Z,M,M2,C,CORR,D1P,PR]=ergodic(P)
2 % format: function [alpha,beta,Z,M,M2,C,CORR,D1P,PR]=ergodic(P)
3 % or alpha=ergodic(P)
4 % Solution of regular ergodic Markov chains using equations from
5 % chapter 4 in Kemeny & Snell (1976), chapter 5 in Roberts (1976).
6 % written by E. Gallagher, ECOS
7 % UMASS/Boston Email: Eugene.Gallagher@umb.edu
8 % written 9/26/93, last revised: 10/23/94
9 % refs:
10 % Kemeny, J. G. and J. L. Snell. 1976. Finite Markov Chains.
11 % Springer Verlag, New York.
12 % Roberts, F. 1976. Discrete Mathematical Models. Prentice Hall.
13 % input: P a regular ergodic Markov transition matrix in 'from rows
14 % to columns' form; row sum=1
15 % output: alpha = Stable limit vector (K & S p. 70)=fixed point
16 % probability vector (Roberts p. 291)
17 % beta = limiting variance for stable limit vector (K & S p. 91);
18 % Z = The fundamental matrix (K & S p. 78; Roberts p. 294)
19 % M = Mean 1st passage time matrix (K & S p. 78;Roberts' E, p.295)
20 % M2 = Covariance matrix for first passage times (K & S, p.82-84)
21 % C = cij is the limiting covariance for the number of times
22 % in states i and j in the first n steps (K & S p. 88)
23 % CORR = limiting correlations based on C, (K&S,p. 88, 92)
24 % D1P = inv(D)*P= The exchange matrix (K & S p. 193-194, 200);
25 % fractional exchange from the ith to jth class at equilibrium
26 % PR = reverse Markov chain (not all chains reversible);
27 [states,states]=size(P);
28 I=eye(states,states);
29 E=ones(states,states);
30 if sum([sum(P')-ones(1,states)].^2)>0.001
31 error('Rows of P must sum to 1.0')
32 end
33 absorbstates=find(diag(P)==1);
34 if ~isempty(absorbstates)
35 error('There is at least one absorbing state, use absorb.m to solve')
36 end
37 % P defines a regular, ergodic Markov chain
38 % The stable limit vector, alpha, is the eigenvector associated with
39 % the dominant eigenvalue in a regular ergodic chain (=1 by def). This
40 % eigenvector must be scaled such that the sum of elements=1.
41 % Solve for the eigenvector and scale simultaneously using the
42 % characteristic equation:
43 % The characteristic equation for a matrix:
44 % (lambda*I-P') * alpha = 0
45 % PT * alpha = B
46 % where lambda=eigenvalue
47 % If lambda=1, alpha=the Markovian stable limit vector.
48 PT=I-P'; % The key term of characteristic equation (lambda=1)
49 % Replace final row of PT with a row of ones to scale dominant eigenvector
50 PT(states,:)=ones(1,states);
51 % Replace final element of zero vector, B, with 1, to scale alpha to 1.0:
52 B=[zeros(states-1,1);1];
53 % solve for n unknowns with n simultaneous linear equations
54 alpha=(PT\B)'; % X=A\B is MATLAB solution to AX=B
55 % convert alpha to row vector
56 if nargout>1
57 A=alpha(ones(1,states),:); %MATLAB rowcopy trick
58 Z=inv(I-P+A);
59 Zdg=diag(diag(Z));
60 beta=(alpha.*(2*diag(Z)'-ones(1,states)-alpha));
61 % note: beta equals diag(C)' below;
62 D=diag(ones(1,states)./alpha);
63 M=(I-Z+E*Zdg)*D;
64 W=M*(2*Zdg*D-I)+2*(Z*M-E*diag(diag(Z*M)));
65 M2=W-M.^2;
66 C=A'.*Z+A.*Z'-A'.*I-A'.*A;
67 CORR=C./sqrt(diag(C)*diag(C)');
68 if nargout>7
69 D1P=D\P;
70 if nargout>8
71 if sum(sum((D1P-D1P').^2))>1e-6
72 disp(...'Markov chain not reversible since inv(D)*P isn''t symmetric')
73 disp('See Kemeny & Snell Theorem 5.3.3.')
74 disp(...'Delete last output argument in call to ergodic.m and redo.')
75 disp('Hit any key to continue'),pause
76 return
77 end
78 PR=(D*P')/D;
79 end
80 end
81 end
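For reference, once the code is fixed, this is roughly how I intend to call it (the matrix below is just made-up example data, not from my actual problem):

```matlab
% A small 3-state regular transition matrix in 'from rows to columns'
% form; each row sums to 1 (hypothetical values for illustration only)
P = [0.5 0.3 0.2;
     0.2 0.6 0.2;
     0.3 0.3 0.4];
[alpha, beta, Z, M] = ergodic(P);
disp(alpha)   % stable limit vector; its elements should sum to 1
disp(M)       % mean first passage time matrix
```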

Accepted Answer

Matt J on 24 Apr 2016
Edited: Matt J on 24 Apr 2016
These are all Code Analyzer warnings. I don't think the first 4 should throw error messages. You could of course follow the suggestions for improvement that the Code Analyzer has given you. It should even offer to fix those lines for you if you are using the MATLAB editor to detect them (hover the mouse over the red underlining or the hash marks in the right margin of the editor).
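For example, applying those suggestions to the flagged lines would look something like this (a sketch only, not tested against your file; the line-64 warnings presumably refer to an inv() call such as the one on line 58):

```matlab
[states, ~] = size(P);            % line 27: ~ discards the unused duplicate output
% line 30: sum along DIM=2 instead of transposing, and drop the brackets
if sum((sum(P,2)' - ones(1,states)).^2) > 0.001
    error('Rows of P must sum to 1.0')
end
% line 33: find(X,1) stops at the first match, which is all isempty needs
absorbstates = find(diag(P)==1, 1);
% inv(A)*b warnings: use backslash instead of an explicit inverse, e.g.
Z = (I - P + A) \ I;              % instead of Z = inv(I-P+A)
```

These changes do not alter what the function computes; they only make it faster and numerically safer, which is why the Code Analyzer flags them as warnings rather than errors.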
The Invalid Syntax warnings are more serious, but I don't see anything invalid in lines 73 and 75. Possibly the code was changed after you ran the Code Analyzer.
Steven Lord on 25 Apr 2016
The problems are not actually on lines 73 and 75 but the uses of ... on lines 72 and 74. If you have ... on a line NOT inside single quotes then those tell MATLAB that everything following them on that line is a comment and the remainder of the command is on the next line. Eliminate them, since the remainder of lines 72 and 74 are in fact part of the commands starting on those lines.
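In other words, once the stray ... are removed, lines 72 and 74 should read simply:

```matlab
disp('Markov chain not reversible since inv(D)*P isn''t symmetric')
disp('Delete last output argument in call to ergodic.m and redo.')
```

With the ... gone, the full string reaches disp again, and the "missing ), }, or ]" complaints on the following lines disappear.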
