probability of Markov process

Brave A on 25 Sep 2019
Commented: Brave A on 26 Sep 2019
I have this transition matrix for a Markov process:
P = [ .4 .0 .0 .1; .0 .7 .3 .2; .3 .2 .3 .4; .3 .1 .4 .3 ]
and the initial distribution is x(1) = [.1 .1 .5 .3]'.
I would like to compute the stationary distribution x, where x = Px.
Here is my attempt, but I am not getting the correct result.
Thanks in advance!
M = [.4 .0 .3 .3; .0 .7 .2 .1; .0 .3 .3 .4; .1 .2 .4 .3];   % M = P' (row-stochastic)
X = [.1 .1 .5 .3];
B = X.';                % initial distribution as a column vector
eig(M')
% ans =
%     1.0000
%     0.5357
%     0.2685
%    -0.1043
[V,D] = eig(M');
P = V(:,1)';            % eigenvector for eigenvalue 1
P = P./sum(P);          % normalise so the entries sum to 1
% P =
%     0.0400    0.4400    0.2800    0.2400
P*M
% ans =
%     0.0400    0.4400    0.2800    0.2400
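As a cross-check (a sketch not taken from the thread), the same stationary vector can be found directly from the column-stochastic P in the question by computing a basis for the null space of I - P, since x = Px is equivalent to (I - P)x = 0:

```matlab
% Sketch: solve x = P*x via the null space of (I - P).
P = [.4 .0 .0 .1; .0 .7 .3 .2; .3 .2 .3 .4; .3 .1 .4 .3];  % columns sum to 1
x = null(eye(4) - P);   % basis vector for the eigenspace of eigenvalue 1
x = x ./ sum(x);        % normalise so the entries sum to 1
disp(x.')               % approximately [0.0400 0.4400 0.2800 0.2400]
```

This agrees with the eigenvector result above; the division by sum(x) also fixes the arbitrary sign that null() may return.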
  6 Comments
Brave A on 26 Sep 2019
Thanks for your help!
Sorry, I did not mention that
P = [ .4 .0 .0 .1; .0 .7 .3 .2; .3 .2 .3 .4; .3 .1 .4 .3 ]
and I need to find x.
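If the goal is the x satisfying x = Px, one simple alternative (a sketch, assuming that is what is wanted) is power iteration: repeatedly apply P to the initial distribution x(1) until it stops changing. Because the second-largest eigenvalue magnitude is about 0.54, convergence is fast:

```matlab
% Power-iteration sketch (an assumption about what is wanted):
P = [.4 .0 .0 .1; .0 .7 .3 .2; .3 .2 .3 .4; .3 .1 .4 .3];
x = [.1; .1; .5; .3];   % x(1) as a column vector
for k = 1:50
    x = P*x;            % x(k+1) = P*x(k)
end
disp(x.')               % approaches [0.0400 0.4400 0.2800 0.2400]
```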


Answers (0)
