Find the common eigenvectors and eigenvalues between 2 matrices

Hello,
I am looking to find, or rather build, a matrix X of common eigenvectors between two matrices A and B such that:
AX = aX, with "a" the diagonal matrix of the corresponding eigenvalues
BX = bX, with "b" the diagonal matrix of the corresponding eigenvalues
I took a look at a similar post (https://stackoverflow.com/questions/56584609/finding-the-common-eigenvectors-of-two-matrices) but did not manage to reach a conclusion, i.e. to get valid results when I build the final endomorphism I want, defined by F = P D P^-1.
I have also read the relevant Wikipedia article and this interesting paper https://core.ac.uk/download/pdf/82814554.pdf, but I couldn't extract from them a method that is easy to implement.
In particular, I am interested in the
eig(A,B)
MATLAB function.
I tried to use it like this:
% Search for common build eigen vectors between FISH_sp and FISH_xc
[V,D] = eig(FISH_sp,FISH_xc);
% Diagonalize the matrix (B^-1 A) to compute Lambda since we have AX=Lambda B X
[eigenv, eigen_final] = eig(inv(FISH_xc)*FISH_sp);
% Compute the final endomorphism : F = P D P^-1
FISH_final = V*eye(7).*eigen_final*inv(V)
But the matrix `FISH_final` doesn't give good results: when I run further computations from this FISH_final matrix (it is actually a Fisher matrix), the results are not valid.
So I must have made an error somewhere in the code snippet above.
If someone could help me build these common eigenvectors and find the associated eigenvalues, that would be great; I am a little lost among all the potential methods that exist to carry this out.
  2 Comments
David Goodmanson on 4 Jan 2021
Hi petit,
It appears that A and B are known matrices and you are looking for one or more X that are eigenvectors of both A and B. Be aware that
AX = aX,  BX = bX
is not at all the same as the eig(A,B) problem (shown here for a single eigenvalue lambda)
A*X = B*X*lambda
In the second case, for nonsingular B the problem reduces to the standard eigenvalue problem
(inv(B)*A)*X = X*lambda
with one set of eigenvalues, not two.
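A minimal sketch of this distinction, assuming a nonsingular B (illustration only):
n = 5;
A = rand(n);
B = rand(n) + n*eye(n);          % keep B comfortably nonsingular
[Xg, Lg] = eig(A, B);            % generalized problem: A*Xg = B*Xg*Lg
[Xs, Ls] = eig(B\A);             % standard problem for inv(B)*A
sort(diag(Lg)), sort(diag(Ls))   % one set of eigenvalues, and the two agree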
petit on 4 Jan 2021
Edited: petit on 4 Jan 2021
Hi David,
So how could I find the common eigenvectors? I didn't find much documentation on the web about this kind of problem.
I have difficulties building the common eigenvectors. Below is the code, whose method is described with FISH_sp and FISH_xc as the two starting matrices:
% Search for common build eigen vectors between FISH_sp and FISH_xc
[V1,D1] = eig(FISH_sp);
[V2,D2] = eig(FISH_xc);
% Check epsilon (residual of each eigenpair)
for i=1:7
tol=sum(abs(FISH_sp*V1(:,i)-D1(i,i)*V1(:,i)));
tol
tol=sum(abs(FISH_xc*V2(:,i)-D2(i,i)*V2(:,i)));
tol
end
% Sorting
[d1,I1]=sort(diag(D1))
[d2,I2]=sort(diag(D2))
% Identify repeated elements
[~,ia,~]=unique(d1,'stable')
[~,ib,~]=unique(d2,'stable')
% Put the eigenvectors in the same sorted order
W1 = V1(:,I1)
W2 = V2(:,I2)
% Loop for rref
m = zeros(numel(ia), numel(ib));
numel(ia)
numel(ib)
for i=1:numel(ia)
for j=1:numel(ib)
% row-reduce the concatenated eigenspace blocks to look for linear dependencies
[R,p] = rref([W1(:,ia(i):ia(i+1)-1), W2(:,ib(j):ib(j+1)-1)]);
end
end
%%%%%%%%%%%%%% I don't know after loop what to do to build common eigenvectors %%%%%%%%%%%%%
As you can see, I don't know what the next step should be in order to build the matrix of common eigenvectors.
If someone could help me, that would be great.
Regards


Accepted Answer

David Goodmanson on 5 Jan 2021
Edited: David Goodmanson on 5 Jan 2021
Hi petit,
Eigenvectors calculated by Matlab are normalized, but neither (a) the overall phase of each one nor (b) the order of the eigenvalues and the corresponding columns of the eigenvectors is guaranteed to be anything in particular. But if AX = aX and BX = bX, then for
[vA lambdaA] = eig(A)
[vB lambdaB] = eig(B)
there will be a case where a column of vA and a column of vB differ only by a phase factor. So in the matrix product vA'*vB there will be an entry of absolute value 1, the phase factor.
n = 6;
% set up matrices A and B with two eigenvectors in common
v = 2*rand(n,n)-1 +i*(2*rand(n,n)-1); % eigenvectors
w = 2*rand(n,n)-1 +i*(2*rand(n,n)-1);
lamv = rand(n,1); % eigenvalues
lamw = rand(n,1);
% two common eigenvectors
w(:,2) = v(:,3);
w(:,4) = v(:,5);
A = (v*diag(lamv))/v;
B = (w*diag(lamw))/w;
% given A and B, find the common eigenvectors
[vA lamA] = eig(A);
[vB lamB] = eig(B);
vAvB = vA'*vB;
[j k] = find(abs(abs(vAvB)-1)<1e-12)
% show that the jth A eigenvector and kth B eigenvector are proportional
vA(:,j(1))./vB(:,k(1))
vA(:,j(2))./vB(:,k(2))
This method works as long as, for the eigenvectors in question, the eigenvalues are distinct. If there are repeated eigenvalues, then if A has eigenvectors x and y, B might have eigenvectors that are linear combinations of x and y. Then the job gets a lot harder.
You can also use the fact that
(A-B)X = (a-b)X
to look for cases where an eigenvalue of (A-B) equals (a-b), where a is one of the eigenvalues of A and b is one of the eigenvalues of B. However, this method is likely to be more prone to false positives than the first method.
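A minimal sketch of this second check, reusing the A and B constructed in the example above (illustration only; the tolerance is arbitrary):
dAB = eig(A - B);                % eigenvalues of the difference
dA  = eig(A);
dB  = eig(B);
tol = 1e-9;
for ii = 1:numel(dA)
    for jj = 1:numel(dB)
        % does dA(ii) - dB(jj) match some eigenvalue of (A-B)?
        if any(abs(dAB - (dA(ii) - dB(jj))) < tol)
            fprintf('candidate pair: A eigenvalue %d, B eigenvalue %d\n', ii, jj)
        end
    end
end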
  8 Comments
David Goodmanson on 7 Jan 2021
For
[vA lamA] = eig(A)
[vB lamB] = eig(B)
the columns of vA are the eigenvectors (evecs), and the same for vB. You want to find whether any column of vA is the same as some column of vB. All the evecs are normalized to 1, but unfortunately an evec can be multiplied by a phase factor exp(i*theta) (which includes the phase factor -1) and still be an evec. There is no guarantee what eig produces, so an evec of A and an evec of B can differ by a phase factor. So you have a couple of choices.
[1] Compare each evec eA of A with each evec eB of B (n^2 cases), determine the phase factor between them, take out the phase factor, look at the difference eA-eB and decide if it's small enough. But unless both eA and eB are real, the phase factor issue is harder than it looks.
[2]
Use the fact that if eA and eB are almost equal, then since they are normalized their dot product eA'*eB will be close to 1. Here eA' turns the column vector into a row vector; and a row vector times the column vector eB is the scalar dot product. So you need the transpose. Multiplying the matrix vA' by the matrix vB automatically computes all n^2 possible dot products of a column of vA with a column of vB, and you can search the resulting matrix for values near 1.
As far as tolerances go, you have to decide for yourself what is appropriate. In this case 1e-2 gives 1 equality, and as you point out, 1e-1 gives 4 equalities. If you use 1e-12 then nothing is equal to anything, and if you use 1e1 then everything equals everything else. Maybe your data is imprecise enough that there should not be any equalities. It's a judgment call.
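A minimal sketch of option [1] combined with such a tolerance, for one pair of columns of vA and vB (illustration only; the tolerance value is arbitrary):
eA = vA(:,1);  eB = vB(:,1);       % any pair of columns can be tested this way
phase = eB'*eA;                    % ~ exp(i*theta) when eA and eB agree up to a phase
phase = phase/abs(phase);          % keep only the unit phase factor
tol = 1e-6;
same = norm(eA - phase*eB) < tol   % true when the two evecs match within tol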
I'm not sure what to make of null(FISH_sp*FISH_xc-FISH_xc*FISH_sp) having a nonempty null space.
Bruno Luong on 7 Jan 2021
Edited: Bruno Luong on 7 Jan 2021
null(FISH_sp*FISH_xc-FISH_xc*FISH_sp)
returns a basis of the subspace on which the two linear operators commute when restricted to it.
For example, eye(n) and 2*eye(n) (or indeed any matrix) commute, so the NULL above returns n vectors, yet those vectors are not, in general, common eigenvectors of the two operators.
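A small illustration of this caveat: eye(n) commutes with any matrix, so the commutator is exactly zero and null() returns a full basis of R^n, yet those basis vectors are generally not eigenvectors of the other matrix.
n = 5;
M = rand(n);
C = eye(n)*M - M*eye(n);     % exactly the zero matrix
K = null(C);                 % n orthonormal columns (a basis of R^n)
v = K(:,1);
norm(M*v - (v'*M*v)*v)       % generally far from 0, so v is not an eigenvector of M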


More Answers (2)

Bruno Luong on 4 Jan 2021
Edited: Bruno Luong on 4 Jan 2021
K = null(A-B);
[W,D] = eig(K'*A*K);
X = K*W, % common eigenvectors
lambda = diag(D), % common eigenvalues
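A usage sketch of the snippet above (my own illustration): two matrices built so that they share the eigenvector e1 with the same eigenvalue 2.
A = diag([2 3 5 7]);               % e1 is an eigenvector of A with eigenvalue 2
B = A;  B(2:end,2:end) = rand(3);  % perturb only the block that does not touch e1
K = null(A - B);                   % here: the direction of e1
[W,D] = eig(K'*A*K);
X = K*W,                           % recovers e1 up to sign
lambda = diag(D),                  % recovers the shared eigenvalue 2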
  13 Comments
Bruno Luong
Bruno Luong on 4 Jan 2021
Edited: Bruno Luong on 4 Jan 2021
Quote: "I just want common eigen vectors", meaning
AX = BX
This is equivalent to
(A-B)*X = 0
Therefore
X belongs to the span of NULL(A-B).
If NULL returns an empty result then there is NO common eigenvector.
This also implies that the eigenvalues are common; that is a consequence of YOUR request, not because I added it as an extra requirement. If you don't understand that, you do not understand the logic of the math.
Maybe you are redefining the meaning of the word "common"? If so, then indeed I don't understand what you want.
petit on 4 Jan 2021
Edited: petit on 4 Jan 2021
As you will see, I am looking for the expression of the diagonal matrix D' as a function of D1 and D2, which come from eig(FISH_sp) and eig(FISH_xc).
This way, I could get a new Fisher matrix F = P D' P^-1, with P the "common" basis of eigenvectors and D' the diagonal matrix that depends on the eigenvalues of eig(FISH_sp) and eig(FISH_xc).
In particular, I point out that if I want to combine these 2 Fisher informations (also called "cross-correlation"), I can't build a "combined" Fisher matrix directly by summing the 2 diagonal matrices, since the linear combination of random variables is different between the 2 Fisher matrices after diagonalisation.
That's why I want to find this "common" matrix P, depending on P1 and P2 (the eigenvector matrices of FISH_sp and FISH_xc, respectively).
It may be tricky to understand, but there is also some logic in my reasoning, even if I have difficulties expressing the diagonal matrix D' as a function of D1 and D2.



petit on 23 Jan 2021
Edited: petit on 23 Jan 2021
Hi Bruno !
Finally, I have just found a unique common eigenvector coming from :
null(A*B-B*A)
and which is equal to :
ans =
-0.0085
-0.0048
-0.2098
0.9776
-0.0089
-0.0026
0.0109
How could I exploit this single common eigenvector to build an approximate common basis between A and B (everything is relative, which is why I would like to set a tolerance factor)?
I thought about a Gram-Schmidt process to build this common basis, but I am not sure this is correct.
Any suggestion/clue/help is welcome.
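One possible way to do the Gram-Schmidt-style extension mentioned above is a QR factorization; a minimal sketch, with v standing for the single vector returned by null(A*B-B*A) (note that the extra columns are not eigenvectors of A or B, they only complete the basis):
v = null(A*B - B*A);     % the single common direction found above (7x1)
v = v/norm(v);
[Q,~] = qr(v);           % first column of Q is +/- v, the rest are an orthonormal complement
P = [v, Q(:,2:end)];     % candidate basis with v as its first column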
