Matlab vs Mathematica, eigenvectors?
function H = calcHyperlinkMatrix(M)
% M(j,i) == 1 means that page j links to page i
[r, c] = size(M);
outlinks = sum(M, 2);            % number of outgoing links from each page
H = zeros(r, c);
for i = 1:r
    for j = 1:c
        if M(j, i) == 1
            H(i, j) = 1 / outlinks(j);
        end
    end
end
H
function V = pageRank(M)
[V, D] = eigs(M, 1);   % eigenvector for the eigenvalue of largest magnitude (1 for a column-stochastic hyperlink matrix)
V
function R = google(links)
R = pageRank(calcHyperlinkMatrix(links));
R
M = [0 1 1 0 0 0 0 0; 0 0 0 1 0 0 0 0; 0 1 0 0 1 0 0 0; 0 1 0 0 1 1 0 0;
     0 0 0 0 0 1 1 1; 0 0 0 0 0 0 0 1; 1 0 0 0 1 0 0 1; 0 0 0 0 0 1 1 0]
google(M)
ans =
-0.1400
-0.1576
-0.0700
-0.1576
-0.2276
-0.4727
-0.4201
-0.6886
Mathematica:
calculateHyperlinkMatrix[linkMatrix_] := Module[{r, c, H, i, j},
  {r, c} = Dimensions[linkMatrix];
  H = Table[0, {r}, {c}];
  For[i = 1, i <= r, i++,
   For[j = 1, j <= c, j++,
    If[linkMatrix[[j, i]] == 1, H[[i, j]] = 1/Total[linkMatrix[[j]]]]
    ]
   ];
  H
  ]
H = {{0, 0, 0, 0, 0, 0, 1/3, 0},
     {1/2, 0, 1/2, 1/3, 0, 0, 0, 0},
     {1/2, 0, 0, 0, 0, 0, 0, 0},
     {0, 1, 0, 0, 0, 0, 0, 0},
     {0, 0, 1/2, 1/3, 0, 0, 1/3, 0},
     {0, 0, 0, 1/3, 1/3, 0, 0, 1/2},
     {0, 0, 0, 0, 1/3, 0, 0, 1/2},
     {0, 0, 0, 0, 1/3, 1, 1/3, 0}};
R = Eigensystem[H];
VR = {R[[1, 1]], R[[2, 1]]}
PageRank = VR[[2]]
{1, {12/59, 27/118, 6/59, 27/118, 39/118, 81/118, 36/59, 1}}
Matlab and Mathematica don't give the same eigenvector for the eigenvalue 1. Both work, though... which one is correct, and why are they different? How do I get all eigenvectors with the eigenvalue 1?
The definition of an eigenvector: X is an eigenvector of a matrix A if it satisfies AX = kX for some constant k. It is pretty clear from the definition that cX is also an eigenvector for any c not equal to 0, since A(cX) = c(AX) = c(kX) = k(cX). So there is some constant c such that X_matlab = c X_mathematica.
It looks like the first is normalised to Euclidean length 1 (add the squares of the coordinates, then take the square root, and you will get 1), and the second is normalised so that the final coordinate is 1 (an eigenvector was found and then all coordinates were divided by the final coordinate). You can use whichever one you want, if all you need is an eigenvector.
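If you want to see this in MATLAB, here is a minimal sketch (it just re-enters the exact H from the Mathematica part of the question and rescales the eigs output):

% Hyperlink matrix from the question, entered with exact fractions
H = [0 0 0 0 0 0 1/3 0; 1/2 0 1/2 1/3 0 0 0 0; 1/2 0 0 0 0 0 0 0; 0 1 0 0 0 0 0 0;
     0 0 1/2 1/3 0 0 1/3 0; 0 0 0 1/3 1/3 0 0 1/2; 0 0 0 0 1/3 0 0 1/2; 0 0 0 0 1/3 1 1/3 0];
[v, ~] = eigs(H, 1);   % MATLAB's eigenvector: Euclidean length 1, sign arbitrary
norm(v)                % 1: MATLAB's normalisation
v / v(end)             % rescale so the last entry is 1: matches Mathematica's {12/59, 27/118, ...}

Both forms describe the same direction, so either is a valid PageRank vector up to scaling.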
This is because if a vector x is an eigenvector of a matrix H, so is any multiple of x. The vector you quote as the MATLAB answer does not quite check out:
In[41]:= H.matlab - matlab
Out[41]= {-0.0000333333, 0.0000666667, 0., 0., 0.0000333333, 0., \
-0.0000666667, 0.}
But assuming it is close enough, you see that
In[43]:= {12/59, 27/118, 6/59, 27/118, 39/118, 81/118, 36/59,
1}/{-0.1400, -0.1576, -0.0700, -0.1576,
-0.2276, -0.4727, -0.4201, -0.6886}
Out[43]= {-1.45278, -1.45186, -1.45278, -1.45186, -1.45215, -1.45217, \
-1.45244, -1.45222}
consists of almost the same elements. Thus Mathematica's vector is a -1.45 multiple of MATLAB's; the factor is just -1/0.6886, the reciprocal of MATLAB's last coordinate, because Mathematica scaled its vector so that the last coordinate is exactly 1.
Eigenvectors are not necessarily unique. All that is required of an eigenvector is that
- it has unit norm,
- v_m * v_n = 0 for all m ≠ n (orthogonality),
- it satisfies A v_m = u_m v_m, where u_m is the corresponding eigenvalue (a quick check illustrating these properties follows below).
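For a real symmetric matrix, for which MATLAB's eig does return such an orthonormal set, this is easy to verify numerically; a minimal sketch with an arbitrary example matrix:

A = [2 1 0; 1 2 1; 0 1 2];   % arbitrary symmetric example
[V, D] = eig(A);             % columns of V are eigenvectors, diag(D) the eigenvalues
norm(V'*V - eye(3))          % ~0: columns are orthonormal (unit norm, mutually orthogonal)
norm(A*V - V*D)              % ~0: A*v_m = u_m*v_m for every column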
The exact eigenvectors returned depend on the algorithm implemented. As a simple example to demonstrate that one matrix can have two different sets of eigenvectors, consider an NxN identity matrix:

I = 1 0 ... 0
    0 1 ... 0
    ...
    0 0 ... 1
It is obvious (and can be easily confirmed) that each column of I
is an eigenvector and the eigenvalues are all 1.
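In MATLAB this is a one-line check (a trivial sketch):

[V, D] = eig(eye(4))   % both V and D come back as the 4x4 identity:
                       % each standard basis vector is an eigenvector with eigenvalue 1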
I now state that the following vectors

v_m = [1, exp(2*pi*1i*m/N), ..., exp(2*pi*1i*m*(N-1)/N)]';   for m = 1, 2, ..., N

(each divided by sqrt(N) so it has norm 1, as in the code below) form an orthogonal basis set, and hence are also eigenvectors of I. Here 1i refers to the square root of -1 in MATLAB notation. You can verify this for yourself:
N = 50;
% row m of v is the vector v_m above, scaled by 1/sqrt(N): entry (m,k) is exp(-1i*2*pi*m*(k-1)/N)/sqrt(N)
v = 1/sqrt(N) * cumprod(repmat(exp(-1i*2*pi/N*(0:N-1)), N, 1), 1);
imagesc(real(v*v'));   % v*v' should be the NxN identity matrix
Here I've taken the real part because the imaginary part is non-zero (of order 1e-16) due to machine-precision effects, although it should be zero (you can even do this analytically and it is exactly zero); imagesc returns an error otherwise.
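To put numbers on this, you can check (continuing from the snippet above; a small sketch):

max(max(abs(imag(v*v'))))    % on the order of 1e-16: pure rounding error
norm(real(v*v') - eye(N))    % also tiny, so v*v' is the identity up to machine precision

which confirms that these v_m form another orthonormal set of eigenvectors of I.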
So, to sum up: eigenvectors are not necessarily unique, and the MATLAB and Mathematica results convey the same information, just in different representations.