
Eigenvector computation using OpenCV

I have this matrix A, representing similarities of pixel intensities of an image. For example: Consider a 10 x 10 image. Matrix A in this case would be of dimension 100 x 100, and element A(i,j) would have a value in the range 0 to 1, representing the similarity of pixel i to j in terms of intensity.
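For concreteness, one typical way to fill such a matrix is a Gaussian kernel on intensity differences (illustration only; the exact similarity measure is not important for this question, and intensity[] and sigma below are just placeholder names):

//Illustration only: Gaussian kernel on pixel intensity differences (needs <math.h>)
//intensity[] holds the 100 pixel values of the 10 x 10 image, scaled to [0,1]
double sigma = 0.1;   //kernel width, to be tuned
for(i = 0; i < 100; i++){
  for(j = 0; j < 100; j++){
    double d = intensity[i] - intensity[j];
    A[i][j] = exp(-(d * d) / (2.0 * sigma * sigma));
  }
}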

I am using OpenCV for image processing and the development environment is C on Linux.

The objective is to compute the eigenvectors of matrix A, and I have used the following approach:

static CvMat mat, *eigenVec, *eigenVal;
static double A[100][100]={}, Ain1D[10000]={};
int cnt=0;

//Converting matrix A into a one dimensional array
//Reason: That is how cvMat requires it
for(i = 0; i < affnDim; i++){
  for(j = 0; j < affnDim; j++){
    Ain1D[cnt++] = A[i][j];
  }
}

mat = cvMat(100, 100, CV_32FC1, Ain1D); 

cvEigenVV(&mat, eigenVec, eigenVal, 1e-300);

for(i = 0; i < 100; i++){
  val1 = cvmGet(eigenVal, i, 0); //Fetching Eigen Value

  for(j = 0; j < 100; j++){
    matX[i][j] = cvmGet(eigenVec, i, j); //Fetching each component of Eigenvector i
  }
}

Problem: After execution I get nearly all components of all the eigenvectors as zero. I tried different images and also tried populating A with random values between 0 and 1, but got the same result.

Few of the top eigenvalues returned look like the following:

9805401476911479666115491135488.000000  
-9805401476911479666115491135488.000000  
-89222871725331592641813413888.000000  
89222862280598626902522986496.000000  
5255391142666987110400.000000

I am now thinking along the lines of using cvSVD(), which performs singular value decomposition of a real floating-point matrix and might yield the eigenvectors. But before that I thought I'd ask here. Is there anything absurd in my current approach? Am I using the right API, i.e. cvEigenVV(), for the right input matrix (my matrix A is a floating-point matrix)?
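For reference, this is roughly how I imagine the cvSVD() route would look (untested sketch; since A is symmetric, its singular vectors should coincide with its eigenvectors up to sign):

//Untested sketch of the cvSVD() alternative
CvMat *Amat = cvCreateMat(100, 100, CV_32FC1);
CvMat *W    = cvCreateMat(100, 1,   CV_32FC1);   //singular values (sorted, descending)
CvMat *U    = cvCreateMat(100, 100, CV_32FC1);   //left singular vectors, stored as columns

for(i = 0; i < 100; i++)
  for(j = 0; j < 100; j++)
    cvmSet(Amat, i, j, A[i][j]);                 //copy the double array into the float matrix

cvSVD(Amat, W, U, NULL, 0);                      //A = U * diag(W) * V^T

//column k of U would then be the (unit-length) vector for singular value W(k)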

cheers


Note to readers: This post at first may seem unrelated to the topic, but please refer to the discussion in the comments above.

The following is my attempt at implementing the Spectral Clustering algorithm applied to image pixels in MATLAB. I followed exactly the paper mentioned by @Andriyev:

Andrew Ng, Michael Jordan, and Yair Weiss (2002). On spectral clustering: analysis and an algorithm. In T. Dietterich, S. Becker, and Z. Ghahramani (Eds.), Advances in Neural Information Processing Systems 14. MIT Press

The code:

%# parameters to tune
SIGMA = 2e-3;       %# controls Gaussian kernel width
NUM_CLUSTERS = 4;   %# specify number of clusters

%% Loading and preparing a sample image
%# read RGB image, and make it smaller for fast processing
I0 = im2double(imread('house.png'));
I0 = imresize(I0, 0.1);
[r,c,~] = size(I0);

%# reshape into one row per-pixel: r*c-by-3
%# (with pixels traversed in column-wise order)
I = reshape(I0, [r*c 3]);

%% 1) Compute affinity matrix
%# for each pair of pixels, apply a Gaussian kernel
%# to obtain a measure of similarity
A = exp(-SIGMA * squareform(pdist(I,'euclidean')).^2);

%# and we plot the matrix obtained
imagesc(A)
axis xy; colorbar; colormap(hot)

%% 2) Compute the Laplacian matrix L
D = diag( 1 ./ sqrt(sum(A,2)) );
L = D*A*D;

%% 3) perform an eigendecomposition of the Laplacian matrix L
[V,d] = eig(L);

%# Sort the eigenvalues and the eigenvectors in descending order.
[d,order] = sort(real(diag(d)), 'descend');
V = V(:,order);

%# keep only the largest k eigenvectors
%# In this case 4 vectors are enough to explain 99.999% of the variance
NUM_VECTORS = sum(cumsum(d)./sum(d) < 0.99999) + 1;
V = V(:, 1:NUM_VECTORS);

%% 4) renormalize rows of V to unit length
VV = bsxfun(@rdivide, V, sqrt(sum(V.^2,2)));

%% 5) cluster rows of VV using K-Means
opts = statset('MaxIter',100, 'Display','iter');
[clustIDX,clusters] = kmeans(VV, NUM_CLUSTERS, 'options',opts, ...
    'distance','sqEuclidean', 'EmptyAction','singleton');

%% 6) assign pixels to cluster and show the results
%# assign for each pixel the color of the cluster it belongs to
clr = lines(NUM_CLUSTERS);
J = reshape(clr(clustIDX,:), [r c 3]);

%# show results
figure('Name',sprintf('Clustering into K=%d clusters',NUM_CLUSTERS))
subplot(121), imshow(I0), title('original image')
subplot(122), imshow(J), title({'clustered pixels' '(color-coded classes)'})

... and using a simple house image I drew in Paint, the results were:

(images: the plotted affinity matrix, and the original image next to the color-coded clustering result)

and by the way, the first 4 eigenvalues used were:

1.0000
0.0014
0.0004
0.0002

and the corresponding eigenvectors [columns of length r*c=400]:

-0.0500    0.0572   -0.0112   -0.0200
-0.0500    0.0553    0.0275    0.0135
-0.0500    0.0560    0.0130    0.0009
-0.0500    0.0572   -0.0122   -0.0209
-0.0500    0.0570   -0.0101   -0.0191
-0.0500    0.0562   -0.0094   -0.0184
......

Note that there are steps performed above which you didn't mention in your question (building the Laplacian matrix, and normalizing the rows of the eigenvector matrix).
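For reference, written out explicitly, those two steps are the normalized Laplacian from step 2 and the row normalization from step 4:

$$L = D^{-1/2} A D^{-1/2}, \qquad D_{ii} = \sum_j A_{ij}$$

$$Y_{ij} = \frac{V_{ij}}{\big(\sum_k V_{ik}^2\big)^{1/2}}$$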


I would recommend this article. The author implements Eigenfaces for face recognition. On page 4 you can see that he uses cvCalcEigenObjects to generate the eigenvectors from an image. In the article, the whole preprocessing pipeline necessary for these computations is shown.
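As a rough idea of what that call looks like (a sketch of the legacy OpenCV 1.x C API, not code from the article; the names nTrainFaces, faceImgArr, eigenVectArr, nEigens, avgImg and eigenVals are placeholders):

//Sketch only: legacy cvCalcEigenObjects() from the OpenCV 1.x C API
CvTermCriteria calcLimit = cvTermCriteria(CV_TERMCRIT_ITER, nEigens, 1);

cvCalcEigenObjects(
    nTrainFaces,            //number of input images
    (void*)faceImgArr,      //array of IplImage* training images
    (void*)eigenVectArr,    //array of IplImage* that receives the eigenvectors
    CV_EIGOBJ_NO_CALLBACK,  //no I/O callbacks, everything kept in memory
    0, 0,                   //buffer size and user data (unused here)
    &calcLimit,             //stop after nEigens eigenvectors
    avgImg,                 //IplImage* that receives the average image
    eigenVals);             //float* that receives the eigenvalues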


Here's a not very helpful answer:

What does theory (or maths scribbled on a piece of paper) tell you the eigenvectors ought to be? Approximately.

What does another library tell you the eigenvectors ought to be? Ideally, what does a system such as Mathematica or Maple (which can be persuaded to compute to arbitrary precision) tell you the eigenvectors ought to be? If not for a production-sized problem, at least for a test-sized problem.

I'm not an expert in image processing, so I can't be much more helpful, but I spend a lot of time with scientists, and experience has taught me that a lot of tears and anger can be avoided by doing some maths first and forming an expectation of what results you ought to get, before wondering why you got zeros all over the place. Sure, it might be an error in the implementation of the algorithm, or it might be loss of precision or some other numerical problem. But you don't know yet, and you shouldn't follow up those lines of inquiry yet.
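For example, a tiny sanity check along these lines (just a sketch, not tested) lets you compare cvEigenVV() against numbers you can work out by hand:

#include <stdio.h>
#include <cv.h>   //legacy OpenCV C API

int main(void)
{
    //A 2 x 2 symmetric matrix whose spectrum is easy to do on paper:
    //[[2,1],[1,2]] has eigenvalues 3 and 1, eigenvectors (1,1)/sqrt(2) and (1,-1)/sqrt(2)
    float a[] = { 2.0f, 1.0f,
                  1.0f, 2.0f };
    CvMat M = cvMat(2, 2, CV_32FC1, a);

    CvMat *evecs = cvCreateMat(2, 2, CV_32FC1);  //eigenvectors, one per row
    CvMat *evals = cvCreateMat(2, 1, CV_32FC1);  //eigenvalues, descending
    int i;

    cvEigenVV(&M, evecs, evals, 1e-10);          //note: cvEigenVV may overwrite its input

    for(i = 0; i < 2; i++)
        printf("lambda_%d = %f   v_%d = (%f, %f)\n", i,
               cvmGet(evals, i, 0), i, cvmGet(evecs, i, 0), cvmGet(evecs, i, 1));

    cvReleaseMat(&evecs);
    cvReleaseMat(&evals);
    return 0;
}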

Regards

Mark
