
vectorizing loops in Matlab - performance issues

This question is related to these two:

Introduction to vectorizing in MATLAB - any good tutorials?

filter that uses elements from two arrays at the same time

Based on the tutorials I read, I was trying to vectorize a procedure that takes a very long time.

I've rewritten this:

function B = bfltGray(A,w,sigma_r)
dim = size(A);
B = zeros(dim);
for i = 1:dim(1)
    for j = 1:dim(2)

        % Extract local region.
        iMin = max(i-w,1);
        iMax = min(i+w,dim(1));
        jMin = max(j-w,1);
        jMax = min(j+w,dim(2));
        I = A(iMin:iMax,jMin:jMax);

        % Compute Gaussian intensity weights.
        F = exp(-0.5*(abs(I-A(i,j))/sigma_r).^2);
        B(i,j) = sum(F(:).*I(:))/sum(F(:));

    end
end

into this:

function B = rngVect(A, w, sigma)
W = 2*w+1;
I = padarray(A, [w,w],'symmetric');
I = im2col(I, [W,W]);
H = exp(-0.5*(abs(I-repmat(A(:)', size(I,1),1))/sigma).^2);
B = reshape(sum(H.*I,1)./sum(H,1), size(A, 1), []);

Where

A is a 512x512 matrix,

w is half of the window size, usually equal to 5,

sigma is a parameter in the range [0, 1] (usually 0.1, 0.2 or 0.3).

So the I matrix would have 512x512x121 = 31,719,424 elements.
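For reference, a minimal way to check that the two versions agree away from the borders (the border handling differs: the loop clips the window, while the vectorized version pads symmetrically) would be something like this, on a smaller test image:

% Sketch: compare the loop and the vectorized version on the interior only,
% since their border handling differs (clipping vs. symmetric padding).
A = rand(64); w = 5; sigma = 0.1;
B1 = bfltGray(A, w, sigma);
B2 = rngVect(A, w, sigma);
in = w+1 : 64-w;                             % interior pixels, full window in both versions
err = max(max(abs(B1(in,in) - B2(in,in))))   % should be ~0 (up to rounding)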

But this version seems to be just as slow as the first one, and on top of that it uses a lot of memory and sometimes runs into memory problems.

I suppose I've done something wrong - probably a logic mistake in the vectorization. In fact I'm not surprised: this approach creates really big matrices, so the computation probably takes proportionally longer.

I have also tried to write it using nlfilter (similar to the second solution given by Jonas), but that seems hard since I use Matlab 6.5 (R13), which has no sophisticated function handles available.
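For completeness, on a newer release (with anonymous function handles and the Image Processing Toolbox) an nlfilter version might look roughly like the sketch below; note that nlfilter zero-pads the borders, so the edges would differ from the loop version:

% Sketch only - needs anonymous function handles, so it will not run on R13.
% x is a (2w+1)x(2w+1) block passed in by nlfilter; x(w+1,w+1) is its centre pixel.
w = 5; sigma_r = 0.1;
f = @(x) sum(sum(exp(-0.5*((x - x(w+1,w+1))/sigma_r).^2) .* x)) / ...
         sum(sum(exp(-0.5*((x - x(w+1,w+1))/sigma_r).^2)));
B = nlfilter(A, [2*w+1 2*w+1], f);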

So once again, I'm not asking for a ready solution, but for some ideas that would help me solve this in reasonable time. Maybe you can point out what I did wrong.

Edit:

As Mikhail suggested, the results of profiling are as follows:

65% of the time was spent in the line H = exp(...)

25% of the time was used by im2col
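(For reference, such a breakdown can be obtained with the built-in profiler, along these lines:)

profile on
B = rngVect(A, 5, 0.1);
profile report    % generates the profiling report; newer releases use "profile viewer"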


How big are I and H (i.e. numel(I)*8 bytes)? If you start paging, then the performance of your second solution is going to be affected very badly.
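To put numbers on that (assuming double precision, 8 bytes per element, and the 512x512, w = 5 case from the question):

nElems = 512*512*(2*5+1)^2;                        % 31,719,424 elements in I (and again in H)
fprintf('about %.0f MB per matrix\n', nElems*8/2^20);   % roughly 242 MB each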

To test whether you really have a problem due to too large arrays, you can try and measure the speed of the calculation using tic and toc for arrays A of increasing size. If the execution time increases faster than by the square of the size of A, or if the execution time jumps at some size of A, you can try and split the padded I into a number of sub-arrays and perform the calculations like that.
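A minimal version of that timing test (a sketch, using random test images and the parameter values from the question) could be:

% Time the vectorized version for growing input sizes and watch how it scales.
sizes = [64 128 256 512];
t = zeros(size(sizes));
for k = 1:length(sizes)
    A = rand(sizes(k));
    tic;
    rngVect(A, 5, 0.1);
    t(k) = toc;
end
disp([sizes(:).^2 t(:)])   % execution time vs. number of pixels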

Otherwise, I don't see any obvious places where you could be losing lots of time. Well, maybe you could skip the reshape, by replacing B with A in your function (saves a little memory as well), and writing A(:) = sum(H.*I,1)./sum(H,1);
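That modification might look like this (a sketch of the changed function only; the name rngVect2 is just for illustration):

function A = rngVect2(A, w, sigma)
W = 2*w+1;
I = padarray(A, [w,w],'symmetric');
I = im2col(I, [W,W]);
H = exp(-0.5*(abs(I-repmat(A(:)', size(I,1),1))/sigma).^2);
A(:) = sum(H.*I,1)./sum(H,1);   % same column order as A(:), so no reshape is needed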

You may also want to look into upgrading to a more recent version of Matlab - they've worked hard on improving performance.
