
Adding a log in asymptotic analysis

Have a problem I'm trying to work through and would very much appreciate some assistance! What's the time complexity of...

for (int j = 1; j <= n; j++) {
    int k = j;
    while (k < n) {
        sum += a[k] * b[k];
        k += log n;   // inner index advances by log n, not by 1
    }
}

The outer for loop runs n times. I'm not sure how to deal with the k += log n step in the inner loop. My thought is that it's O(n^2): adding log(n) to k isn't quite getting an additional n loops, but I think it's less than what O(n*log n) would be. Obviously that's just a guess, and any help in figuring out how to show it mathematically would be greatly appreciated!


You can treat log(n) as a constant here, sort of.

Each iteration of the outer loop performs a constant amount of work (sum += ...; k += ...) at most n/log(n) times, since the inner index advances by log(n) each time. There are n iterations of the outer loop, so the total work is n^2 / log(n).
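
A quick way to sanity-check that figure is to just count the inner-loop executions and compare them against n^2/log n. Below is a small sketch in C (the step ceil(log2 n) and the helper name count_inner_iterations are my own choices; the base of the log only changes constants). If the total really grows like n^2/log n, the printed ratio should settle near a constant (about 1/2, matching the triangle picture below).

#include <math.h>
#include <stdio.h>

/* Count how many times the inner-loop body runs for a given n.
 * The step is taken as ceil(log2(n)); the base of the log only
 * changes the count by a constant factor. */
static long long count_inner_iterations(long long n)
{
    long long step = (long long)ceil(log2((double)n));
    if (step < 1) step = 1;          /* guard against very small n */
    long long count = 0;
    for (long long j = 1; j <= n; j++) {
        for (long long k = j; k < n; k += step) {
            count++;                 /* stands in for sum += a[k] * b[k] */
        }
    }
    return count;
}

int main(void)
{
    for (long long n = 1000; n <= 64000; n *= 2) {
        long long c = count_inner_iterations(n);
        double predicted = (double)n * (double)n / log2((double)n);
        printf("n = %6lld  count = %10lld  count / (n^2 / log2 n) = %.3f\n",
               n, c, (double)c / predicted);
    }
    return 0;
}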

Any time you see a bunch of operations like so:

 ---------------------b-------------------------
 |O(blah) + O(blah) + O(blah) + O(blah) + O(blah)
 |O(blah) + O(blah) + O(blah) + O(blah)     .
a|O(blah) + O(blah) + O(blah)               .
 |O(blah) + O(blah)                         .
 |O(blah)     .         .         .         .

That totals a*b * O(blah) -- just picture the full rectangle (where I put the .s). The triangle of work is a constant fraction of that a-by-b rectangle (half of it), hence the O(a*b).

In the above case, b = n, a = n/log(n), and O(blah) = O(1) (the body of the inner loop).
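
In symbols, the same picture (a small sketch, assuming the row lengths shrink roughly linearly from b down to 0 over the a rows):

\sum_{i=1}^{a} \Bigl(b - (i-1)\tfrac{b}{a}\Bigr) \cdot O(1) \;=\; \frac{b(a+1)}{2} \cdot O(1) \;=\; O(a b)

With a = n/log(n) and b = n, that is O(n^2/log n) again.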


You can quite easily "just sum it up":

The outer loop has, as you said, n steps. For a given j, the inner loop has roughly (n-j) / log n steps (it starts at k = j and runs while k < n).

That's (roughly):

 n
---
\     (n-j)             n*n
/   -------- = ... = ---------
___  (log n)          2*log(n)
j=1

So, it's O((n^2)/log(n)) in total.
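
The elided middle step is just an arithmetic series, with log n coming out as a constant factor:

\sum_{j=1}^{n} \frac{n-j}{\log n} \;=\; \frac{1}{\log n}\sum_{j=1}^{n}(n-j) \;=\; \frac{1}{\log n}\cdot\frac{n(n-1)}{2} \;\approx\; \frac{n^2}{2\log n}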


You can proceed formally like the following:
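
For example, write s = \lceil \log n \rceil for the inner step (a sketch of one such argument, with the ceilings made explicit; the base of the log only changes constants). The inner loop runs \lceil (n-j)/s \rceil times for each j, so the total count T(n) satisfies

T(n) \;=\; \sum_{j=1}^{n} \left\lceil \frac{n-j}{s} \right\rceil \;\le\; \sum_{j=1}^{n}\left(\frac{n-j}{s}+1\right) \;=\; \frac{n(n-1)}{2s}+n \;=\; O\!\left(\frac{n^2}{\log n}\right)

and

T(n) \;\ge\; \sum_{j=1}^{n} \frac{n-j}{s} \;=\; \frac{n(n-1)}{2s} \;=\; \Omega\!\left(\frac{n^2}{\log n}\right),

so T(n) = \Theta(n^2 / \log n).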
