Why does n = n / 2; have O(log(n))?

I know the following code has a complexity of O(log(n)):

while (n>1)
{
    counter++;
    n/=2;
}

I understand that n is halved on each iteration, so if n were 1000 it would take about ten rounds to get out of the loop. How does this lead to O(log(n))?

Sorry for the simple question, I really tried my best to get it before I asked.


Each time through the loop you divide n by 2 (roughly; we can ignore rounding, since this is an asymptotic argument). So if n = N at the start, then after k iterations n = N/2^k. To get down to n = 1 you have to satisfy 2^k = N, that is, k = log2(N).
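
To see this concretely, here is a small C sketch (my own illustration, not part of the original answer) that runs the loop from the question for a few inputs and compares the final counter with log2(n):

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Run the loop from the question for a few inputs and compare
       the number of iterations with log2(n). */
    unsigned long values[] = {8, 1000, 1024, 1000000};
    for (size_t i = 0; i < sizeof values / sizeof values[0]; i++)
    {
        unsigned long n = values[i];
        int counter = 0;
        while (n > 1)
        {
            counter++;
            n /= 2;
        }
        printf("n = %7lu  iterations = %2d  log2(n) = %.2f\n",
               values[i], counter, log2((double)values[i]));
    }
    return 0;
}

The iteration count always lands within one of log2(n), which is exactly the O(log n) behavior in question (link with -lm for log2).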


The recurrence relation would be

 T(n) = T(n/2) + O(1)

Solving it with the Master theorem gives the running time of T(n) as O(log n) (similar to what you get for binary search).
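
Spelling out the Master theorem step (my own expansion, not part of the original answer): here a = 1, b = 2 and f(n) = O(1) = Θ(n^0). Since log_b(a) = log_2(1) = 0, f(n) matches Θ(n^(log_b a)), which is case 2 of the theorem, so T(n) = Θ(n^0 · log n) = Θ(log n).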


Imagine that n is 2^x (e.g. 2^5=32, 2^10=1024 etc), so that the counter is incremented x times within the loop. By definition, x is a base 2 log n.


Logarithms aren't linear; in other words, they change by different amounts depending on the input. In your example, the first step takes n down by 500, while the fifth step reduces it by only about 31. The closer you get to one, the more slowly n decreases. This "deceleration" is exactly the kind of behavior you get with a log.
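
A short C sketch (mine, reusing the n = 1000 example from the question) that prints the decrease at each iteration makes this deceleration visible:

#include <stdio.h>

int main(void)
{
    /* Show how much n shrinks on each halving step, starting from 1000.
       The decrease gets smaller every step, which is why only about
       log2(1000) ~ 10 steps are needed in total. */
    int n = 1000;
    int step = 1;
    while (n > 1)
    {
        int next = n / 2;
        printf("step %2d: %4d -> %4d (decrease of %3d)\n", step, n, next, n - next);
        n = next;
        step++;
    }
    return 0;
}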


A simple hand-wavey explanation: What happens if you double n? Does the runtime double (that would be O(n))? No, the runtime increases by only one step. This is typical of O(log n).

[OTOH, if you square n (say it goes from 4 to 16), you'll find that the number of steps doubles. Again, that's indicative of O(log n).]
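
Here is that experiment as a runnable C sketch (mine, with the question's loop factored into a helper I've called halvings for illustration):

#include <stdio.h>

/* The loop from the question, wrapped in a helper so we can compare
   iteration counts for different inputs. */
static int halvings(unsigned long n)
{
    int counter = 0;
    while (n > 1)
    {
        counter++;
        n /= 2;
    }
    return counter;
}

int main(void)
{
    /* Doubling n adds one iteration; squaring n doubles the count. */
    printf("n = 1024    -> %d iterations\n", halvings(1024));            /* 10 */
    printf("n = 2048    -> %d iterations\n", halvings(2048));            /* 11 */
    printf("n = 1048576 -> %d iterations\n", halvings(1024UL * 1024UL)); /* 20 */
    return 0;
}

Going from 1024 to 2048 adds a single iteration, while going from 1024 to 1024^2 doubles the count from 10 to 20.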
