Linear complexity and quadratic complexity
I'm just not sure...
If you have code that can be executed with either of the following complexities:
- A sequence of O(n) operations, for example two O(n) passes in sequence
- O(n²)
The preferred version would be the one that runs in linear time. But could there be a case where the sequence of O(n) operations would cost too much and O(n²) would be preferred? In other words, is the statement C x O(n) < O(n²) always true for any constant C?
Why or why not? What factors would affect the condition and make it better to choose the O(n²) algorithm?
I think there are two issues here: first, what the notation says; second, what you would actually measure on real programs.
Big O is defined as a limit as n -> infinity, so in terms of big O, O(n) < O(n²) is always true regardless of any finite constants.
As others have pointed out, real programs only ever deal with some finite input, so it is quite possible to pick a small enough value of n such that C*n > n², i.e. C > n. However, strictly speaking, you are then no longer dealing with big O.
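A minimal sketch of that crossover, using an illustrative cost model in Python (the constant c = 50 is an arbitrary choice for demonstration, not something from the question):

```python
# Compare c*n (linear with a large constant) against n*n for a few input sizes.
# c = 50 is an illustrative constant, not taken from the question.
c = 50

for n in [1, 10, 25, 50, 75, 100, 200]:
    linear = c * n      # cost model for the "sequence of O(n)" version
    quadratic = n * n   # cost model for the O(n²) version
    cheaper = "O(n²)" if quadratic < linear else "O(n)"
    print(f"n={n:>4}: c*n={linear:>6}, n²={quadratic:>6} -> {cheaper} is cheaper")

# For n < c the quadratic version does less work; the crossover is at n = c,
# and beyond that the linear version wins by an ever-growing margin.
```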
If your constant C is greater than your value of n, then the O(n²) algorithm would be better.
There is always an implied constant in O notation, so yes, it's possible that for sufficiently small n, the O(n²) algorithm may be faster than the O(n) one. This would happen if the constant for the O(n) algorithm were much larger than the constant for the O(n²) algorithm.
C x O(n) < O(n²) is NOT always true; there is a point in n where the condition reverses.
When C is large and n is small, C x n > n². However, C is a constant, so once n grows large enough, C x n < n².
Therefore, when n is large, O(n) is always better than O(n²).
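To make the reversal concrete with illustrative numbers (not taken from the answers above): suppose C = 1000. At n = 100, C x n = 100,000 while n² = 10,000, so the quadratic algorithm does less work. At n = 1,000,000, C x n = 10⁹ while n² = 10¹², so the linear algorithm wins by three orders of magnitude, and the gap only widens as n grows.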