Is log(n) = Ω(n)?
I believe it's not. The definition is:
log(n) >= c*n for some n = x, and for all n > x
The reason I think it's not is that the rate of growth of c*n is c, while the rate of growth of log(n) is 1/n. So, as n -> infinity, the rate of growth of log(n) approaches 0, whereas c, the rate of growth of c*n, stays constant. Given that log(n) eventually grows slower than any c*n with c > 0, c*n will outgrow log(n).
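If I try to make that rate-of-growth argument precise, I think it comes down to this limit (a sketch, assuming L'Hôpital's rule applies):

$$\lim_{n\to\infty}\frac{\log n}{c\,n} \;=\; \lim_{n\to\infty}\frac{1/n}{c} \;=\; 0 \quad\text{for every constant } c>0,$$

so past some point c*n is strictly larger than log(n), which is the opposite of what log(n) >= c*n would require.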
So, a few questions.
- Can I assume c > 0 from the definition of big omega?
- Is my above intuition correct?
- I'm conflicted about my proof above. For very small c's, cn meets log(n) very early, and my argument above implies that they would intersect again, which means log(n) = cn has more than one solution, which seems wrong.
I'm very confused and appreciate the help!
1- c cannot be 0 or negative, so you can assume that.

2- The growth of log(n) is lower than the growth of n for every n > 1, for example. As Ω(n) is the set of functions that grow at least as fast as f(n) = n, log(n) is not Ω(n). But you could say n = Ω(log(n)), although that would not be an asymptotically tight bound.

3- The definition states that the inequality only has to hold starting from some value n0. If such an n0 exists, you can say that. But in this case (log(n) = Ω(n)) it doesn't, because the inequality must hold for every n >= n0, and for any large value of n, log(n)'s growth is lower than n's growth.
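A quick numerical illustration of point 3 (a sketch, not a proof; the search bound of 100000 is arbitrary): for each candidate constant c, find the last n at which log(n) >= c*n still holds. Beyond that point c*n stays above log(n), so no valid n0 exists.

```python
import math

# For each candidate constant c, find the last n (up to an arbitrary bound)
# at which log(n) >= c*n still holds. Past that point c*n stays above
# log(n), so no n0 can make "log(n) >= c*n for all n >= n0" true.
def last_n_where_log_wins(c, limit=100_000):
    last = None
    for n in range(1, limit + 1):
        if math.log(n) >= c * n:
            last = n
    return last

for c in (0.5, 0.1, 0.01, 0.001):
    print(f"c = {c}: last n with log(n) >= c*n is {last_n_where_log_wins(c)}")
```

For c = 0.5 the inequality never holds at all (it prints None), and for smaller c the last such n just moves further out; it never extends to all large n.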
No, actually the opposite is true.
- A function with a running time of log(n) is O(n) [read: upper-bounded by some function whose highest-order term is n], and a function with a running time of n is Ω(log(n)) [read: lower-bounded by some function whose highest-order term is log(n)].
Regarding your intuition, it's essentially correct. For small enough c, log(n) and cn can actually intersect twice, but after the last intersection cn stays above log(n) forever, so log(n) >= c*n cannot hold for all n beyond any n0.
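For completeness, an explicit pair of constants witnessing n = Ω(log(n)) (using the natural log; a different base only changes the constant):

$$n \;\ge\; \log n \quad\text{for all } n \ge 1,$$

so c = 1 and n0 = 1 satisfy the definition, while no choice of c > 0 and n0 works in the other direction.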