
Big-Oh, Consequence of a Definition

I have spent a lot of time reading questions and answers about Big-Oh both here and on math.stackexchange, and it seems that this is the best place for it, as math.stackexchange doesn't seem to like questions of this sort. I have been given some coursework at uni on my CS course that I don't fully understand, and I was hoping you could help. I understand that "homework" questions are slightly frowned upon here, so I have chosen another example that is not part of my coursework but is of a similar style.

So here is the definition that I have been given in the notes:

[Image: Definition 2.5 (Big-Oh) from the course notes]
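
[The image is not reproduced here; judging from how the answers below use it, Definition 2.5 presumably states the standard condition:]

\[
  f(n) \in O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \ \text{such that}\ f(n) \le c \cdot g(n)\ \text{for all}\ n \ge n_0 .
\]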

And the question I have been given is:

Using Definition 2.5, show that if f(n) is O(g(n)) then k + f(n) is also O(g(n)).

I have spent three days searching the web for any kind of answer to problems like these. Looking at Definition 2.5, it says f(n) is O(g(n)) and k + f(n) is O(g(n)). That's enough for me, but it seems I have to prove how that is derived. I thought at first that it should be done by induction, but I have since decided against that; there must be a simpler way.

Any help would be appreciated. I don't expect someone to just give me the answer outright; I would prefer either a methodology or a reference to where I can learn the technique for doing this. Could I remind you again that this is not my actual coursework but a question of a similar style.

Thanks in Advance.


Suppose f(n) is O(g(n)).
Then there exist a c and a k' such that for all n > k': f(n) <= c*g(n).
Now consider f(n) + k.
Let d be such that k <= d*g(n) for all n > k',
which is possible because k is in O(1) (assuming, as usual for complexity functions, that g(n) >= 1 for all n > k').
Then, for all n > k':
f(n) + k <= c*g(n) + d*g(n) = (c + d)*g(n).
Now apply the definition again with c + d in place of c, and you get that f + k is in O(g).
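
To make the construction above concrete, here is a small numerical sanity check. Everything in it (f, g, k, the witnesses c and d, and the threshold) is a made-up example, not anything from the coursework:

# Sanity check of the (c + d) construction with hypothetical choices:
# f(n) = 3n + 5 is O(n) with witness c = 4 for n > 5, and the constant k = 7
# satisfies k <= d*g(n) with d = 7 for every n >= 1.
def f(n):
    return 3 * n + 5

def g(n):
    return n

k = 7
c, d = 4, 7        # witnesses: f(n) <= c*g(n) and k <= d*g(n)
threshold = 5      # the "k'" of the proof; both bounds hold for n > 5

for n in range(threshold + 1, 10_000):
    assert f(n) <= c * g(n)
    assert k <= d * g(n)
    assert f(n) + k <= (c + d) * g(n)   # the combined bound from the answer

print("f(n) + k <= (c + d)*g(n) held for every n checked")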


f(n) <= c*g(n) for all n >= n0

k + f(n) <= k*g(n) + c*g(n) = c'*g(n) where c' = c + k
(assuming g(n) >= 1 for n >= n0, as is standard for complexity functions, so that k <= k*g(n))

so k + f(n) is O(g(n))
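
Note that some positive lower bound on g(n) is genuinely needed here, otherwise the additive constant can dominate. A throwaway illustration, not from the original question:

\[
  f(n) = g(n) = \frac{1}{n}, \quad k = 1: \qquad
  f(n) \in O(g(n)), \ \text{but} \ \frac{k + f(n)}{g(n)} = n + 1 \to \infty,
  \ \text{so} \ k + f(n) \notin O(g(n)).
\]

For the functions that appear in running-time analysis, g(n) >= 1 eventually, so this never comes up.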


k is O(1) and f(n) is O(g(n)); if you sum these, you get O(1 + g(n)), which is O(g(n)).

So if f(n) is O(g(n)), then k + f(n) is also O(g(n)), because of the rule written in your book:

Ignore addition of a constant

Constants are always ignored because they can't change the Big-O class; any constant is O(1) in Big-O notation.
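
Spelled out with the same kind of witnesses as in the definition (and again assuming g(n) >= 1 for all sufficiently large n), the sum rule this answer relies on is, roughly:

\[
  f_1 \in O(g_1),\ f_2 \in O(g_2) \implies f_1 + f_2 \in O(g_1 + g_2)
  \quad (\text{take } c = c_1 + c_2,\ n_0 = \max(n_{0,1}, n_{0,2})),
\]
\[
  \text{so}\ k + f(n) \in O(1 + g(n)) \subseteq O(g(n)),
  \ \text{since}\ 1 + g(n) \le 2\,g(n)\ \text{whenever}\ g(n) \ge 1.
\]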


For what it's worth, this is a somewhat contrived definition of big-O notation. The more general and, in my opinion, more intuitive definition is that f(n) ~ O(g(n)) as n -> a iff limsup |f(n)/g(n)| <= A as n -> a for some finite real number A (limsup rather than a plain limit, since the ratio need not converge).

The important part is that a limit context is required. In CS, that limit is taken implicitly to be infinity (since this is what n tends to as the problem size increases), but in principle it can be anything. For example, sin(x) ~ O(x) as x->0 (in fact, it's exactly asymptotic to x; this is the small angle approximation).
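
As a quick numerical illustration of the x -> 0 example (just a throwaway check, nothing rigorous):

# sin(x) ~ O(x) as x -> 0: the ratio |sin(x)/x| stays bounded near the limit
# point 0 (in fact it tends to 1, which is the small-angle approximation).
import math

for x in [1.0, 0.1, 0.01, 0.001, 1e-6]:
    ratio = abs(math.sin(x) / x)
    print(f"x = {x:<8}: |sin(x)/x| = {ratio:.6f}")
    assert ratio <= 1.0   # bounded, with A = 1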
