Complexity analysis of an algorithm
Here is code which fills a two-dimensional array with randomly generated numbers in the range [1, 19] without duplicates. My question is: how do I determine its complexity?
For example, I see that its running time is at least O(n^2) because of its inner and outer loops, but what about the goto statement?
Here is my code:
#include <iostream>
#include <set>
#include <cstdlib>
using namespace std;

int main()
{
    int min = 1;
    int max = 19;
    int a[3][3];
    set<int> b;   // values already used, to avoid duplicates
    for (int i = 0; i < 3; i++)
    {
        for (int j = 0; j < 3; j++)
        {
        loop:
            int m = min + rand() % (max - min + 1);   // random number in [min, max]
            if (b.find(m) == b.end())
            {
                a[i][j] = m;
                b.insert(m);
            }
            else
                goto loop;   // duplicate, draw again
        }
    }
    for (int i = 0; i < 3; i++)
    {
        for (int j = 0; j < 3; j++)
            cout << a[i][j] << " ";
        cout << endl;
    }
    return 0;
}
I would say that the complexity of the algorithm is c*O(n^2), where c is some constant, because whenever it finds a duplicated element inside the loops it repeats the random number generation, and that takes some constant time. Am I right?
As the likelihood of getting a usable number decreases, the number of goto retries increases. For a uniform random number generator, the behaviour is linear with respect to the number of values you have to generate. It definitely doesn't just add a constant to your complexity.
If n is the number of elements in a, then on average it scales with O(n²) (or, if n is the number of rows of the square matrix a, with O(n⁴)).
A much simpler implementation would be to use a Fisher-Yates shuffle.
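As a rough sketch of that idea (the [1, 19] range and 3×3 size are taken from the question; the shuffle below is hand-rolled, but std::shuffle from <algorithm> would do the same job): put all candidate values in a pool, shuffle it, and take the first nine values, which are distinct by construction.

#include <iostream>
#include <vector>
#include <algorithm>
#include <cstdlib>
#include <ctime>
using namespace std;

int main()
{
    srand(static_cast<unsigned>(time(0)));

    // Pool of candidate values 1..19.
    vector<int> pool;
    for (int v = 1; v <= 19; v++)
        pool.push_back(v);

    // Fisher-Yates shuffle: O(n) in the pool size, no retries needed.
    for (int i = (int)pool.size() - 1; i > 0; i--)
    {
        int j = rand() % (i + 1);   // random index in [0, i]
        swap(pool[i], pool[j]);
    }

    // The first 9 shuffled values are distinct by construction.
    int a[3][3];
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            a[i][j] = pool[i * 3 + j];

    for (int i = 0; i < 3; i++)
    {
        for (int j = 0; j < 3; j++)
            cout << a[i][j] << " ";
        cout << endl;
    }
    return 0;
}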
It's O(infinity). The O notation gives an upper bound. Because of your use of rand() in a loop, there's no guarantee that you will make progress. Therefore, no upper bound exists.
[edit] Ok, people also want other complexities than the conventional, worst-case complexity.
The worst-case complexity is obtained by assuming that the RNG generates an infinite series of ones; that means even the first loop iteration never finishes. Therefore there is no finite upper bound on the run time: O(infinity).
The best-case complexity is obtained by assuming that the RNG generates sequential numbers. That means the cost of each iteration is O(log N) (set::find), and there are O(N)*O(N) iterations, so the upper bound is O(N² log N).
The average-case complexity is harder. Assuming that max = k*N*N for some k > 1, the RNG will successfully pick an "unused" number in O(1) time. Even after N*N numbers are chosen, there are still (k-1)*N*N unused numbers, so the chance p of picking an unused number is p >= ((k-1)*N*N) / (k*N*N) <=> p >= (k-1)/k. That means we can expect to pick an unused number in k/(k-1) attempts, which is independent of N and therefore O(1). set::find still dominates the cost of each iteration, at O(log N). We still have the same number of iterations, so we get the same upper bound of O(N² log N).
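A quick way to sanity-check that expectation (this small simulation is mine, not part of the answer; k = 2 and the N values are arbitrary): draw N*N distinct numbers from a range of size k*N*N with the same "retry until unused" idea and count the total number of draws. The average number of attempts per element should stay bounded by roughly k/(k-1) and, more importantly, should not grow with N. C++11 <random> is used instead of rand() because RAND_MAX may be smaller than the ranges involved.

#include <iostream>
#include <set>
#include <random>
using namespace std;

int main()
{
    mt19937 rng(12345);                     // fixed seed so runs are repeatable
    const int k = 2;                        // range is k times larger than the element count

    for (int N = 10; N <= 1000; N *= 10)
    {
        int needed = N * N;                 // number of elements to pick
        int rangeSize = k * needed;         // values are drawn from [1, k*N*N]
        uniform_int_distribution<int> dist(1, rangeSize);

        set<int> used;
        long long attempts = 0;
        while ((int)used.size() < needed)   // same "retry until unused" idea as the goto loop
        {
            attempts++;
            used.insert(dist(rng));         // a duplicate simply leaves the set unchanged
        }
        cout << "N=" << N
             << "  average attempts per element: "
             << (double)attempts / needed << endl;
    }
    return 0;
}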
The goto retries until the random number differs from all the numbers already chosen. If the distribution of random numbers is uniform, this "retry ... until" step is linear on average with respect to the amplitude of the range. That linear factor multiplies the complexity of set::find (log(n)); set::insert happens just once per element.
The two outer for loops have constant bounds (their trip count doesn't depend on the data), so they only multiply the time by a constant and don't increase the complexity.
"Complexity" is not about how much absolute time (or space) your program takes. It is about how much the time (or space) increases when you increase the size of your program's input data.
(BTW O for time and O for space may be different.)
Time Complexity
Assuming n is the number of elements in the matrix, you have to ask yourself what happens when you add a single element to your matrix (i.e. when n becomes n+1):
- You need to iterate over the new element, which is O(1). We are talking about one iteration here, so the double loop does not matter.
- You have another iteration for printing, which is also O(1), assuming cout<< is O(1).
- You have to find the element, which is O(log(n)) - the std::set is typically implemented as a red-black tree.
- You have to retry the find (via goto) potentially several times. Depending on rnd, min, max and the width of int, the number of retries may be O(1) (i.e. it does not increase with the number of elements) or it may be worse than that.
- You have to insert the element, which is O(log(n)).
Assuming the "best" rnd, you are looking at the following increase for one element...
(O(1) + O(1)) * (O(log(n)) * O(1) + O(log(n))) = O(1) * O(log(n)) = O(log(n))
...so for n elements, your complexity is:
(O(n) + O(n)) * (O(log(n)) * O(1) + O(log(n))) = O(n) * O(log(n)) = O(n * log(n))
Assuming a "bad" rnd of O(n), you are looking at...
(O(n) + O(n)) * (O(log(n)) * O(n) + O(log(n))) = O(n) * O(n * log(n)) = O(n^2 * log(n))
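One rough way to check the "best rnd" estimate empirically (this benchmark is my own sketch, not part of the answer; n is the element count, and the [1, 2n] range is an arbitrary choice that keeps retries O(1) on average, as in the average-case argument above): time the same find/retry/insert pattern for growing n and watch how close the times stay to n * log(n).

#include <iostream>
#include <set>
#include <vector>
#include <random>
#include <chrono>
#include <cmath>
using namespace std;

int main()
{
    mt19937 rng(42);                                    // fixed seed for repeatability

    for (int n = 200000; n <= 1600000; n *= 2)          // n = number of elements in the "matrix"
    {
        uniform_int_distribution<int> dist(1, 2 * n);   // range twice the element count => O(1) retries on average

        set<int> used;
        vector<int> a;
        a.reserve(n);

        auto start = chrono::steady_clock::now();
        while ((int)a.size() < n)
        {
            int m = dist(rng);
            if (used.find(m) == used.end())             // same find/insert pattern as the original code
            {
                used.insert(m);
                a.push_back(m);
            }                                           // else: retry, like the goto
        }
        auto stop = chrono::steady_clock::now();

        double ms = chrono::duration<double, milli>(stop - start).count();
        cout << "n=" << n << "  " << ms << " ms"
             << "  ms / (n*log2(n)) = " << ms / (n * log2((double)n)) << endl;
    }
    return 0;
}

If the O(n * log(n)) estimate holds, the last column should stay roughly constant as n doubles.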
Space Complexity
Your matrix is O(n) and std::set is O(n), so you are O(n) here overall.