What is the average Big-O complexity of Gnome sort?
There is nothing about it on Wikipedia.
Does anyone know? I only want to know the average Big-O complexity of that algorithm.
The performance of the gnome sort algorithm is at least O(f(n)), where f(n) is the sum, over every element in the input list, of the distance to where that element should end up. For a "random" list of length L, an element at the beginning of the list is expected to be an average of L / 2 away from its sorted location, and an element in the middle of the list an average of L / 4 away. Since there are L elements in total, f(n) is at least L * L / 4. Therefore, on average, gnome sort is O(n * n).
Sorry if this is hard to follow.
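If it helps, here is a rough numerical check of that argument (a minimal C++ sketch; the list length and trial count are arbitrary choices of mine). It builds random permutations, adds up how far each element sits from its sorted position, and prints the ratio f(n) / (L * L), which in my runs tends to land near 1/3, i.e. comfortably above the 1/4 used above.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main()
{
    srand((unsigned int)time(0));
    const int L = 1000;      // list length (arbitrary for this sketch)
    const int trials = 100;  // number of random permutations to average over
    int perm[L];
    double ratioSum = 0.0;

    for (int t = 0; t < trials; ++t)
    {
        // Build a uniformly random permutation of 0..L-1 (Fisher-Yates shuffle).
        for (int i = 0; i < L; ++i)
            perm[i] = i;
        for (int i = L - 1; i > 0; --i)
        {
            int j = rand() % (i + 1);
            int tmp = perm[i]; perm[i] = perm[j]; perm[j] = tmp;
        }
        // f(n): total distance of each element from its sorted position.
        long long f = 0;
        for (int i = 0; i < L; ++i)
            f += (perm[i] > i) ? (perm[i] - i) : (i - perm[i]);
        ratioSum += (double)f / ((double)L * L);
    }
    // Prints a value near 1/3, above the 1/4 used in the argument above.
    printf("average f(n) / (L * L) = %f\n", ratioSum / trials);
}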
Here is a simple comparison of bubble sort and gnome sort on an array of random values, on values in reverse order, on 3 concatenated runs of ordered values, and on already-ordered values. On average, gnome sort seems to be a bit cheaper on the comparison side of things.
Note that the comparison/swap counts when sorting random values differ a little from run to run, but stay close to these results.
N = 100, attempts = 1000
random:
bubble sort: comparisons = 8791794, swaps = 2474088
gnome sort: comparisons = 5042930, swaps = 2474088
reversed:
bubble sort: comparisons = 9900000, swaps = 4950000
gnome sort: comparisons = 9900000, swaps = 4950000
3 ordered sets:
bubble sort: comparisons = 6435000, swaps = 1584000
gnome sort: comparisons = 3267000, swaps = 1584000
ordered:
bubble sort: comparisons = 99000, swaps = 0
gnome sort: comparisons = 99000, swaps = 0
... And here is the code used to get these results:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

const int N = 100;
int x[N];

int main()
{
    srand((unsigned int)time(0));
    int comparisons = 0;  // totals accumulated over all attempts
    int swaps = 0;
    int attempts = 1000;
    while (--attempts >= 0)
    {
        // Input generation: enable exactly one of the blocks below.
        // random:
        for (int i = 0; i < N; ++i)
            x[i] = rand();

        // reversed:
        /*for (int i = 0; i < N; ++i)
            x[i] = N - 1 - i;*/

        // 3 ordered sets:
        /*for (int i = 0; i < N/3; ++i)
            x[i] = i;
        for (int i = N/3, j = 0; i < 2*N/3; ++i, ++j)
            x[i] = j;
        for (int i = 2*N/3, j = 0; i < N; ++i, ++j)
            x[i] = j;*/

        // ordered:
        /*for (int i = 0; i < N; ++i)
            x[i] = i;*/

        // bubble sort:
        /*{
            bool swapped;
            do
            {
                swapped = false;
                for (int i = 0; i < (N - 1); ++i)
                {
                    ++comparisons;
                    if (x[i] > x[i + 1])
                    {
                        ++swaps;
                        int t = x[i];
                        x[i] = x[i + 1];
                        x[i + 1] = t;
                        swapped = true;
                    }
                }
            } while (swapped);
        }*/

        // gnome sort:
        {
            int i = 1;
            while (i < N)
            {
                ++comparisons;
                if (x[i] >= x[i - 1])
                    ++i;
                else
                {
                    ++swaps;
                    int t = x[i];
                    x[i] = x[i - 1];
                    x[i - 1] = t;
                    if (i > 1)
                        --i;
                }
            }
        }
    }
    printf("comparisons = %d\n", comparisons);
    printf("swaps = %d\n", swaps);
}
Obviously this is not a full test by far, but it gives an idea.
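One thing that stands out in the numbers above is that the swap counts are identical for both sorts in every case. That is because both bubble sort and gnome sort only ever swap adjacent elements, and each such swap removes exactly one inversion, so the swap count is simply the number of inversions in the input. Here is a small sketch to check that claim (the array size and values are arbitrary, and the gnome sort is the same one used above):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main()
{
    srand((unsigned int)time(0));
    const int n = 100;
    int x[n], y[n];
    for (int i = 0; i < n; ++i)
        x[i] = y[i] = rand();  // same random data in both copies

    // Count inversions directly: pairs (i, j) with i < j that are out of order.
    long long inversions = 0;
    for (int i = 0; i < n; ++i)
        for (int j = i + 1; j < n; ++j)
            if (x[i] > x[j])
                ++inversions;

    // Count swaps made by gnome sort on the same data.
    long long swaps = 0;
    int i = 1;
    while (i < n)
    {
        if (y[i] >= y[i - 1])
            ++i;
        else
        {
            ++swaps;
            int t = y[i]; y[i] = y[i - 1]; y[i - 1] = t;
            if (i > 1)
                --i;
        }
    }
    // The two numbers should always match.
    printf("inversions = %lld, gnome sort swaps = %lld\n", inversions, swaps);
}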
On the contrary, Wikipedia says it's O(n^2), and from the description I can't see how there would be any real doubt about that.
"Average" cannot really be answered without looking at the input data. If you know what you are sorting you could do some analysis to get a better idea how it would perform in your application.
It seems intuitive to me that if insertion sort has an average-case running time that is O(n^2), and gnome sort is a slightly worse version of insertion sort, then gnome sort's average running time would also be O(n^2) (well, Θ(n^2)).
This pdf has this to say about insertion-sort's average-case running time of Θ(n^2):
The proof of this is not trivial, but it is based on the intuitive fact that on the average, the while loop test "list[pos-1] > value" is true about half the time, so that on average, the number of executions of the while loop is one-half of the maximum number. Since the maximum number is n(n-1)/2, the average number of executions of the while loop is n(n-1)/4 , which is still Θ(n^2).
The same reasoning would apply to gnome sort. You know gnome sort can't be better, because the "gnome" first has to scan backwards (via swapping) to find where the item goes (the equivalent of insertion sort's scan), and then has to walk back up the list over elements it has already sorted. Any run-time difference between the two scanning methods is, I believe, negligible as far as the complexity bound is concerned, but I'll defer to you to prove it.
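For what it's worth, here is a rough side-by-side count I'd use to illustrate that (a sketch only: the sizes, the fixed buffer bound, and the helper names are arbitrary, and only comparisons are counted). Both counts should roughly quadruple each time n doubles, i.e. both algorithms do Theta(n^2) work on random input, with gnome sort paying a somewhat larger constant for the walk back up.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

// Count comparisons made by insertion sort on a copy of the input.
static long long insertionComparisons(const int *src, int n)
{
    int a[1000];  // fixed upper bound, enough for the sizes used below
    for (int i = 0; i < n; ++i)
        a[i] = src[i];
    long long comparisons = 0;
    for (int i = 1; i < n; ++i)
    {
        int value = a[i];
        int pos = i;
        while (pos > 0)
        {
            ++comparisons;
            if (a[pos - 1] <= value)
                break;
            a[pos] = a[pos - 1];  // shift larger element up
            --pos;
        }
        a[pos] = value;
    }
    return comparisons;
}

// Count comparisons made by gnome sort on a copy of the input.
static long long gnomeComparisons(const int *src, int n)
{
    int a[1000];
    for (int i = 0; i < n; ++i)
        a[i] = src[i];
    long long comparisons = 0;
    int i = 1;
    while (i < n)
    {
        ++comparisons;
        if (a[i] >= a[i - 1])
            ++i;
        else
        {
            int t = a[i]; a[i] = a[i - 1]; a[i - 1] = t;
            if (i > 1)
                --i;
        }
    }
    return comparisons;
}

int main()
{
    srand((unsigned int)time(0));
    const int sizes[] = { 125, 250, 500, 1000 };
    for (int s = 0; s < 4; ++s)
    {
        int n = sizes[s];
        int x[1000];
        for (int i = 0; i < n; ++i)
            x[i] = rand();
        // Both counts grow quadratically; doubling n roughly quadruples them.
        printf("n = %4d: insertion = %8lld, gnome = %8lld\n",
               n, insertionComparisons(x, n), gnomeComparisons(x, n));
    }
}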
Wikipedia clearly says that it has a worst case run time of O(n**2).
http://en.wikipedia.org/wiki/Gnome_sort
This is O(n^2). In the worst case, every time you enter the while loop the current position has to be exchanged with the previous one, which makes pos smaller, so you have to exchange again. Try an ascending sort on a descending-sorted array (i.e., going from [4 3 2 1] to [1 2 3 4]) and you get the worst case.
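To see that concretely, here is a tiny sketch (n = 8 picked just to keep the output readable) that runs gnome sort on a descending array and counts the work; every element has to be swapped all the way down, so the swap count comes out to n*(n-1)/2, which is what makes the worst case quadratic.

#include <stdio.h>

int main()
{
    const int n = 8;  // small example: the input is [8 7 6 5 4 3 2 1]
    int x[n];
    for (int i = 0; i < n; ++i)
        x[i] = n - i;

    int comparisons = 0, swaps = 0;
    int i = 1;
    while (i < n)
    {
        ++comparisons;
        if (x[i] >= x[i - 1])
            ++i;
        else
        {
            ++swaps;
            int t = x[i]; x[i] = x[i - 1]; x[i - 1] = t;
            if (i > 1)
                --i;
        }
    }
    // For n = 8 this prints swaps = 28 = 8*7/2.
    printf("n = %d: comparisons = %d, swaps = %d (n*(n-1)/2 = %d)\n",
           n, comparisons, swaps, n * (n - 1) / 2);
}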