Generic and practical sorting algorithm faster than O(n log n)?
Is there any practical algorithm for generic elements (unlike counting sort or bucket sort) that runs faster than O(n log n)?
Many people have mentioned the information-theoretic Ω(n lg n) lower bound on comparison sorting, which no comparison sort can beat. (This earlier question explores why that's the case.)
However, there are some types of comparison sorts that, while not breaking O(n lg n) in the average case, can be shown to run faster on inputs that are already presorted to some extent. For example, Dijkstra's smoothsort runs in O(n) on already-sorted inputs with O(n lg n) worst-case behavior. One of my favorite sorts, Cartesian tree sort, provably takes optimal advantage of presortedness in a few metrics. For example, it can sort any sequence with a constant number of increasing or decreasing subsequences in time O(n), degrading gracefully to O(n lg n) in the worst case.
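Smoothsort and Cartesian tree sort are both fairly involved, but the underlying idea of exploiting presortedness is easy to demonstrate. Here's a minimal sketch (function name mine, not a library routine) of a natural merge sort, which likewise finishes in O(n) on already-sorted input and degrades to O(n lg n) in the worst case:

```python
from heapq import merge

def natural_merge_sort(a):
    """Adaptive sort: O(n) on sorted input, O(n lg n) worst case.
    Splits the input into maximal non-decreasing runs, then merges
    runs pairwise until one remains. Illustrative sketch only."""
    if not a:
        return []
    # 1. Slice the input into maximal non-decreasing runs.
    runs, start = [], 0
    for i in range(1, len(a)):
        if a[i] < a[i - 1]:
            runs.append(a[start:i])
            start = i
    runs.append(a[start:])
    # 2. Merge runs pairwise until a single run remains.
    while len(runs) > 1:
        runs = [list(merge(runs[i], runs[i + 1])) if i + 1 < len(runs) else runs[i]
                for i in range(0, len(runs), 2)]
    return runs[0]
```

On already-sorted input the first pass finds a single run, so the merge loop never executes and the whole thing is one O(n) scan.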
On the subject of non-comparison sorts, there are some famous but tricky sorting algorithms for integers that surpass O(n lg n) by doing clever bit-manipulation tricks. The best known integer sorting algorithm is a randomized algorithm that can sort in O(n √lg lg n), while the fastest deterministic algorithm for integer sorting runs in O(n lg lg n) time. You may have heard that radix sort works in O(n), though technically it's O(n lg U), where U is the largest value in the array to sort.
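The O(n √lg lg n) and O(n lg lg n) algorithms are mostly of theoretical interest, but radix sort itself is short. Here's a sketch of an LSD radix sort for non-negative integers (names are mine; it assumes a power-of-two base):

```python
def radix_sort(a, base=256):
    """LSD radix sort for non-negative integers.
    Total work is O(n * log_base(U)), where U = max value: each
    pass is a stable bucket distribution on one base-`base` digit."""
    if not a:
        return []
    u, shift = max(a), 0
    bits = base.bit_length() - 1          # bits per digit (base is a power of two)
    while u >> shift:                     # one pass per digit of the maximum value
        buckets = [[] for _ in range(base)]
        for x in a:                       # stable distribution on the current digit
            buckets[(x >> shift) & (base - 1)].append(x)
        a = [x for b in buckets for x in b]
        shift += bits
    return a
```

The number of passes is ⌈lg U / lg base⌉, which is exactly where the O(n lg U) bound comes from: radix sort is only "O(n)" when U is bounded.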
In short, no, you can't do much better than O(n lg n), but you can do marginally better if you know something about your input.
For generic elements that you can only compare and not access the internals of, it is impossible to have a sorting algorithm faster than Θ(n log n). That is because there are n! (n factorial) possible orderings of the elements, and you need Ω(n log n) comparisons to distinguish among all of them.
No. This is one of the few rigorous lower bounds we have for algorithms. For a collection of n elements, there are n! different orderings, so to specify a given order we need log(n!) bits. By Stirling's approximation, that is approximately n log n. Each comparison between elements yields essentially one bit of information (ignoring the possibility of equal elements), so about n log n comparisons are needed in the worst case.
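A quick way to see the numbers (a throwaway snippet, nothing standard): log₂(n!) and n log₂ n grow at the same rate, so at one bit per comparison you need on the order of n log n comparisons.

```python
import math

# log2(n!) vs. n*log2(n): the information-theoretic comparison bound.
# math.lgamma(n + 1) == ln(n!), computed without overflow.
for n in (10, 1_000, 1_000_000):
    log2_fact = math.lgamma(n + 1) / math.log(2)   # log2(n!)
    print(n, round(log2_fact), round(n * math.log2(n)))
```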
For how many elements? Even though it's something like O(N^1.2), a Shell-Metzner sort is often faster than most others up to a few thousand elements (or so).
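For reference, Shell sort (Shell-Metzner is a minor reformulation of it) is only a few lines. A sketch using the simple halving gap sequence (tuned gap sequences are what produce the empirically observed ~N^1.2 running times):

```python
def shell_sort(a):
    """In-place Shell sort: insertion sort over progressively smaller gaps.
    Uses the simple halving gap sequence; tuned sequences do better."""
    gap = len(a) // 2
    while gap > 0:
        for i in range(gap, len(a)):
            x, j = a[i], i
            # Gapped insertion: shift larger elements right by `gap`.
            while j >= gap and a[j - gap] > x:
                a[j] = a[j - gap]
                j -= gap
            a[j] = x
        gap //= 2
    return a
```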
It also depends on what you mean by "generic" and "practical". A radix sort can beat O(n log n), and it works for a fairly wide variety of data (but definitely not everything).
If your idea of practical and generic limits the algorithm to one that directly compares elements, then no -- nothing does (or ever can do) better than O(n log n). That's been proven for quite some time.