Thread:Mathmagician/@comment-3562424-20121121052230/@comment-4674838-20121122033620

http://en.wikipedia.org/wiki/Sorting_algorithm is a good place to start. It lists a ton of different sorting algorithms and has links to individual pages for each one that explain how it works.

$$O(n \log n)$$ is an example of what's called "Big O" notation: http://en.wikipedia.org/wiki/Big_O_notation

The concept is mathematical in nature. Basically, the idea is to consider how a function behaves in the long run, as $$n$$ approaches infinity.

E.g. $$7n^2 + 3n + 1000000 = O(n^2)$$

What this means is that, as $$n$$ approaches infinity (think n = a googolplex or some other massive number), the $$n^2$$ part is pretty much the only piece that matters. Everything else is negligibly small in comparison. We don't care so much about the 7, the $$3n$$, or the 1000000.
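You can see this numerically (a quick sketch I'm adding, not something from the original comment): divide the full function by $$n^2$$ and watch the ratio settle down to the constant 7 as $$n$$ grows.

```python
def f(n):
    # The example function: 7n^2 + 3n + 1000000
    return 7 * n**2 + 3 * n + 1000000

# As n grows, f(n) / n^2 = 7 + 3/n + 1000000/n^2 approaches 7,
# so the n^2 term dominates and f(n) = O(n^2).
for n in [10, 1000, 10**6, 10**9]:
    print(n, f(n) / n**2)
```

For small $$n$$ the constant 1000000 swamps everything, but by $$n = 10^9$$ the ratio is essentially 7 -- the lower-order terms have become negligible.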

The rigorous definitions behind this are based on limits from calculus.
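For the curious, one standard way to state the definition formally (this is the usual textbook version, not something spelled out in the original comment):

$$f(n) = O(g(n)) \iff \exists\, C > 0,\ n_0 \text{ such that } |f(n)| \le C \cdot g(n) \text{ for all } n \ge n_0$$

In the example above, $$7n^2 + 3n + 1000000 \le 8n^2$$ for all sufficiently large $$n$$, so taking $$C = 8$$ satisfies the definition with $$g(n) = n^2$$.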

Anyways, the Big O notation is commonly used in computer science to talk about how algorithms scale in extreme cases.

If you have an algorithm that sorts an array with N elements, you might wonder how much time it takes when N is a really big number. So you'd model the time complexity as a function of the size of the array, say $$T(N)$$, and then figure out the Big O approximation of that function to see how it behaves as N approaches infinity.
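As a concrete example (my pick -- the thread doesn't name a specific algorithm), merge sort is one of the classic sorts with $$T(N) = O(N \log N)$$: it splits the array in half $$\log N$$ times, and each level of splitting does $$O(N)$$ work merging. A minimal Python sketch:

```python
def merge_sort(arr):
    # Base case: arrays of length 0 or 1 are already sorted.
    if len(arr) <= 1:
        return arr

    # Split in half and recursively sort each half
    # (halving repeatedly gives the log N factor).
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])

    # Merge the two sorted halves (O(N) work per level).
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7, 3]))
```

Compare that with something like bubble sort, which compares every pair of elements and so takes $$O(N^2)$$ time -- for N = a million, that's the difference between roughly 20 million operations and a trillion.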