Thread:Mathmagician/@comment-3562424-20121121052230/@comment-1474707-20121123075829

Kangaroopower - as an example, think about what merge sort does. It splits the list roughly in half, sorts each half by splitting it in half, sorts each quarter by splitting it in half, and so on down to where it's just got a load of 1-element lists (n of them, to be precise), which are trivially sorted.

Now it's got to merge them back together. To merge two sorted lists of size k each, it has to run through each list at most once (perhaps less, but in the worst case it has to compare elements from both lists all the way through), so at worst merging two size-k lists takes about 2k comparisons.
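The merge step above can be sketched like this — a minimal illustration with a comparison counter, not any particular textbook implementation:

```python
def merge(left, right):
    """Merge two sorted lists into one, counting comparisons."""
    merged, comparisons = [], 0
    i = j = 0
    while i < len(left) and j < len(right):
        comparisons += 1
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One side ran out; the rest is appended without further comparisons.
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comparisons

# Worst case: the two lists interleave, so we compare nearly every element.
result, comps = merge([1, 3, 5, 7], [2, 4, 6, 8])
```

Here the two size-4 lists interleave perfectly, so the counter stays at its maximum, just under 2k.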

In the first stage it merges the 1-element lists pairwise: n/2 merges, each taking <= 2x1 operations, so n/2 * 2 = n operations. In the next stage it merges n/4 pairs of 2-element lists, each merge taking <= 2x2 operations, so n/4 * 4 = n operations. Next stage, n/8 merges of 4-element lists, so again <= n operations. Hopefully you can see that every stage of the merge takes at most n operations. So how many stages are there? Well, if you split something in half repeatedly until you get 1-element lists, you get log n stages (log to the base 2). So there are log n stages, each stage takes n operations, and the whole algorithm takes n log n operations.
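You can see the stage count directly by running the merging bottom-up, the way the paragraph above describes it — a sketch, with the merge helper inlined so it's self-contained:

```python
def merge(a, b):
    """Merge two sorted lists, returning the result and the comparison count."""
    out, i, j, comps = [], 0, 0, 0
    while i < len(a) and j < len(b):
        comps += 1
        if a[i] <= b[j]:
            out.append(a[i])
            i += 1
        else:
            out.append(b[j])
            j += 1
    return out + a[i:] + b[j:], comps

def mergesort_stages(xs):
    """Bottom-up merge sort: start from 1-element lists, merge pairwise per stage."""
    lists = [[x] for x in xs]
    stages = 0
    while len(lists) > 1:
        nxt = []
        for a, b in zip(lists[0::2], lists[1::2]):
            merged, _ = merge(a, b)  # each stage does at most ~n comparisons total
            nxt.append(merged)
        if len(lists) % 2:           # odd list out carries over to the next stage
            nxt.append(lists[-1])
        lists = nxt
        stages += 1
    return lists[0], stages

sorted_list, stages = mergesort_stages([5, 3, 8, 1, 7, 2, 6, 4])
```

With n = 8 elements there are exactly 3 stages, matching log n to the base 2.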

As it happens, because big O only cares about factors up to a constant, log n to the base 2 is equivalent to log n to any other base (they differ only by a constant factor), so you generally don't need to specify the base of the log inside big O notation.