Thread:Mathmagician/@comment-3562424-20121121052230/@comment-4674838-20121123032116

That's something that you'd normally learn at university in a computer science "algorithms" class. Of course, that's a bit more than I can fully explain here.

To give you a very basic example, suppose I have an array of size N and I perform the following algorithm on that array:
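The original snippet didn't carry over into this thread, but based on the steps described below, it would look something like this sketch (the section labels A through E, the variable names, and the specific operations are my own illustration):

```python
def algorithm(arr):
    n = len(arr)

    # A: two operations, independent of N -> constant time (2)
    total = 0
    count = 0

    # B: a loop whose body C is a single operation -> linear time (N)
    for i in range(n):
        total += arr[i]              # C: one operation per iteration

    # D: nested loops whose body E is 3 operations -> quadratic time (3*N*N)
    for i in range(n):
        for j in range(n):
            diff = arr[i] - arr[j]   # E: operation 1
            diff = diff * diff       # E: operation 2
            count += diff            # E: operation 3

    return total, count
```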

Let $$T(N)$$ be the time complexity:
 * How much time does A* take? Well, it consists of 2 operations that are independent of the size of the array N. So A* is constant time and the amount of time is $$2$$.
 * How much time does B* take? Well, C* is a single operation, so it's constant time. However, that operation happens a total of N times because it gets iterated by the loop. So ultimately, B* is linear time and the time is $$N$$.
 * How much time does D* take? Well, E* consists of 3 operations in constant time. That gets iterated N times by the inner loop, so the time is $$3*N$$. But then that gets iterated N times by the outer loop, for a total of $$(3*N)*N = 3*N^2$$. This is quadratic time.
 * A*, B* and D* happen in succession, so their times get added together.

So the total time is: $$T(N) = 2 + N + 3N^2$$. And we would then say that $$T(N) = O(N^2)$$ -- that is, the algorithm completes in quadratic time, because for large N the $$3N^2$$ term dominates and big-O notation drops the smaller terms and constant factors.
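You can check numerically that the quadratic term dominates. This little script (my own illustration, using the formula above) prints what fraction of $$T(N)$$ comes from the $$3N^2$$ term as N grows -- the fraction approaches 1, which is why only the $$N^2$$ part matters for big-O:

```python
def T(n):
    # Total time from the analysis above: constant + linear + quadratic parts
    return 2 + n + 3 * n**2

for n in (10, 100, 1000):
    quadratic_part = 3 * n**2
    fraction = quadratic_part / T(n)
    print(f"N={n}: T(N)={T(n)}, quadratic share={fraction:.4f}")
```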