Asymptotic Running Time of an Algorithm
When we analyze the running time of an algorithm, it's usually
not worth computing the exact running time.
For large input sizes, only the order of growth of the running time - the asymptotic running time - is relevant.
In most cases it does not matter whether an algorithm is 2, 3, or 10 times faster than another algorithm: constant factors are dwarfed by the effect of the input size.
- Sorting a billion numbers with the Bubblesort algorithm (running time O(n²)) on ASCI White, the world's fastest computer at the time, takes about a week.
- Sorting a billion numbers with Heapsort (running time O(n log n)) on a 100 MHz Pentium PC takes about 10 minutes.
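You can observe the same effect on much smaller inputs. The following is a minimal sketch (not the benchmark behind the figures above): it times a hand-written O(n²) bubble sort against Python's built-in O(n log n) sort; the helper names bubble_sort and time_sort are illustrative, not from the original notes.

```python
import random
import time


def bubble_sort(a):
    """O(n^2) comparison sort: repeatedly swap adjacent out-of-order pairs."""
    n = len(a)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # already sorted, stop early
            break


def time_sort(sort_fn, n):
    """Return the wall-clock time to sort n random numbers in place."""
    data = [random.random() for _ in range(n)]
    start = time.perf_counter()
    sort_fn(data)
    return time.perf_counter() - start


if __name__ == "__main__":
    for n in (1_000, 2_000, 4_000):
        t_bubble = time_sort(bubble_sort, n)
        t_builtin = time_sort(list.sort, n)  # Timsort, O(n log n)
        print(f"n={n:>5}: bubble sort {t_bubble:.3f}s, built-in sort {t_builtin:.3f}s")
```

Even at n = 4000, the quadratic sort is already orders of magnitude slower, and the gap widens with every doubling of n.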
The real question to ask when we analyze an algorithm's efficiency is:
If we double the input size, how does this affect the running time of the program? Does it run twice as long? Four times as long? Or even longer?
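This "doubling experiment" is easy to run in practice. Below is a small sketch of the idea: it times a function on inputs of size n and 2n and prints the ratio t(2n)/t(n). A linear-time function should give a ratio near 2, a quadratic one near 4. The helper names (time_on_random_input, count_inversions_naive) and the chosen input sizes are illustrative assumptions.

```python
import random
import time


def time_on_random_input(f, n):
    """Best of 3 wall-clock timings of f on a list of n random numbers."""
    best = float("inf")
    for _ in range(3):
        data = [random.random() for _ in range(n)]
        start = time.perf_counter()
        f(data)
        best = min(best, time.perf_counter() - start)
    return best


def count_inversions_naive(a):
    """O(n^2): compare every pair of elements."""
    return sum(1 for i in range(len(a))
                 for j in range(i + 1, len(a)) if a[i] > a[j])


if __name__ == "__main__":
    for name, f, n in [("sum (O(n))", sum, 500_000),
                       ("inversions (O(n^2))", count_inversions_naive, 2_000)]:
        ratio = time_on_random_input(f, 2 * n) / time_on_random_input(f, n)
        print(f"{name}: t(2n)/t(n) = {ratio:.1f}")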