URL | Description |
---|---|
Big "O" notation - order notation | Order notation, or Big "O" notation, is a measure of the running time of an algorithm, as it relates to the size of the input to that algorithm. It is intended, not to measure the performance of the machine on which the algorithm is run, but rather to strictly measure the performance of the algorithm itself. Thus, since different machines can vary in their speeds by some constant factor, we remove all constant factors from consideration when we talk about order notation. For example O(2) and O(1) are considered to be the same. Similarly, O(n) is the same as O(2n), and the same as O(100n) |