Big-O Notation

Definition of Big-O Notation:

A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n, which is usually the number of items; a mathematical notation used to describe the asymptotic behavior of functions. More precisely, it is used to describe an asymptotic upper bound for the magnitude of a function in terms of another, usually simpler, function. (from NIST)
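The upper-bound idea in the definition can be sketched concretely: f(n) = O(g(n)) means there exist constants c > 0 and n0 such that f(n) <= c * g(n) for all n >= n0. The functions and witness constants below are hypothetical, chosen only for illustration.

```python
# Illustrative sketch: f(n) = 3n^2 + 10n is O(n^2), witnessed by
# the (hypothetical) constants c = 4 and n0 = 10, since
# 3n^2 + 10n <= 4n^2 whenever n >= 10.

def f(n):
    return 3 * n**2 + 10 * n

def g(n):
    return n**2

c, n0 = 4, 10

# Spot-check the bound over a finite range of problem sizes.
bound_holds = all(f(n) <= c * g(n) for n in range(n0, 10_000))
print(bound_holds)
```

Note that the bound is only required to hold eventually (for n >= n0); for small n, f(n) may exceed c * g(n), which is why the simpler function g captures only the asymptotic growth rate.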

Big-O Notation is part of the Conceptual Entities group.