
Big O Notation

What Does Big O Notation Mean?

Big O notation is a mathematical tool for describing algorithm efficiency. It is commonly used to express how a program's resource requirements, such as running time and memory, grow as the size of its input grows.


Big O notation is also known as Bachmann–Landau notation, after Paul Bachmann and Edmund Landau, who introduced it, or as asymptotic notation.

Techopedia Explains Big O Notation

Essentially, big O notation helps estimate how a program's resource needs scale as its input grows. Rather than measuring the exact running time for one input, it characterizes the growth rate: for an input of size n, an algorithm might require time proportional to n (linear), n² (quadratic), log n (logarithmic), and so on. Engineers can plot these growth curves to compare how different algorithms behave as input sizes increase.
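The difference between growth rates can be made concrete by counting basic operations. This is a minimal illustrative sketch (the function names and counters are invented for this example, not part of any standard library): a single pass over the input performs O(n) operations, while a nested pass performs O(n²).

```python
def count_linear(n):
    """O(n): one basic operation per element, e.g. a single scan."""
    ops = 0
    for _ in range(n):
        ops += 1
    return ops

def count_quadratic(n):
    """O(n^2): a nested scan, e.g. comparing every pair of elements."""
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops

# Growth comparison: doubling n doubles the linear count
# but quadruples the quadratic count.
for n in (10, 100, 1000):
    print(n, count_linear(n), count_quadratic(n))
```

Running this shows why quadratic algorithms become impractical quickly: at n = 1000 the linear version does 1,000 operations while the quadratic version does 1,000,000.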

Big O notation is also used in mathematics and other fields to describe the limiting behavior of functions. Formally, f(n) = O(g(n)) means there exist a positive constant c and a threshold n0 such that f(n) ≤ c · g(n) for all n ≥ n0; in other words, g bounds the growth of f from above, ignoring constant factors.
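The formal definition can be checked numerically for a concrete case. In this sketch, the function f(n) = 3n² + 5n and the witnesses c = 4, n0 = 5 are chosen purely for illustration: f is O(n²) because 3n² + 5n ≤ 4n² whenever n ≥ 5.

```python
def f(n):
    return 3 * n * n + 5 * n   # example function: f(n) = 3n^2 + 5n

def g(n):
    return n * n               # candidate bound: g(n) = n^2

c, n0 = 4, 5                   # witness constants for the definition

# Verify f(n) <= c * g(n) for every n from n0 up to a large limit.
print(all(f(n) <= c * g(n) for n in range(n0, 10000)))
```

A finite check like this is not a proof, but here the algebra confirms it: 3n² + 5n ≤ 4n² reduces to 5n ≤ n², which holds for all n ≥ 5.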
