Big O Notation


What Does Big O Notation Mean?

Big O notation is a mathematical tool for assessing algorithm efficiency. It describes how an algorithm's resource needs, typically running time or memory, grow as the size of its input grows, focusing on the dominant trend rather than exact measurements.


Big O notation is also known as Bachmann–Landau notation, after the mathematicians Paul Bachmann and Edmund Landau, or more generally as a form of asymptotic notation.

Techopedia Explains Big O Notation

Essentially, big O notation helps engineers estimate how a program's needs change as it scales. Rather than timing a program on one specific input, the analysis expresses running time or space requirements as a function of the input size n. For example, an algorithm that examines each element once is O(n), while one that compares every pair of elements is O(n²). Plotting these growth curves against different input sizes makes it easy to see which algorithm will remain practical as inputs get large.
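The contrast above can be sketched with a small, hypothetical example: two ways to check a list for duplicate values, one quadratic and one linear (the function names are illustrative, not from the original article).

```python
def has_duplicate_quadratic(items):
    """O(n^2): compares every pair of elements with nested loops."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False


def has_duplicate_linear(items):
    """O(n): a single pass, remembering seen values in a set."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Both functions give the same answers, but on a list of a million elements the quadratic version performs on the order of a trillion comparisons while the linear version performs about a million, which is exactly the kind of difference big O notation is designed to expose.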

Big O notation is also used outside of computer science, particularly in mathematics, to describe the asymptotic behavior of functions. Formally, a function f(n) is O(g(n)) if there exist a positive constant c and a threshold n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀. In other words, beyond some point, f never grows faster than a constant multiple of g.
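The formal definition can be spot-checked numerically. This is a minimal sketch (the helper name and constants are assumptions, not part of the article): it verifies f(n) ≤ c·g(n) over a finite range of n, which illustrates the definition but is not a proof.

```python
def satisfies_big_o(f, g, c, n0, upper=10_000):
    """Finite spot-check of the definition f(n) = O(g(n)):
    tests f(n) <= c * g(n) for every n with n0 <= n < upper."""
    return all(f(n) <= c * g(n) for n in range(n0, upper))


# Example: 3n + 5 is O(n), witnessed by the constants c = 4 and n0 = 5,
# since 3n + 5 <= 4n whenever n >= 5.
linear_ok = satisfies_big_o(lambda n: 3 * n + 5, lambda n: n, c=4, n0=5)

# Counter-example: n^2 is not O(n); no matter how large c is,
# n^2 eventually exceeds c * n (here it fails within the checked range).
quadratic_fails = satisfies_big_o(lambda n: n * n, lambda n: n, c=100, n0=1)
```

Here `linear_ok` comes out True and `quadratic_fails` comes out False, matching the definition: a suitable pair (c, n₀) exists for 3n + 5 versus n, but not for n² versus n.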



Margaret Rouse
Technology expert

Margaret is an award-winning writer and educator known for her ability to explain complex technical topics to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles in the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other's highly specialized languages.