Definition - What does Computational Complexity mean?
Computational complexity is a computer science concept that focuses on the amount of computing resources, chiefly time and memory, needed for particular kinds of tasks. In computational complexity theory, researchers assess the resources that a given type or class of task will require in order to sort problems into different levels, or classes, of complexity.
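As a minimal sketch of what "levels of complexity" means in practice, the snippet below counts the basic comparisons two approaches need to find the same value in a sorted list: a linear scan grows in step with the input size (O(n)), while binary search grows only logarithmically (O(log n)). The function names and step-counting are illustrative, not part of any standard library.

```python
def linear_search_steps(items, target):
    """O(n): check each element in turn, counting comparisons."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            break
    return steps

def binary_search_steps(items, target):
    """O(log n): halve the search range each step, counting comparisons."""
    steps, lo, hi = 0, 0, len(items) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            break
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

data = list(range(1_000_000))
# Searching for the last element: the linear scan makes 1,000,000
# comparisons, while binary search makes only about 20.
print(linear_search_steps(data, 999_999))
print(binary_search_steps(data, 999_999))
```

The gap between the two counts is what complexity classes capture: as the input grows, the logarithmic approach barely slows down while the linear one scales directly with input size.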
Techopedia explains Computational Complexity
Although computational complexity is in some ways similar to the analysis of algorithms, it is essentially its own branch of mathematical theory. One way to think of it is as a measure of how much work it would take to solve a particular problem or accomplish a particular task. Analysts use computational complexity research to find which parts of a task are hardest for a computing system, or to determine how to complete a project most efficiently.

While some developers consider computational complexity irrelevant to their work, others have pointed out that successfully moving a task or algorithm from a higher complexity class to a lower one can make it perform dramatically better. Programmers who apply computational complexity theory to constructs like nested loops, logic trees and other kinds of algorithms can build more efficient systems, with a better understanding of how to create less resource-hungry processes.
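The point about moving from a higher complexity class to a lower one can be sketched with a small, hypothetical example (the function names are illustrative): both functions below answer "does this list contain a duplicate?", but the nested-loop version is O(n²) while the single-pass version using a set is O(n) on average.

```python
def has_duplicates_quadratic(items):
    """O(n^2): nested loops compare every pair of elements."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    """O(n) average case: one pass, remembering values already seen."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

For 10,000 distinct items, the nested-loop version performs roughly 50 million comparisons, while the set-based version does about 10,000 constant-time lookups; the code got no "faster" line by line, it simply dropped into a lower complexity class.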