Definition - What does Murphy's Law mean?
Murphy’s Law is an aphorism which states that anything that can go wrong will go wrong, often at an unforeseen moment. Commonly applied to IT, Murphy’s Law is highly relevant to software and hardware testing and other types of engineering. It is also grounded in a scientific view of natural and engineered systems.
Techopedia explains Murphy's Law
In scientific terms, Murphy’s Law draws on the concept of entropy, which is central to the second law of thermodynamics. Entropy, which describes the tendency of an ordered system toward disorder, has been a mainstay of a number of related theories. Both entropy and Murphy’s Law suggest, for example, that over a test performed hundreds of times, testing equipment will manifest faults, testers will deviate from exact procedures, or other unexpected errors and problems will arise, each according to its overall probability.
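The probabilistic idea above can be made concrete with a small sketch (not from the article itself): even if each individual test run has only a tiny, independent chance of some fault, the chance of seeing at least one fault grows rapidly as the number of runs increases. The fault rate and run counts below are illustrative assumptions.

```python
def prob_at_least_one_fault(p: float, n: int) -> float:
    """Probability of at least one fault in n independent runs,
    where each run fails with probability p."""
    return 1 - (1 - p) ** n

# With an assumed 1% per-run fault rate, a fault becomes more
# likely than not well before a few hundred runs.
print(round(prob_at_least_one_fault(0.01, 100), 3))  # ~0.634
print(round(prob_at_least_one_fault(0.01, 300), 3))  # ~0.951
```

This is the statistical core of the aphorism: given enough repetitions, low-probability failures become near certainties.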
Since its conceptual origin in the mid-20th century, Murphy's Law has been used in testing and the general sphere of IT. Humans interacting with technology are often confused or frustrated by a host of problems, some related to equipment function and others to human error. Murphy’s Law offers a way of anticipating and discussing the various kinds of technological problems that can occur over time.