Murphy’s Law


What Does Murphy’s Law Mean?

Murphy’s Law is an aphorism stating that anything that can go wrong will go wrong, usually at an unpredictable moment. Commonly applied to IT, Murphy’s Law is highly relevant to software and hardware testing and other types of engineering. It is also grounded in a scientific view of how natural and engineered systems behave over time.


Techopedia Explains Murphy’s Law

In scientific terms, Murphy’s Law is often linked to the concept of entropy, which underlies the second law of thermodynamics. Entropy, a measure of the tendency of an ordered system to drift toward disorder, has been a mainstay of a number of related theories. Both the second law and Murphy’s Law suggest, for example, that over a test performed hundreds of times, faults in the testing equipment, testers who fail to follow exact procedures, and other unexpected errors will eventually appear, simply because each has some non-zero probability of occurring on any given run.
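This probabilistic intuition can be made concrete with a small illustration. The sketch below is a hypothetical example rather than any formal statement of Murphy’s Law: it assumes each test run has a small, independent failure probability p (the value chosen here is arbitrary) and shows how quickly the chance of at least one failure approaches certainty as the number of runs grows.

```python
# Minimal sketch: probability that "something goes wrong" at least once
# when a test is repeated many times, assuming each run has a small,
# independent failure probability p (values here are illustrative only).

def prob_at_least_one_failure(p: float, runs: int) -> float:
    """Return 1 - (1 - p)**runs, the chance of at least one failure."""
    return 1 - (1 - p) ** runs

if __name__ == "__main__":
    p = 0.005  # assumed 0.5% chance of a fault on any single run
    for runs in (10, 100, 1000):
        chance = prob_at_least_one_failure(p, runs)
        print(f"{runs:>4} runs -> {chance:.1%} chance of at least one fault")
```

With these assumed numbers, a fault that shows up only 0.5% of the time per run becomes roughly a 39% likelihood over 100 runs and a near certainty over 1,000 runs, which is the statistical core of the aphorism.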

Since the phrase was coined in the mid-20th century, Murphy’s Law has been applied to testing and the general sphere of IT. Humans interacting with technology are often confused or frustrated by myriad problems, some related to equipment function and others to human error. Murphy’s Law offers a shorthand for anticipating and discussing the various technological problems that inevitably occur over time.



Margaret Rouse
Technology Specialist

Margaret is an award-winning writer and educator known for her ability to explain complex technical topics to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles in the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret’s idea of a fun day is helping IT and business professionals learn to speak each other’s highly specialized languages.