Information Theory

What Does Information Theory Mean?

Information theory is a branch of applied mathematics that studies how information can be quantified, encoded, and transmitted efficiently and reliably. The field originated in a mid-twentieth-century paper by the mathematician Claude Shannon, which set many important precedents for digital technology, including the use of the bit as the basic unit of information.
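To make the idea of measuring information in bits concrete, Shannon's entropy formula H = -Σ p·log2(p) gives the average number of bits needed to encode one symbol drawn from a source. The short Python sketch below is an illustrative example only (the function name and the coin-toss probabilities are assumptions, not part of the definition above):

import math

def shannon_entropy(probabilities):
    """Return the entropy of a discrete distribution, measured in bits.

    H = -sum(p * log2(p)) is the average number of bits needed to
    encode one symbol drawn from the distribution.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly 1 bit of information per toss...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ...while a heavily biased coin carries much less.
print(shannon_entropy([0.9, 0.1]))   # roughly 0.47 bits

The second result illustrates the practical point of the theory: predictable sources carry less information per symbol, so they can be compressed into fewer bits.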


Techopedia Explains Information Theory

Prior to information theory, electronic communication was conducted mostly through analog transmission, which worked well enough over short distances but became problematic as distances increased and signals degraded. Claude Shannon was an employee of Bell Labs (the research and development arm of the Bell System) during the mid-twentieth century, and during the Second World War he worked on making electronic communication more efficient and secure.

Shannon’s research was eventually published as the book “The Mathematical Theory of Communication” (co-authored with Warren Weaver) and laid the groundwork for much of modern digital technology, such as the use of binary code.



Margaret Rouse
