Information theory is a branch of mathematics that defines efficient and practical methods for encoding, transmitting, and interpreting data. The field originated in a mid-twentieth-century paper by the mathematician Claude Shannon, which set many important precedents for digital technology, including the use of the bit as a unit of measurement for information.
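To make the bit concrete as a unit of measurement: Shannon defined the information content of a source by its entropy, H = -Σ p·log₂(p), measured in bits. The sketch below (a minimal illustration, not from the original article) shows that a fair coin flip carries exactly one bit, while a predictable, biased coin carries less.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of information.
print(entropy_bits([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each flip carries less information.
print(entropy_bits([0.9, 0.1]))   # roughly 0.469
```

Using log base 2 is what makes the bit the natural unit here; a different base would simply change the unit (base e gives nats, base 10 gives hartleys).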
Prior to information theory, electronic communication was conducted mostly through analog transmission, which worked well enough over short distances but became problematic as distances increased and signals degraded. Claude Shannon was an employee of Bell Labs (the research and development arm of the Bell Telephone Company) during the mid-twentieth century, and he worked on making electronic communication more efficient and secure during the Second World War.
Shannon’s research was eventually published in a book titled "The Mathematical Theory of Communication" (co-written with Warren Weaver) and laid the groundwork for much of modern digital technology, such as the use of binary code to represent information.
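As a small illustration of what "binary code" means in practice (an example of a common modern encoding, not a method from Shannon's book): digital systems reduce every message to a sequence of bits, such as the 8-bit ASCII patterns below.

```python
# Represent a short text message as the binary digits (bits)
# a digital system would actually store or transmit.
text = "hi"
bits = ''.join(format(byte, '08b') for byte in text.encode('ascii'))
print(bits)  # 0110100001101001  ('h' = 01101000, 'i' = 01101001)
```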