Shannon's law is stated as shown below:
C = B log2(1 + S/N) where:
C is the channel capacity: the highest error-free data rate, in bits per second (bps), that the communication channel can support.
B is the bandwidth of the channel in hertz.
S is the average signal power received over the bandwidth, measured in watts (or volts squared).
N is the average noise or interference power over the bandwidth, measured in watts (or volts squared).
S/N is the signal-to-noise ratio (SNR) of the communication signal to the Gaussian noise interference, expressed as a linear power ratio (not in decibels).
The function log2 denotes the base-2 logarithm. A logarithm is simply an exponent: for two numbers x and y, the base-2 logarithm of x is y provided that 2^y = x.
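The formula above can be evaluated directly. The sketch below is a minimal illustration; the channel parameters (a 3,000 Hz voice-grade telephone line with a 30 dB SNR) are assumed example values, not figures from the text.

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Channel capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed example: a 3,000 Hz voice-grade telephone channel with a
# linear SNR of 1,000 (equivalent to 30 dB).
capacity = shannon_capacity(3000, 1000)
print(f"{capacity:.0f} bps")  # roughly 29,902 bps
```

Note that the SNR passed in must be the linear power ratio, matching the definition of S/N above.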
Shannon's treatment of information in communication systems identifies the important relationships among bandwidth, signal power, and noise.
Shannon's equation helps engineers determine the amount of information that can be carried over a channel in an ideal system. His work remains the foundation for engineers and communication scientists in their never-ending quest for faster, more robust, and more energy-efficient communication systems. He established the principles of data compression mathematically and showed how controlled error rates can be used to preserve integrity when information is carried over noisy channels.
Practical communication systems that operate close to the theoretical speed limit described by Shannon's law have not yet been devised. Some systems that employ advanced encoding and decoding achieve about 50 percent of the limit Shannon's law specifies for a channel with a fixed signal-to-noise ratio and bandwidth.
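The gap between a real system and the Shannon limit can be made concrete with a short calculation. The link parameters below (20 MHz of bandwidth at 25 dB SNR) are assumed for illustration; only the 50 percent figure comes from the text.

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Channel capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical link (illustrative values, not from the text):
# 20 MHz of bandwidth with an SNR of 25 dB.
bandwidth_hz = 20e6
snr_db = 25
snr_linear = 10 ** (snr_db / 10)    # convert dB to a linear power ratio

limit_bps = shannon_capacity(bandwidth_hz, snr_linear)
achieved_bps = 0.5 * limit_bps      # a system reaching 50% of the limit
print(f"Shannon limit: {limit_bps / 1e6:.1f} Mbps")
print(f"50% of limit:  {achieved_bps / 1e6:.1f} Mbps")
```

The dB-to-linear conversion is the step most often gotten wrong in practice: plugging an SNR quoted in decibels straight into the formula overstates capacity dramatically.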