What Does Error Detection Mean?
In networking, error detection refers to the techniques used to detect noise or other impairments introduced into data while it is transmitted from source to destination. Error detection helps ensure the reliable delivery of data across networks that are vulnerable to noise and interference.
Error detection minimizes the probability of passing corrupted frames to the destination; the residual chance that a corrupted frame is accepted anyway is known as the undetected error probability.
Techopedia Explains Error Detection
The oldest method of error detection is the parity check. It works by adding one extra bit, the parity bit, to each character or word transmitted. The value of that bit is determined by the type of parity in use (even or odd) and the number of logic-one bits in the data character.
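As a minimal sketch of the idea, the Python snippet below computes an even parity bit for a 7-bit character and checks it at the receiver; the function names and the 7-bit example are illustrative, not part of any particular standard.

def parity_bit(data_bits, even=True):
    """Return the parity bit for a sequence of 0/1 data bits.

    With even parity the bit is chosen so the total number of 1s
    (data bits plus parity bit) is even; with odd parity, odd.
    """
    ones = sum(data_bits)
    bit = ones % 2              # 0 if the count of 1s is already even
    return bit if even else bit ^ 1

def check_parity(data_bits, received_parity, even=True):
    """Return True if the received word passes the parity check."""
    return parity_bit(data_bits, even) == received_parity

# Example: 7-bit ASCII 'A' (1000001) with an even parity bit.
char = [1, 0, 0, 0, 0, 0, 1]
p = parity_bit(char)                      # -> 0 (two 1s, already even)
assert check_parity(char, p)

# A single flipped bit is detected...
corrupted = char.copy()
corrupted[2] ^= 1
assert not check_parity(corrupted, p)

# ...but a second flipped bit slips through: parity only catches an
# odd number of bit errors.
corrupted[5] ^= 1
assert check_parity(corrupted, p)

As the last check shows, parity is cheap but weak: any even number of bit errors in the same word goes undetected.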
A repetition code is another mechanism used for error detection. It is a coding scheme that repeats bits across the channel to achieve more reliable communication: the data bits in a stream are divided into blocks, and every block is transmitted a predetermined number of times. Repetition codes are less efficient than parity checks because of the extra bits transmitted, and an error that strikes the same bit position in every copy goes undetected. However, they are simple, and they are used in the transmission of numbers stations.
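The sketch below, again in Python with illustrative names, shows a block repeated three times and a receiver that flags any disagreement between the copies; it is an assumption-laden toy, not a production code.

def encode_repetition(block, n=3):
    """Transmit each block n times in a row."""
    return block * n

def check_repetition(received, block_len, n=3):
    """Split the received stream into n copies and flag any mismatch.

    Returns (block, ok): the first copy and whether all copies agree.
    A disagreement signals a transmission error; an error hitting the
    same bit position in every copy would go undetected.
    """
    copies = [received[i * block_len:(i + 1) * block_len] for i in range(n)]
    ok = all(copy == copies[0] for copy in copies)
    return copies[0], ok

# Example: the 4-bit block 1011 sent three times.
block = [1, 0, 1, 1]
sent = encode_repetition(block)           # 12 bits on the wire for 4 data bits

received = sent.copy()
received[6] ^= 1                          # corrupt one bit of the second copy
_, ok = check_repetition(received, len(block))
assert not ok                             # copies disagree -> error detected

The 3-to-1 overhead in this example is what makes repetition codes inefficient compared with a single parity bit per word.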
A checksum is an error detection method in which a modular arithmetic sum of the message's fixed-length code words is computed by the sender, transmitted with the message, and recomputed by the receiver for comparison. Checksum schemes include longitudinal redundancy checks, parity bits and check digits.
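The following Python sketch illustrates the simplest form of this: summing 8-bit words modulo 256 and comparing at the receiver. The word width and function names are assumptions chosen for the example; real protocols (for instance the Internet checksum) use different word sizes and arithmetic.

def checksum(words, modulus=256):
    """Modular sum of fixed-width code words (here: 8-bit words mod 256)."""
    return sum(words) % modulus

def verify(words, received_checksum, modulus=256):
    """Recompute the sum at the receiver and compare with the transmitted value."""
    return checksum(words, modulus) == received_checksum

# Example: a short message split into 8-bit words.
message = [0x48, 0x65, 0x6C, 0x6C, 0x6F]   # "Hello"
cs = checksum(message)                     # sender appends this value

assert verify(message, cs)                 # intact message passes

corrupted = message.copy()
corrupted[1] += 1                          # a single corrupted word is caught
assert not verify(corrupted, cs)

# Limitation: errors that cancel out modulo 256 are not detected.
compensating = message.copy()
compensating[0] += 1
compensating[1] -= 1
assert verify(compensating, cs)

The last case shows why simple modular sums are weaker than cyclic redundancy checks: compensating errors in different words can leave the sum unchanged.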