Error Detection

Definition - What does Error Detection mean?

In networking, error detection refers to the techniques used to detect noise or other impairments introduced into data while it is transmitted from source to destination. Error detection ensures reliable delivery of data across vulnerable networks.

Error detection minimizes the probability that a corrupted frame is passed to the destination without being noticed, a measure known as the undetected error probability.

Techopedia explains Error Detection

The oldest method of error detection involves using parity. It works by adding an additional bit to each character or word transmitted. The state of this bit is determined by factors such as the type of parity (even or odd) and the number of logic-one bits in the data character.
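As an illustration not taken from the original article, the following minimal Python sketch shows how an even- or odd-parity bit might be appended and checked; function names are hypothetical.

```python
def add_parity_bit(data_bits, even=True):
    """Append a parity bit so the total number of 1s is even (or odd)."""
    ones = sum(data_bits)
    parity = ones % 2 if even else (ones + 1) % 2
    return data_bits + [parity]

def check_parity(bits, even=True):
    """Return True if the received bits satisfy the chosen parity."""
    ones = sum(bits)
    return (ones % 2 == 0) if even else (ones % 2 == 1)

# 7-bit ASCII 'A' = 1000001 contains two 1s, so even parity appends a 0.
codeword = add_parity_bit([1, 0, 0, 0, 0, 0, 1])
print(codeword)                # [1, 0, 0, 0, 0, 0, 1, 0]
print(check_parity(codeword))  # True

# Flipping any single bit makes the parity check fail.
codeword[3] ^= 1
print(check_parity(codeword))  # False
```

Note that single-bit parity detects any odd number of flipped bits but cannot detect an even number of flips, and it cannot locate or correct the error.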

A repetition code is another mechanism related to error detection. It is a coding scheme that repeats bits across the channel to achieve more reliable communication. The bits in a data stream are divided into blocks, and every block is transmitted a predetermined number of times. Repetition codes are less efficient than parity because of the extra transmission overhead, and an error that strikes the same position in every copy can go undetected. However, they are simple and are used, for example, in the transmission of numbers stations. A sketch of this scheme follows below.
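The following Python sketch, which is not from the original article, illustrates a hypothetical three-fold repetition code: each bit is sent three times, and a block whose copies disagree signals an error (with majority voting used to guess the original bit).

```python
def encode_repetition(bits, n=3):
    """Transmit each bit n times, e.g. [1, 0, 1] -> 111 000 111."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(received, n=3):
    """Majority-vote each block of n bits; flag any block whose copies disagree."""
    decoded, error_detected = [], False
    for i in range(0, len(received), n):
        block = received[i:i + n]
        if len(set(block)) > 1:
            error_detected = True
        decoded.append(1 if sum(block) > n // 2 else 0)
    return decoded, error_detected

sent = encode_repetition([1, 0, 1])  # [1,1,1, 0,0,0, 1,1,1]
sent[4] ^= 1                         # corrupt one copy in the middle block
print(decode_repetition(sent))       # ([1, 0, 1], True)
```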

A checksum is an error detection method that computes a modular arithmetic sum over the message's code words of a fixed word length. Checksum schemes include longitudinal redundancy checks, parity bits and check digits.
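As a simple illustration, assumed for this article rather than taken from it, the Python sketch below treats each byte as a fixed-length word and sums the words modulo 256; real protocols typically use more robust schemes such as the ones' complement Internet checksum or CRCs.

```python
def simple_checksum(message: bytes, modulus: int = 256) -> int:
    """Sum each fixed-width word (here, one byte) modulo 256."""
    return sum(message) % modulus

def verify(message: bytes, received_checksum: int) -> bool:
    """Recompute the checksum at the receiver and compare."""
    return simple_checksum(message) == received_checksum

msg = b"HELLO"
chk = simple_checksum(msg)   # (72 + 69 + 76 + 76 + 79) % 256 = 116
print(verify(msg, chk))      # True
print(verify(b"JELLO", chk)) # False: the changed byte alters the sum
```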
