Binary-Coded Decimal (BCD)

Definition - What does Binary-Coded Decimal (BCD) mean?

A binary-coded decimal (BCD) is a binary encoding for decimal numbers in which each decimal digit is represented by a fixed number of bits, usually four or eight.

The norm is four bits, which can represent the decimal values 0 through 9. This format is used because it places no limit on the size of a number: another decimal digit can be added simply by appending four more bits, whereas pure binary representation is confined to a fixed width, such as 16, 32 or 64 bits.

Techopedia explains Binary-Coded Decimal (BCD)

Binary-coded decimals are an easy way to represent decimal values, since each digit is represented by its own 4-bit binary sequence, which has only 10 valid combinations. By comparison, converting a pure binary representation to decimal requires arithmetic operations such as multiplication and addition.

BCD makes conversion to decimal digits for display or printing easier, but the circuitry required to implement the system is more complex. For example, the binary-coded decimal "1001 0101 0110" has three groups of 4 bits, so it represents three decimal digits. Reading the groups from left to right, the resulting decimal value is 956.
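The decoding step described above can be sketched in Python. This is a minimal illustration, not part of the original article; the function name `bcd_to_decimal` is chosen here for clarity.

```python
def bcd_to_decimal(bcd: str) -> int:
    """Decode a space-separated string of 4-bit BCD groups into an integer."""
    value = 0
    for group in bcd.split():
        digit = int(group, 2)       # each 4-bit group encodes one decimal digit
        if digit > 9:
            raise ValueError(f"invalid BCD digit: {group}")
        value = value * 10 + digit  # shift left one decimal place, append digit
    return value

print(bcd_to_decimal("1001 0101 0110"))  # -> 956
```

Note that each group is interpreted independently as a digit from 0 to 9; bit patterns 1010 through 1111 are rejected because they do not correspond to any decimal digit.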

The following are the 4-bit binary representations of the decimal digits:

0 = 0000
1 = 0001
2 = 0010
3 = 0011
4 = 0100
5 = 0101
6 = 0110
7 = 0111
8 = 1000
9 = 1001
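Going the other way, a decimal value can be encoded by mapping each of its digits to the 4-bit pattern in the table above. A short sketch, with the helper name `decimal_to_bcd` assumed here for illustration:

```python
def decimal_to_bcd(value: int) -> str:
    """Encode a non-negative integer as space-separated 4-bit BCD groups."""
    if value < 0:
        raise ValueError("this sketch assumes a non-negative value")
    # format(d, "04b") renders one digit as a zero-padded 4-bit group
    return " ".join(format(int(d), "04b") for d in str(value))

print(decimal_to_bcd(956))  # -> 1001 0101 0110
```

Encoding one more digit simply appends one more 4-bit group, which is the open-ended growth the article describes.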