Hexadecimal

Definition - What does Hexadecimal mean?

Hexadecimal is a positional number system used in mathematics and computer science. It has a base of 16 and uses 16 distinct symbols: the digits 0 to 9 represent the values zero through nine, and the letters A to F represent the values 10 through 15.

Hexadecimal is a more compact way to represent binary values in computer systems because it significantly shortens the number of digits: one hexadecimal digit is equivalent to four binary digits.
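As a minimal sketch of this equivalence, the function below (a hypothetical helper, not part of any standard library) converts a binary string to hexadecimal by grouping the bits into nibbles of four:

```python
def binary_to_hex(bits: str) -> str:
    """Convert a binary string to uppercase hexadecimal, nibble by nibble."""
    # Pad on the left with zeros so the length is a multiple of 4.
    width = (len(bits) + 3) // 4 * 4
    bits = bits.zfill(width)
    # Each 4-bit group (nibble) maps to exactly one hex digit.
    nibbles = [bits[i:i + 4] for i in range(0, len(bits), 4)]
    return "".join(format(int(nibble, 2), "X") for nibble in nibbles)

print(binary_to_hex("10111110"))  # 1011 -> B, 1110 -> E, so this prints "BE"
```

Note that the eight binary digits collapse into just two hexadecimal digits, which is why hexadecimal is the conventional shorthand for binary data.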

Techopedia explains Hexadecimal

Hexadecimal is used heavily in computer science and digital electronics as a means of representing binary code in a human-readable form.

A single hexadecimal digit represents four binary bits, called a nibble, which is half of an octet (8 bits). Although the digit progression for hexadecimal begins with 0 and ends in F, the counting works the same way as in decimal: when the highest symbol in a place is reached, that place rolls over to zero and the place value to its left is incremented. Following this rule in decimal, after 09 comes 10; in the same vein for hexadecimal, after 0F comes 10.

An example of hexadecimal counting using two digits is as follows:

00 01 02 ... 09 0A 0B 0C ... 0F 10 11 12 ... 19 1A 1B 1C ... 1F 20
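The sequence above can be reproduced with a short sketch using Python's built-in formatting, where the "02X" format specifier produces zero-padded, uppercase, two-digit hexadecimal:

```python
# Count from 0 to 0x20 (decimal 32) in two-digit hexadecimal.
sequence = [format(n, "02X") for n in range(0x21)]
print(" ".join(sequence))
# The output runs 00 01 ... 09 0A ... 0F 10 11 ... 1F 20,
# rolling over from 0F to 10 just as decimal rolls over from 09 to 10.
```

Note that the rollover from 0F to 10 happens after sixteen values rather than ten, reflecting the base of 16.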
