What Does Gigabit Mean?

Gigabit (Gb) is a data measurement unit applied to digital data transfer rates (DTR) and download speeds. One Gb equals one billion (1,000,000,000 or 10⁹) bits.


The International System of Units (SI) defines the giga prefix as a 10⁹ multiplier, or one billion (1,000,000,000) bits. The binary giga prefix represents 1,073,741,824 (1024³ or 2³⁰) bits. The difference between the SI and binary values is approximately 7.37 percent.
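The differential between the two definitions can be verified with a few lines of Python (a quick arithmetic sketch, not tied to any particular library):

```python
# SI gigabit: decimal prefix, 10^9 bits
si_gigabit = 10**9

# Binary "gigabit" (gibibit): 2^30 bits
binary_gigabit = 2**30

# Percentage by which the binary value exceeds the SI value
differential = (binary_gigabit - si_gigabit) / si_gigabit * 100
print(f"{binary_gigabit:,} bits is {differential:.2f}% larger than {si_gigabit:,} bits")
```

Running this prints a differential of about 7.37 percent; the often-quoted 4.86 percent figure is the corresponding gap at the mega scale (2²⁰ versus 10⁶).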

Techopedia Explains Gigabit

Central processing units (CPUs) are built with data control instructions for bits, the smallest data measurement unit. A bit is a binary digit that holds one of two values, 0 or 1, and stored bits represent digital data in random access memory (RAM) or read-only memory (ROM).

Most networks apply the SI sense of Gb when measuring modem, FireWire or Universal Serial Bus (USB) speeds, whereas the binary sense of Gb rarely refers to DTR speed and more often measures RAM and other memory. Software vendors and file systems often mix binary and SI Gb units according to their requirements.
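Because network rates use SI gigabits while file sizes are usually quoted in bytes, converting between the two trips people up. A small Python sketch (the 1 GB file size and 1 Gb/s link speed are illustrative values, not from the article) shows the arithmetic:

```python
# Worked example: time to transfer a 1 GB (SI, 10^9-byte) file
# over a 1 Gb/s (10^9 bits per second) link, ignoring protocol overhead.
file_size_bytes = 10**9
file_size_bits = file_size_bytes * 8   # 1 byte = 8 bits
link_speed_bps = 10**9                 # 1 gigabit per second

seconds = file_size_bits / link_speed_bps
print(f"Transfer time: {seconds:.0f} s")  # 8 s, since 1 GB = 8 Gb
```

The factor of eight between bits (Gb) and bytes (GB) is the most common source of confusion when estimating download times.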

In 2000, the Institute of Electrical and Electronics Engineers (IEEE) adopted the International Electrotechnical Commission (IEC) convention that reserves SI prefixes for decimal values (for example, MB as one million bytes and KB as one thousand bytes) and introduces separate binary prefixes. The newly added binary terms include:

  • Kibibyte (KiB) equals 1,024 bytes.
  • Mebibyte (MiB) equals 1,048,576 bytes.
  • Gibibyte (GiB) equals 1,073,741,824 bytes.
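The binary units listed above are simple powers of 1,024, which a short Python sketch makes explicit (the constant names are illustrative, not a standard API):

```python
# Binary (IEC) byte units as powers of 1,024
KIB = 1024        # kibibyte
MIB = 1024**2     # mebibyte: 1,048,576 bytes
GIB = 1024**3     # gibibyte: 1,073,741,824 bytes

for name, value in [("KiB", KIB), ("MiB", MIB), ("GiB", GIB)]:
    print(f"1 {name} = {value:,} bytes")
```

Each step up the scale multiplies by 1,024 (2¹⁰), whereas the SI units (kB, MB, GB) multiply by 1,000.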