Gigabyte (GB or GByte)
Definition - What does Gigabyte (GB or GByte) mean?
Gigabyte (GB or GByte) is a data measurement unit for digital computer or media storage. One GB equals one billion (1,000,000,000) bytes or one-thousand (1,000) megabytes (MB).
Techopedia explains Gigabyte (GB or GByte)
One byte contains eight bits, a string of 0s and 1s that can represent 256 values ranging from 0 to 255. Generally, a byte is the number of bits used to encode a single text character.
According to the International System of Units (SI, from the French Système International d'Unités), the giga prefix refers to one billion (10⁹, or 1,000,000,000). In IT and computer science, however, a GB has traditionally meant 1,073,741,824 (1024³, or 2³⁰) bytes. In 2000, the Institute of Electrical and Electronics Engineers (IEEE) adopted the International Electrotechnical Commission's (IEC) recommendation that SI prefixes keep their decimal meanings: one GB is one billion bytes, and one kilobyte (KB) is one thousand bytes.
In late 1999, the IEC formally recommended that gibi, giga's binary counterpart, be used in place of the giga prefix for binary multiples (2³⁰ bytes).
In the 21st century, the meaning of the GB term depends on context. Some vendors use the IEC binary prefix explicitly, writing gibibyte (GiB) for 2³⁰ bytes. When referencing the amount of available RAM, the binary interpretation is typically used, so one GB equals 1024³ bytes. When referencing disk storage, one GB equals 10⁹ bytes.
Software and file systems often mix binary and SI conventions when reporting file sizes, labeling them GB or GiB depending on the interpretation used.
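The decimal/binary discrepancy described above is why a drive advertised in decimal gigabytes appears smaller when an operating system reports its capacity in binary units. A minimal sketch in Python (the constant and function names here are illustrative, not from any standard library):

```python
# Decimal (SI) vs. binary (IEC) interpretations of "gigabyte".
GB = 10**9    # SI gigabyte: 1,000,000,000 bytes
GiB = 2**30   # IEC gibibyte: 1,073,741,824 bytes

def bytes_to_gb(n_bytes: int) -> float:
    """Convert a byte count to decimal gigabytes (disk-vendor convention)."""
    return n_bytes / GB

def bytes_to_gib(n_bytes: int) -> float:
    """Convert a byte count to binary gibibytes (RAM/OS convention)."""
    return n_bytes / GiB

# A drive advertised as "500 GB" (decimal)...
advertised = 500 * GB
print(bytes_to_gb(advertised))             # 500.0 decimal GB
print(round(bytes_to_gib(advertised), 2))  # 465.66 GiB, as an OS may report it
```

The roughly 7% gap between the two readings grows with each prefix step, since each binary prefix (Ki, Mi, Gi) is 1024 rather than 1000 times the previous one.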