What Does Gigabyte Mean?
A gigabyte (GB or GByte) is a unit of digital information used to measure computer or media storage, equal to one billion (1,000,000,000) bytes or one thousand (1,000) megabytes (MB). The next larger unit of storage capacity is the terabyte (TB), which equals 1,000 GB.
The prefix “giga” equates to 10⁹ in the International System of Units (SI) and comes from the Greek word γίγας (ghìgas), which means “giant.” The word “byte” itself was coined by Werner Buchholz in 1956 during the design of the IBM 7030 Stretch, IBM's first transistorized supercomputer.
Gigabytes are normally used for measuring storage capacity, data transmission speed or random-access memory (RAM), with some important differences (see the explanation below).
Some of the most common “standard” sizes expressed in gigabytes include:
- DVDs, which can hold 4.7 GB of data
- Single-layer Blu-ray discs, which can hold approximately 25 GB of data
- Hard drives, which can hold several hundred GB of data (beyond 1,000 GB, capacity is usually quoted in terabytes)
- SSDs, which commonly hold 128, 256, or 512 GB of data
- Fiber-optic internet connections, which can reach download/upload speeds of 1 gigabit (0.125 GB) per second or more
- System and/or video RAM, which is commonly 1, 2, 4, 8, 16, or 32 GB
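To make the decimal relationships concrete (1 GB = 1,000 MB, 1 TB = 1,000 GB, 1 GB = 10⁹ bytes), here is a minimal Python sketch that converts a couple of the sizes from the list above; the variable names and example values are illustrative only.

```python
KB = 1_000          # kilobyte: 1,000 bytes (decimal, SI)
MB = 1_000 * KB     # megabyte: 1,000,000 bytes
GB = 1_000 * MB     # gigabyte: 1,000,000,000 bytes
TB = 1_000 * GB     # terabyte: 1,000,000,000,000 bytes

dvd = 4.7 * GB      # single-layer DVD capacity from the list above
blu_ray = 25 * GB   # single-layer Blu-ray capacity

print(f"DVD:     {dvd:,.0f} bytes = {dvd / MB:,.0f} MB")   # 4,700 MB
print(f"Blu-ray: {blu_ray:,.0f} bytes = {blu_ray / GB:.0f} GB")
print(f"1 TB   = {TB / GB:,.0f} GB")                       # 1,000 GB
```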
A gigabyte is also known as a gig.
Techopedia Explains Gigabyte
One byte contains eight bits, each of which is a 0 or a 1. Eight bits can represent 256 values, ranging from 0 to 255. Generally, a byte is the number of bits used to encode a single text character.
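As a quick illustration of that arithmetic, the short Python sketch below shows that eight bits yield 256 values and that a plain-text character (here assumed to be ASCII) fits in a single byte:

```python
# One byte = eight bits, so it can represent 2**8 = 256 values (0 through 255).
values_per_byte = 2 ** 8
print(values_per_byte)             # 256

# A single text character such as "A" is encoded as one byte in ASCII.
print("A".encode("ascii"))         # b'A'
print(ord("A"), bin(ord("A")))     # 65 0b1000001
```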
According to the International System of Units (SI, from the French Système International d'Unités), the “giga” prefix refers to one billion (10⁹, or 1,000,000,000). In computing terminology, a gigabyte may denote an amount of physical storage (capacity in hard drives and solid-state drives) or a volume of transmitted data. However, it is also used in information technology to specify amounts of RAM, where it means 1,024³ bytes, leading to some confusion (and even lawsuits against drive manufacturers in the United States).
In fact, in IT and computer science, one GB often equals 1,073,741,824 (1,024³, or 2³⁰) bytes. In 2000, the Institute of Electrical and Electronics Engineers (IEEE) incorporated the International Electrotechnical Commission's (IEC) formal approval of the SI metric prefixes, under which one GB is exactly one billion bytes and one kilobyte (KB) is exactly one thousand bytes.
By late 1999, the IEC had formally recommended gibi, the binary counterpart of giga, for quantities based on powers of two. In the 21st century, the meaning of GB depends on context. When the binary sense needs to be explicit, the IEC prefix is used: one gibibyte (GiB) equals 1,024³ bytes. When referencing the amount of installed RAM, the binary interpretation is typically implied, so one GB equals 1,024³ (1,073,741,824) bytes. When referencing disk storage, one GB equals 10⁹ bytes. Software and file systems often mix the two conventions, labeling file sizes in either GB or GiB.
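The gap between the decimal gigabyte (10⁹ bytes) and the binary gibibyte (2³⁰ bytes) is easiest to see numerically. The Python sketch below uses a hypothetical 500 GB drive to show why a capacity marketed in decimal GB looks smaller when an operating system reports it in binary units:

```python
GB = 10 ** 9      # decimal gigabyte: 1,000,000,000 bytes (disk storage)
GiB = 2 ** 30     # binary gibibyte:  1,073,741,824 bytes (1,024**3, RAM)

drive_bytes = 500 * GB                            # hypothetical "500 GB" drive
print(f"500 GB = {drive_bytes / GiB:.1f} GiB")    # 500 GB = 465.7 GiB

# The two units differ by roughly 7.4% at the giga level.
print(f"1 GiB / 1 GB = {GiB / GB:.4f}")           # 1.0737
```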
Here are some practical examples of what one gigabyte of data translates into:
- Roughly 10 yards of books on a shelf
- More than 50,000 emails (with no attachments)
- 5 hours of video chat
- 350+ minutes of YouTube videos
- 250 songs on Spotify
- Less than 8 minutes of Ultra HD video, or about 1 hour of high-definition video on a streaming platform
- 10 hours of Facebook browsing
- Approximately 250 10-megapixel photos, or 300 photos uploaded to social media platforms
The price per GB of storage capacity varies depending on the manufacturer and the type of storage media, but it has dropped consistently and significantly over the years: from an average of $8 per GB of hard drive storage in 2000 to $0.022 in 2020. It is even more striking to note that a single GB of storage cost approximately $500,000 in 1981.