From Wikipedia

A gigabyte is a unit of computer information equal to approximately one thousand million (American billion) bytes.

Because of irregularities in the definition and usage of the kilobyte, the exact number of bytes in a gigabyte can be any one of the following:

  1. 1,073,741,824 bytes - 1024 times 1024 times 1024, or 2³⁰. This is 1024 times a megabyte. This is the definition used in computer science and computer programming.
  2. 1,000,000,000 bytes, or 10⁹ - this is the definition used by telecommunications engineers and storage manufacturers.
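The gap between the two definitions can be checked with a short calculation (a minimal sketch in Python; the variable names are illustrative, not from any standard):

```python
# Binary definition used in computer science: 1024 * 1024 * 1024 bytes.
binary_gigabyte = 2 ** 30       # 1,073,741,824 bytes

# Decimal definition used by storage manufacturers: 10^9 bytes.
decimal_gigabyte = 10 ** 9      # 1,000,000,000 bytes

# The binary gigabyte is roughly 7.4% larger than the decimal one.
difference = binary_gigabyte - decimal_gigabyte   # 73,741,824 bytes
ratio = binary_gigabyte / decimal_gigabyte        # about 1.0737

print(difference, round(ratio, 4))
```

This ~7% discrepancy is why a hard drive advertised using the decimal definition reports a smaller capacity when the operating system measures it in binary units.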

See integral data type.

A terabyte is 1024 times a gigabyte.

In speech 'gigabyte' is often abbreviated to 'gig', as in "This hard drive has 10 gigs".

To clarify meaning (1) above, the International Electrotechnical Commission (IEC), a standards body, in 1997 proposed short combinations of the International System of Units (SI) prefixes with the word "binary". Under this scheme, meaning (1) would be called a gibibyte (Gi). This naming convention has not been widely accepted.
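Under the IEC convention, a drive rated in decimal gigabytes holds fewer gibibytes. A small sketch of the conversion (Python; the function name is illustrative):

```python
def decimal_gb_to_gib(gigabytes):
    """Convert a decimal-gigabyte figure (10^9 bytes each)
    to gibibytes (2^30 bytes each)."""
    total_bytes = gigabytes * 10 ** 9
    return total_bytes / 2 ** 30

# A "10 gig" drive rated in decimal gigabytes comes to about 9.31 GiB.
print(round(decimal_gb_to_gib(10), 2))
```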