In the metric system of measurement units, the prefix "kilo" means multiply by 1,000, "mega" multiply by 1,000,000, and "giga" multiply by 1,000,000,000.

These are powers of 10:

- 1,000 is 10 to the power 3,
- 1,000,000 is 10 to the power 6
- and 1,000,000,000 is 10 to the power 9.

Also, 1,000,000 = 1,000 x 1,000 and 1,000,000,000 = 1,000 x 1,000,000, so a mega is 1,000 kilos and a giga is 1,000 megas.
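The relations above can be checked directly in a few lines of code (the constant names are just illustrative):

```python
# Metric prefixes as powers of 10.
KILO = 10**3   # 1,000
MEGA = 10**6   # 1,000,000
GIGA = 10**9   # 1,000,000,000

# A mega is 1,000 kilos, and a giga is 1,000 megas.
assert MEGA == 1_000 * KILO
assert GIGA == 1_000 * MEGA
```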

But in everyday computer usage, it is not so: a kilobyte is 1,024 bytes rather than 1,000 bytes.

This is because computers count in base 2 ('0' and '1' are the only digits) rather than base 10, so powers of 2 are preferred to powers of 10.

And 1,024 is a power of 2: it is 2 to the power 10. More precisely, it happens to be the power of 2 closest to 10 to the power 3, which is why it is used for the definition of a kilobyte.
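A quick check of how close 2 to the power 10 is to 10 to the power 3:

```python
# 1,024 is a power of 2: 2**10.
assert 2**10 == 1024

# It overshoots 10**3 by only 2.4%.
print(2**10 / 10**3)  # 1.024
```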

The convention is the same for the megabyte, which is 1,024 kilobytes, or 1,024 x 1,024 = 1,048,576 bytes. The multiplier is again a power of 2: it is 2 to the power 20, which is close to 10 to the power 6, or 1 million.

And the gigabyte is 1,024 megabytes, that is 1,024 x 1,024 x 1,024 = 1,073,741,824 bytes. This number is also a power of 2: it is 2 to the power 30, which is close to 10 to the power 9, or 1 billion.
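The whole binary ladder can be verified in a few assertions (constant names are illustrative):

```python
# Binary "kilo", "mega", "giga" as powers of 2.
KB = 2**10   # 1,024 bytes
MB = 2**20   # 1,048,576 bytes
GB = 2**30   # 1,073,741,824 bytes

# Each step multiplies by 1,024.
assert MB == 1024 * KB == 1_048_576
assert GB == 1024 * MB == 1_073_741_824
```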

Note that the official standard definition of a kilobyte is indeed 1,000 bytes; the usual definition of 1,024 bytes is standardized as a "kibibyte".

But this is the official definition only, not the usual one… except for sellers of memory devices, for whom the official definition gives a bigger number: a "regular" (binary) gigabyte contains 7.37% more bytes than an "official" one, so the same device holds 7.37% more "official" gigabytes than "regular" ones…
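That 7.37% figure comes straight from the ratio of the two gigabyte definitions:

```python
decimal_gb = 10**9   # official (SI) gigabyte
binary_gb = 2**30    # usual (binary) gigabyte, i.e. a gibibyte

# The binary gigabyte is about 7.37% bigger than the decimal one.
ratio = binary_gb / decimal_gb
print(f"{(ratio - 1) * 100:.2f}%")  # 7.37%
```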