The card manufacturers will tell you 1 GB = one billion bytes (10 to the power of 9, or 1000 million bytes). Your computer will typically show this same card as less than 1 GB, because it typically treats 1 GB as 2 to the power of 30 bytes, which is about 1.07 times 10 to the power of 9. On top of that, you lose some space to formatting (the filesystem on the card needs room for its own bookkeeping), so when all is said and done, your computer will show a 1 GB card as nine hundred and something megabytes.
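The arithmetic above can be checked in a few lines. This is just a sketch of the unit conversion, not anything specific to any particular card:

```python
# Decimal gigabyte (used by card manufacturers) vs. binary gigabyte
# (used by many operating systems when reporting sizes).
decimal_gb = 10**9   # 1 GB as marketed: 1,000,000,000 bytes
binary_gb = 2**30    # 1 GiB: 1,073,741,824 bytes

# A "1 GB" card expressed in binary units, before any filesystem overhead:
print(f"{decimal_gb / binary_gb:.3f} GiB")   # about 0.931
print(f"{decimal_gb / 2**20:.0f} MiB")       # about 954
```

So even before formatting, a marketed 1 GB is only about 954 binary megabytes; the filesystem overhead shaves off a bit more.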
How many bytes make 1 gigabyte?