
rxl125
macrumors regular, Original poster
Apr 15, 2006
Does that seem right? If I erased and started over would I get more? Where is that 1.8 gig at?
 
It's not exactly 16 GB. Computers 'count differently' (capacities come out as numbers like 10,784 rather than round figures like 10,000), so it looks like less. On top of that, the iPhone OS lives on your iPhone and takes up space as well :p

EDIT: See the answer from Chris.L below; he explains what I meant by the 'count differently' thing.
 
It's 16,000,000,000 bytes.

A binary 16 gigabytes would actually be 1,073,741,824 * 16 = 17,179,869,184 bytes.

16,000,000,000 bytes divided by 1,073,741,824 bytes per GB (the way the OS counts it) is about 14.9, so close to 15 GB.

Then the iPhone OS takes its share, as mentioned, and there is your answer.
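For anyone who wants to check the arithmetic, here is a quick sketch in Swift (purely illustrative; 16 GB is the advertised decimal capacity, before the OS and bundled software take their share):

```swift
import Foundation

// Advertised capacity uses decimal (SI) gigabytes: 1 GB = 1,000,000,000 bytes.
let advertisedBytes: Double = 16_000_000_000

// iPhone OS (and Mac OS X before Snow Leopard) reports capacity in binary units:
// one reported "GB" is 1,073,741,824 bytes (2^30).
let bytesPerBinaryGB: Double = 1_073_741_824

let reportedGB = advertisedBytes / bytesPerBinaryGB
print(String(format: "16 GB advertised = %.2f GB as the OS counts it", reportedGB))
// Prints about 14.90; the rest of the missing space is the OS and bundled software.
```

That gets you from 16 GB on the box to roughly 14.9 GB on the screen before the OS has used a single byte.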
 
To the OP: the next time you buy any device (a computer, bare hard drive, USB stick...), check how much actual capacity you get compared to what it says on the box. Think of it like when you got your first paycheck and saw the "net pay" amount versus "gross." :D
 
Who else is glad Snow Leopard counts the same way as hard disk boxes? These questions will slowly disappear over time. :)
 
To be more precise, 16 GB _is_ 16,000,000,000 bytes. :)

The software industry still often uses 1 GB to mean 2^30 bytes, but that quantity is now officially defined as 1 GiB ("gibibyte").

http://en.wikipedia.org/wiki/GiB

True, I should've put 'A true 16 gigabytes is...'

Speaking of the difference, I also read this on scan.co.uk the other day:

You might have seen the terms GiB, MiB and KiB around HEXUS in recent months and elsewhere on the Internet, when talking about memory or data sizes. If you've hung around computing for any length of time, GB, MB and KB are what you'll have learned to use when talking about things like system memory, hard disks and processor caches, so why do we use anything different?

The answer lies in the correct representation of binary numbers. When talking about a binary quantity like computer memory or processor cache, where the memory is made up of a series of memory cells that hold single bits of information, the number of memory cells is always a power of 2. For instance, 1024 bits of memory, what you'd usually call a kilobit, is 2^10 bits. However, kilo is a prefix for base-10 or decimal numbers, so it doesn't actually apply to that figure when it's a representation of a binary number. The correct prefix is instead kibi, so 1024 bits is really a kibibit.

That rule applies everywhere. So what you'd usually think of as 1GB, or 1 gigabyte, isn't 1,000,000,000 bytes. Giga is the decimal prefix, so when you say gigabyte it means 1,000,000,000 bytes, not the 1,073,741,824 bytes it actually is (1024 * 1024 * 1024, a binary representation). Gibibyte is the correct term, abbreviated GiB. So you have 1GiB of system memory, not 1GB.

The most common exception, where the decimal prefix actually is correct, is the hard disk, where 1GB of disk space really does mean 1,000,000,000 bytes. That's why for every 1GB of hard disk space, you actually see around 950MiB of space in your operating system (regardless of whether the OS tells you that's MB, which it isn't!).

Apply the bit/byte difference with lowercase b for bit and uppercase B for byte, and you should be set for writing the correct numeric suffix depending on what you're talking about.

Just remember it's MiBs of memory, KiBs of cache but GBs of disk space and you'll be fine. For the full list of prefixes for binary multiples, check out this handy page. Just think hard about the number you're talking about, decide whether it's a binary or decimal representation of a number, then choose the right suffix when writing it down, or prefix when describing the units you're using. 10GiB is 10 gibibytes, as an example.

When talking about hardware, we strive to be as technically correct as we have the ability to be, so don't be confused when you see KiBs and MiBs. Just refer back to this article when you get in a muddle or aren't quite sure what's the correct suffix to use for the number you want to write.
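Putting numbers to the article's point, here is a quick sketch in Swift that prints the same byte count both ways (the drive sizes are just example figures, not anything from the thread):

```swift
import Foundation

// Print a raw byte count both as the box labels it (decimal GB)
// and as an OS using binary units would report it (GiB).
func describe(bytes: Double, label: String) {
    let gb  = bytes / 1_000_000_000    // decimal: 10^9 bytes per GB
    let gib = bytes / 1_073_741_824    // binary:  2^30 bytes per GiB
    print("\(label): " + String(format: "%.1f GB on the box, %.1f GiB in the OS", gb, gib))
}

describe(bytes: 1_000_000_000, label: "1 GB of disk")    // about 0.9 GiB (roughly 953 MiB)
describe(bytes: 500_000_000_000, label: "500 GB drive")  // about 465.7 GiB
```

Same bytes either way; only the unit used to describe them changes, which is exactly the gap people notice between the box and the OS.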
 
Oh FFS! All that just because HDD manufacturers can't be bothered to use the same system to describe capacity as the rest of the computer industry. Why do we have to confuse people by introducing new terms? For as long as computers have been around 1024 bytes=1K, 1024K=1MB and 1024MB=1GB. It's only HDD manufacturers who insist on using a different standard. :rolleyes:
 
Agreed.
 
You forget half the world is using Windows.

Unfortunately.

Oh FFS! All that just because HDD manufacturers can't be bothered to use the same system to describe capacity as the rest of the computer industry. Why do we have to confuse people by introducing new terms? For as long as computers have been around 1024 bytes=1K, 1024K=1MB and 1024MB=1GB. It's only HDD manufacturers who insist on using a different standard. :rolleyes:

Well, it's marketing that's in control nowadays. So 1024 should definitely be ditched and 1000 made the standard.
 
But it's only 1024 because of the physical makeup of the hardware... do you just want the industry to be able to lie with a clean conscience? :confused:
 
As there is a new thread re-hashing the discussion of this question at least once every two weeks, and this question has been answered in detail by multiple posters already, I'm closing this thread.
 