I just noticed today that the 16GB model only has 14.6GB available :mad:

Where did the missing 1.4GB go?

The 8GB only has 7.3GB...
 
Unfortunately 16GB no longer means 16GB; it now means 16 billion bytes, and when that number is converted to how a GB is traditionally calculated, the figure shrinks. If you aren't familiar with the calculation, it goes as follows:

1KB = 1,024 bytes
1MB = 1,024 KB
1GB = 1,024 MB

So a GB is really 1,073,741,824 bytes, and a 16GB device (if it were truly 16GB) would hold 17,179,869,184 bytes.
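If it helps to see that worked out, here's a quick sketch of the arithmetic (just illustrative Python, not anything Apple ships):

```python
# Illustrative only: walk a marketed "16GB" (16 billion bytes) down the
# binary ladder the way an OS counts it.
marketed_bytes = 16 * 10**9        # what the box calls 16GB

kb = marketed_bytes / 1024         # binary kilobytes
mb = kb / 1024                     # binary megabytes
gb = mb / 1024                     # binary gigabytes

print(f"{marketed_bytes:,} bytes = {gb:.1f}GB as the OS sees it")
# -> 16,000,000,000 bytes = 14.9GB as the OS sees it
```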

Unfortunately this has become common throughout the computer industry, and you will now see hard drives labeled "250GB means 250 billion bytes," or something to that effect.

On top of that, the OS takes up room as well, further reducing the available capacity.

Hope that helps.
 
What's the problem? They say exactly what you get: 250 gigabytes, and giga = 1,000,000,000, so you get 250,000,000,000 bytes.
 
Yes "giga" = 1,000,000,000 but a gigabyte as a standard is a binary number and is 1024^3 which is what I said above. Are companies being deceptive by doing this? Yes and no. They are not following the historical standards for the term, nor are they following the current standards of the International Electrotechnical Commission (IEC) , the Institute of Electrical and Electronics Engineers (IEEE), the International Committee for Weights and Measures (CPIM) and the National Institute of Standards and Technology (NIST).

So yes, when they say a drive is 250GB, you are getting 250,000,000,000 bytes but you are NOT getting 250GB as defined but the above organizations nor as understood by anyone who has been in the computer industry for any length of time.

You can find more here: http://en.wikipedia.org/wiki/Gigabyte
 
Yes "giga" = 1,000,000,000 but a gigabyte as a standard is a binary number and is 1024^3 which is what I said above. Are companies being deceptive by doing this? Yes and no. They are not following the historical standards for the term, nor are they following the current standards of the International Electrotechnical Commission (IEC) , the Institute of Electrical and Electronics Engineers (IEEE), the International Committee for Weights and Measures (CPIM) and the National Institute of Standards and Technology (NIST).

So yes, when they say a drive is 250GB, you are getting 250,000,000,000 bytes but you are NOT getting 250GB as defined but the above organizations nor as understood by anyone who has been in the computer industry for any length of time.

You can find more here: http://en.wikipedia.org/wiki/Gigabyte

How are they not following standards? They use the SI standard, which is the most important standard of all. GB = gigabyte = 1,000,000,000 bytes. If people are too stupid to know their SI prefixes, too bad; it's common knowledge. People bitching about this should grow up and go back to high school so they might actually learn something.
 
Wait till drives get even bigger and people find that on a 1TB drive they only get about 930GB of space. It always cracks me up how cheated people feel.
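For anyone who wants to check that math, a throwaway sketch (illustrative Python, nothing more):

```python
# The arithmetic behind "a 1TB drive only shows ~930GB": the gap grows by
# roughly 2.3% with every prefix step up the ladder.
for n, prefix in enumerate(["KB", "MB", "GB", "TB"], start=1):
    ratio = 1000**n / 1024**n
    print(f"1 decimal {prefix} = {ratio:.4f} binary {prefix} "
          f"({(1 - ratio) * 100:.1f}% smaller)")

print(f"A 1TB drive as the OS reports it: {10**12 / 1024**3:.0f}GB")  # ~931GB
```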
 
Huh? People should go back to high school to learn something that has only changed in the last 5-10 years? For decades, drives were measured and marketed based on the binary calculation I noted above.

This has nothing to do with "learning something," other than learning that computer and HDD manufacturers figured out they could advertise their wares as having more capacity than they would historically have been recognized as having. It's called marketing. On top of that, reporting capacity this way caters to those who don't understand the binary calculation.

I'm not complaining about this; I have understood how and why it was happening from the day it started. But I think it is weak of manufacturers to cater to the lowest common denominator, especially as long as operating systems continue to report drive capacity in binary.
 
It would be nice if OS X reported space using the IEC spec, so iTunes and such would show 14.6GiB while the device box could continue to display the SI figure of 16GB.

Scratch that: pick one and stay consistent. Either print the binary figure on the product box or alter the OS to report the decimal value... neither will happen.

Besides, if they did this we would start to see threads on "What is the difference between 16GB and 16GiB?"
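Just to illustrate the idea, here's a hypothetical helper (made up for this post, not any real OS X or iTunes API) that renders the same byte count under both specs:

```python
# Hypothetical dual-labeling helper: SI GB as printed on the box,
# IEC GiB as an OS could report it.
def dual_label(n_bytes: int) -> str:
    si_gb = n_bytes / 1000**3     # SI spec (decimal)
    iec_gib = n_bytes / 1024**3   # IEC spec (binary)
    return f"{si_gb:g}GB / {iec_gib:.1f}GiB"

print(dual_label(16 * 10**9))  # -> 16GB / 14.9GiB
print(dual_label(8 * 10**9))   # -> 8GB / 7.5GiB
```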
 
I understand the 1,024 scale, but why does the OS on the 8GB model take 0.7GB while on the 16GB model it doubles to 1.4GB?
 
Ok, let's try this and see if it helps. I have a 32GB iPod Touch as well as a 16GB iPhone.

First translate the devices' capacities from the marketing/SI GB to the IEC GiB, i.e. from decimal to binary. Then compare that against the space reported in iTunes.

32GB iPod Touch = 29.8GiB = 29.8GiB available in iTunes (0.0GiB)
16GB iPhone = 14.9GiB = 14.6GiB available in iTunes (-0.3GiB)
8GB iPhone = 7.5GiB = 7.3GiB available in iTunes (-0.2GiB)

You will notice that between the 8 and the 16 there is an extra drop of 0.1GiB, and for some reason no drop at all for the iPod. So the two phones differ by only 0.1GiB, or about 100MiB.

Based on this, I don't see a real difference in storage between them.
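Here's the same comparison as plain arithmetic (a quick Python sketch; the iTunes figures are just the ones quoted in this thread):

```python
# Marketed (decimal) capacity converted to binary GiB, minus what iTunes
# reportedly shows; the remainder is OS/firmware/partition overhead.
devices = {
    "32GB iPod Touch": (32, 29.8),  # (marketed GB, GiB shown in iTunes)
    "16GB iPhone":     (16, 14.6),
    "8GB iPhone":      (8, 7.3),
}

for name, (marketed_gb, itunes_gib) in devices.items():
    true_gib = marketed_gb * 10**9 / 1024**3  # decimal GB -> binary GiB
    missing = true_gib - itunes_gib
    print(f"{name}: {true_gib:.2f}GiB binary, {itunes_gib}GiB in iTunes, "
          f"{missing:.2f}GiB unaccounted for")
```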
 
Of course it has something to do with learning something. Giga has been the SI prefix for 1,000,000,000 since 1960. So if people cannot even learn SI prefixes that have been in use since 1960, they really should go back to school.
 
To the thread creator: you really need to understand how storage and operating systems work together.

Everyone here has pretty much answered it for you, with the exception of those who replied with "I still don't get it, why has Apple cheated me - why, oh why???".
 
And the binary way of measuring data (2, 4, 8, 16, 32, 64, 128, 256, 512, 1024...) has been in use just as long as the SI prefixes, and until recently it was how storage was marketed (i.e. a 250GB hard drive was actually 268,435,456,000 bytes, not 250,000,000,000), so don't act like this has always been the norm in the computer or telecommunications industry, because it has not.

And to the question by THEBEARMAN of how much space is actually on a 16GB iPhone vs. a 32GB iPod Touch: your calculations are off.

8GB = 7.451GiB (I don't know if I can bring myself to call a real GB a "GiB")
16GB = 14.901GiB
32GB = 29.802GiB

I don't have an iPod Touch, so I will take your word for it that it reports 29.8GiB. If that is the case, then there is a discrepancy between the iPhones and the iPod Touch: the Touch seems to recognize all of its available space, whereas the iPhones show 0.15GiB less for the 8GB model and 0.3GiB less for the 16GB model.

I am not quite sure why there is a discrepancy between the two iPhones - there could be many reasons - but I would guess the difference between the Touch and the iPhone has to do with how Apple partitions the space, and with the fact that you can enable disk mode on the Touch but not on the iPhones (though I am just guessing on that).
 
Who cares if it's a norm or not? Giga means 1,000,000,000, so saying 250 gigabytes is correct. People will know they are not getting 250GiB, so what is the problem?
 
Ummmm, clearly they don't, otherwise this thread and others like it wouldn't exist. Again, as I have said before, I am not complaining. Other people are, because they don't understand, and quite frankly - no offense intended - I'm not sure you do either, otherwise you wouldn't be saying things like:

Giga has been the SI prefix for 1,000,000,000 since 1960. So if people cannot even learn SI prefixes that have been in use since 1960, they really should go back to school.

Giga, in terms of computers and telecommunications, has not meant 1,000,000,000 since the 1960s. I don't know what school you went to, but the multiple computer, IT and IS classes I took while obtaining my degree all taught me that 1GB = 1024MB, and so on. In fact, when it comes to other parts of the computer (memory), 1GB still means 1024MB.

It's not that people don't understand that giga = 1,000,000,000. What they don't understand is why a drive advertised as 16GB shows up on their computer as only 14.9GB. This is what leads to the confusion and the feeling of having been misled. I think it is somewhat deceptive, but then again, what marketing isn't deceptive in some way? Companies will always have marketing that puts the best light on their product.

Unfortunately, there doesn't appear to be an easy resolution to the discrepancy. Either OSes and other software have to be rewritten to report storage capacities and file sizes according to the SI standard, or component manufacturers need to start labeling capacities in the units set by the IEC, IEEE, CIPM and NIST. I don't see either happening anytime soon.
 
1000^3 / 1024^3 = 0.9313225746154785 Gibibyte for every Gigabyte

Physical space is displayed in Gibibytes (SI) because it appears larger while it's actually measured in Gigabytes (binary).
 
Maybe I am misreading your post, but I think you have the labels backwards. The IEC has recommended that what has traditionally been known as a gigabyte (1024^3 bytes) now be called a gibibyte, a contraction of "giga binary byte." So capacity is marketed in decimal gigabytes (SI) because the number appears larger, while the OS measures in binary gibibytes, and 1000^3 / 1024^3 = 0.9313225746154785 gibibytes for every gigabyte.

If you look into this further, the IEC is now recommending the following:

1024 bytes = 1 kibibyte (KiB)
1024 KiB = 1 mebibyte (MiB)
1024 MiB = 1 gibibyte (GiB)
1024 GiB = 1 tebibyte (TiB)
1024 TiB = 1 pebibyte (PiB)
1024 PiB = 1 exbibyte (EiB)
1024 EiB = 1 zebibyte (ZiB)
1024 ZiB = 1 yobibyte (YiB)

It all seems pretty silly to me that they want to change it.
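The whole ladder is just successive powers of 1024; a throwaway Python loop reproduces the list above:

```python
# Each IEC unit is 1024 of the one before it.
units = ["KiB", "MiB", "GiB", "TiB", "PiB", "EiB", "ZiB", "YiB"]
for i, unit in enumerate(units, start=1):
    print(f"1 {unit} = 1024^{i} bytes = {1024**i:,} bytes")
```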
 
I really don't see the problem; there is a world outside the computer, you know. Giga has always meant 1,000,000,000. It's not a computer term; it just means 10^9. What's wrong is when they say 1GB of RAM and it is really 1024MB. When they start mixing the two, it gets confusing.
 
Tuff - thanks. My apps were behaving like on a jailbroken 2G phone with a full partition: instant crash. I resolved it by syncing with another computer's iTunes.
 
You guys need to give it up - why is there still a topic on this matter?

Here's a better idea: go post this subject on every forum related to hard drives and computers and see what happens...

 