Color is, as any decent photographer can tell you, very subjective. The camera that took the picture of those iPhones has a white balance setting (even if it's automatic) and other factors that affect how it represents colors (dark background vs. light background in JPG compression, etc.).
Color, as any hardware store can tell you, can be calibrated to match, within a certain tolerance.
Tolerance is the real issue here. How closely should one screen be expected to match another? I can assure you that Apple is well aware of the tolerances on their screens, including iPhone screens.
There may be reasons why Apple would be willing to release screens with such a wide tolerance range. For instance, they may have wanted to make sure they didn't sell out everywhere and miss out on revenue. As mentioned above, the quality is still good, it's just not equal across the board.
Lowering quality on all phones might have been a good way to level the playing field w/o creating supply issues. It's easier to make a great screen look like a good one than to make a new screen to replace a good one. I'm glad they didn't go that route (I don't think they did anyway).
Maybe it was a temporary issue that Apple has already resolved. We don't have any official word on this, do we?
Maybe it's a fluke? It's possible this is an example of a screen that is beyond the tolerance specs and slipped through QA on a Friday or Monday (bad days for manufacturing anywhere). Again, we don't have any way to know the tolerance range. I also suspect that we don't have any way to test it (as opposed to eyeballing it).
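For what it's worth, here's a rough sketch of how screen-to-screen color difference could actually be quantified instead of eyeballed, using the standard CIE76 delta-E formula. The two "measurements" below are made-up numbers standing in for readings you'd take with a colorimeter on each screen; the function itself is just the textbook distance in CIELAB space:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two CIELAB colors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical colorimeter readings (L*, a*, b*) of the same white
# shown on two different screens -- illustrative values, not real data
screen_a = (96.0, 0.5, 2.0)
screen_b = (94.5, 0.2, 5.5)

print(round(delta_e_76(screen_a, screen_b), 2))  # ~3.82
```

A delta-E in the 2–3 range is commonly cited as roughly the threshold where an average viewer starts to notice a difference, so a result like the one above would suggest a visible (if modest) mismatch. Without numbers like these, though, we're all just guessing.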
Wm
http://www.ablemac.com