Well, some people have a keener eye than others apparently.

Sigh. Yes, me, the guy who writes computer graphics code, does not have a keen eye. You caught me.

iTunes is 1080p. It IS HD. If it looks like SD to you, something is either wrong with your eyes or your setup.
 
Sigh. Yes, me, the guy who writes computer graphics code, does not have a keen eye. You caught me.

iTunes is 1080p. It IS HD. If it looks like SD to you, something is either wrong with your eyes or your setup.
Exactly.

I mean, I can understand someone seeing slight compression artifacts on scenes with heavy motion or something, but to say that 1080p iTunes rentals look like SD is just plain absurd.
 
Sigh. Yes, me, the guy who writes computer graphics code, does not have a keen eye. You caught me.

iTunes is 1080p. It IS HD. If it looks like SD to you, something is either wrong with your eyes or your setup.

I said it looks SD relative to the pristine quality of the Blu-Ray, not that it objectively looks SD.

These arguments are pointless though, as we are all in front of different screens, at different distances. I'm getting the feeling that the 30" ACD is one of the best screens out there for accentuating the differences between DVD, iTunes HD and Blu-Ray.

I don't care if iTunes is technically HD. It's clearly inferior to Blu-Ray. And I'm not saying this as an argument; I'm not trying to present evidence for it. I am reporting what I am seeing right before my eyes on my screen.
 
The resolution of the encode is not the only factor that affects quality: the codec and bitrate used can make a big difference, as can the actual settings used with the codec at encode time. In this case, the bitrate is the main differentiating factor, since both formats use the same codec (H.264) and resolution (1080p).
Blu-ray video generally averages above 20 Mbps and can spike to 40 Mbps, the format's ceiling for video, when the codec needs to.
iTunes 1080p video is capped at around 8 Mbps.
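For a rough sense of what that gap means, divide the bitrate by the number of pixels the encoder has to describe each second. A back-of-the-envelope sketch in Python; the bitrates are the nominal figures above, not measurements, and 24 fps film content is assumed:

```python
# Rough bits-per-pixel comparison for 1080p video at 24 fps.
# The bitrates below are the nominal figures quoted above, not measurements.

WIDTH, HEIGHT, FPS = 1920, 1080, 24  # 1080p24 film content
PIXELS_PER_SECOND = WIDTH * HEIGHT * FPS

def bits_per_pixel(bitrate_mbps: float) -> float:
    """Average encoded bits spent on each displayed pixel."""
    return (bitrate_mbps * 1_000_000) / PIXELS_PER_SECOND

for name, mbps in [("Blu-ray (~20 Mbps avg)", 20.0),
                   ("iTunes 1080p (~8 Mbps cap)", 8.0)]:
    print(f"{name}: {bits_per_pixel(mbps):.3f} bits/pixel")

# Blu-ray (~20 Mbps avg): 0.402 bits/pixel
# iTunes 1080p (~8 Mbps cap): 0.161 bits/pixel
```

In other words, by this estimate the Blu-ray encoder has roughly 2.5x the bit budget per pixel to work with, before any differences in encoder settings.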

Whether or not you can see the difference is another question, but the chart from earlier in the thread suggests that most viewing setups should expose the differences between 720p and 1080p.
The difference between Blu-ray and 1080p iTunes files will be visible in some setups, but not nearly as many.
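That chart's thresholds come from simple geometry: assuming the common rule of thumb of about one arcminute of visual acuity for normal (20/20) vision, you can compute how close you have to sit before single pixels, and hence the extra resolution, become resolvable. A sketch under that assumption:

```python
import math

def max_resolving_distance_ft(diagonal_in: float, horiz_pixels: int) -> float:
    """Farthest distance (feet) at which a 16:9 screen's individual pixels
    are still resolvable, assuming ~1 arcminute of visual acuity."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 screen geometry
    pixel_in = width_in / horiz_pixels
    one_arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(one_arcmin) / 12      # inches -> feet

for pixels, label in [(1280, "720p"), (1920, "1080p")]:
    print(f'61" {label}: pixels resolvable inside ~'
          f'{max_resolving_distance_ft(61, pixels):.1f} ft')

# 61" 720p:  pixels resolvable inside ~11.9 ft
# 61" 1080p: pixels resolvable inside ~7.9 ft
```

By this estimate, a 61" 1080p screen rewards viewing from inside roughly 8 feet, which is consistent with the 61"-at-6-feet setup described in the next post.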

In my setup (61" TV about 6 feet from the viewing position), I can see differences in detail between 1080p iTunes and Blu-ray, but they're not huge. iTunes seems to blow out highlights, though, and I often see banding in color gradients that isn't there on Blu-ray. And the sound isn't even close: Blu-ray wins that easily.

Still, iTunes isn't bad, especially since they switched to 1080p H.264 encodes.
 
The resolution of the encode is not the only factor that affects quality: the codec and bitrate used can make a big difference, as can the actual settings used with the codec at encode time. In this case, the bitrate is the main differentiating factor, since both formats use the same codec (H.264) and resolution (1080p).
Blu-ray video generally averages above 20 Mbps and can spike to 40 Mbps, the format's ceiling for video, when the codec needs to.
iTunes 1080p video is capped at around 8 Mbps.

However, iTunes uses a different H.264 profile, so the iTunes encoding is much better per Mbps.

http://arstechnica.com/apple/2012/03/new-itunes-1080p-looks-good-through-better-h264-compression/
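If you want to check what profile, level, and bitrate a particular file actually carries, ffprobe (part of the FFmpeg project) will report it. A minimal sketch; the filename is a placeholder:

```python
import json
import subprocess

def video_stream_info(path: str) -> dict:
    """Profile, level, and bitrate of the first video stream, via ffprobe."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,profile,level,bit_rate,width,height",
         "-of", "json", path],
        capture_output=True, text=True, check=True)
    return json.loads(result.stdout)["streams"][0]

info = video_stream_info("movie.m4v")  # placeholder filename
print(info.get("codec_name"), info.get("profile"), info.get("bit_rate"))
```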
 
Apple is certainly using state-of-the-art compression capabilities, because they are in a race against reality to prove to themselves that physical media is already dead.

But I think the disagreements about the discrepancy in quality really have a lot to do with viewing on a TV vs. a high-resolution computer screen. I've said it before, and I'll say it again: sitting near a 30" ACD, the difference is drastic.

If any of you who think it's impossible to see the difference look at them side by side on a 30" ACD and then still claim that the difference is not visible, I'd be very interested to hear about it.

Also, I suspect there are some people who were contemplating doing what I did, but ended up amassing a large film collection in iTunes-HD instead, and so they will deny a major difference to the ends of the earth for the sake of their own happiness.
 
I don't think anyone here is really debating that there is no difference at all, but rather, that the differences aren't big enough for many of them to trump the convenience factor. I'm not quite in the camp yet with video, but a lot of people are. And Apple's entire distribution model is BUILT on this.

We live in a world now of Apple TVs and Rokus. Everything from the cloud... :p


The same thing happened with music years ago. Heck, I don't even remember the last time I put a CD in my car.
 
I don't think anyone here is really debating that there is no difference at all, but rather, that the differences aren't big enough for many of them to trump the convenience factor. I'm not quite in the camp yet with video, but a lot of people are. And Apple's entire distribution model is BUILT on this.

We live in a world now of Apple TVs and Rokus. Everything from the cloud... :p


The same thing happened with music years ago. Heck, I don't even remember the last time I put a CD in my car.

Well, some people here are indeed debating that there is no difference at all; that's what elicited my agitated reaction.

In any case, we'll see what the future holds. People might realize they just don't enjoy having everything in the cloud, and that it's more secure and enjoyable for them to use physical media. We are still in the very first sliver of the digital age; of course there will be a period of online streaming, but that doesn't mean it will survive long term.
 
Some people like to buy and read physical books to keep on a shelf. I like having a physical copy of a film that I can display on a shelf as well.

Personally, I don't care for the convergence of TV and computing/internet. My new 50" plasma was also heaps cheaper because it didn't have all that extra crap on it.

I work on my computer round the clock. When I sit down to watch a movie or TV show, I want to be as far away from one as I can. Browsing DVDs and Blu-rays on a shelf is also more fun than looking through a playlist or folder. Unless you're too lazy to get your butt out of your chair.
 
Some people like to buy and read physical books to keep on a shelf. I like having a physical copy of a film that I can display on a shelf as well.

Personally, I don't care for the convergence of TV and computing/internet. My new 50" plasma was also heaps cheaper because it didn't have all that extra crap on it.

I work on my computer round the clock. When I sit down to watch a movie or TV show, I want to be as far away from one as I can. Browsing DVDs and Blu-rays on a shelf is also more fun than looking through a playlist or folder. Unless you're too lazy to get your butt out of your chair.

These are just a few of the many reasons why physical media very well might prevail in the end. I am with you on every one, plus more.

Edit: Just for the hell of it, I predict the successor to Blu-Ray, which will contain films at 4K resolution, will be in the early-adoption phase by 2020.
 
Sigh. Yes, me, the guy who writes computer graphics code, does not have a keen eye. You caught me.

iTunes is 1080p. It IS HD. If it looks like SD to you, something is either wrong with your eyes or your setup.

You forget that the info travels a very long path from the Apple server to your computer; countless tiny distortions occur in between, from A to H. The path from the Blu-ray in your home to your TV is far shorter, with much less chance of digital jitter. I suspect that if you were watching the same movie at Apple's server location, the difference would be harder to detect.

BTW, how do you know that the one with bad eyes is the OP and not yourself? You don't. I have partial colorblindness, and I also have implants in both eyes for cataracts. My vision for distance is 20/12, yeah, that's right. I see distant objects better than 99% of the human race, but ask me whether the far-off traffic light that I see is red or yellow: I can only guess, as my colorblindness makes it tough for me to tell.

When I drive my car and there is a flashing yellow in the distance, I see it faster than 99% of the world's population. However, I have to treat it as if it were flashing red, since I am colorblind in that color range. For all you know, the OP may have great color vision. There are various colorblindness tests, and he may score really well compared to you.

I remember driving with a cousin on a highway a few years ago. I told him to stop, there was a car fire ahead. He needed to drive a few hundred feet farther before he could see the fire, as did all the others in the car.
 
You forget that the info travels a very long path from the Apple server to your computer; countless tiny distortions occur in between, from A to H. The path from the Blu-ray in your home to your TV is far shorter, with much less chance of digital jitter.

Digital is digital; there is no loss of quality from sending the bits a long way. Look at those Mars lander photos and video, the Hubble telescope... The bits are transferred (downloaded) to your local player, either via a disc or a file, and then sent a short distance to your monitor.
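That bit-exactness is easy to check in practice: hash the file at both ends and compare digests. A minimal sketch; the path is hypothetical, and you would need the publisher's reference digest to compare against:

```python
import hashlib

def sha256_of(path: str, chunk: int = 1 << 20) -> str:
    """SHA-256 of a file, read in 1 MiB chunks so large videos fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# If the digest of the downloaded file matches the digest of the source file,
# the two are bit-for-bit identical -- distance traveled is irrelevant.
print(sha256_of("downloaded_movie.m4v"))  # hypothetical path
```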

The video distributor has a wide variety of compression choices to save space and bandwidth, however, and that is where many digital artifacts are generated, with their differences in look and feel (many others are generated in the monitor itself). 1080p from different intermediary sources is not necessarily the same. BR is also compressed to fit on the disc, and the distributor has several choices available. Artifacts are introduced by the video camera used, too. You can see the difference in look and feel between the various OTA broadcasts, each having its favorite video compression with a different look. Given the same master, the video from a 10GB BR disc may have more artifacts than a 10GB video file on your hard drive, however that 10GB file got there.

There is nothing analog from the moment the camera captures the scene until it gets to the monitor; it's not the same as vinyl records. Sony likes to market BR as the best thing since sliced bread, but it's just marketing; it's well known the video results are full of flaws. Life is a compromise, so just how many flaws are you willing to live with?

Now, the rest of the family couldn't care less (as could the vast majority of viewers), but I'm a technocrat and enjoy those crisp lines and deep color that quality HD video can bring out, regardless of source. You can find plenty of poor-quality video on BR out there too, and the sound is generally terrible. Very few releases are close to my view of pristine. Garbage in is garbage out, no matter how high quality the bits are. My 60-inch Kuro plasma brings out video flaws and separates the men from the boys nicely, things you are not going to see on small 50-inch screens. Now don't get me going about how monitors destroy quality video, and how most so-called video purists have their monitors set to distort the video, just because the video feels better to them...

It's nice to have choices... and then you get into choices of codecs... on and on. Instead of burning discs, I save the HD footage I shoot to an iTunes library and play it via my Apple TV, very nicely. I think alternate methods to distribute video will get better and will soon supplant BR, but there is marketing, egos, politics, and business cases... and DRM. Most of the reason for BR and HDMI may be the anti-piracy features.
 
Digital is digital; there is no loss of quality from sending the bits a long way. Look at those Mars lander photos and video, the Hubble telescope... The bits are transferred (downloaded) to your local player, either via a disc or a file, and then sent a short distance to your monitor.

The video distributor has a wide variety of compression choices to save space and bandwidth, however, and that is where many digital artifacts are generated, with their differences in look and feel (many others are generated in the monitor itself). 1080p from different intermediary sources is not necessarily the same. BR is also compressed to fit on the disc. Artifacts are introduced by the video camera used, too. You can see the difference in look and feel between the various OTA broadcasts, each having its favorite video compression with a different look. Given the same master, the video from a 10GB BR disc may have more artifacts than a 10GB video file on your hard drive, however that 10GB file got there.

There is nothing analog until it gets to the monitor; it's not the same as vinyl records. Sony likes to market BR as the best thing since sliced bread, but it's just marketing; it's well known the video results are full of flaws. So just how many flaws are you willing to live with?

Now, the rest of the family couldn't care less (as could the vast majority of viewers), but I'm a technocrat and enjoy those crisp lines and deep color that quality HD video can bring out, regardless of source. You can find plenty of poor-quality video on BR out there too, and the sound is generally terrible. Very few releases are close to my view of pristine. Garbage in is garbage out, no matter how high quality the bits are. My 60-inch Kuro plasma brings out video flaws and separates the men from the boys nicely, things you are not going to see on small 50-inch screens. Now don't get me going about how monitors destroy quality video, and how most so-called video purists have their monitors set to distort the video, just because the video feels better to them...

It's nice to have choices... and then you get into choices of codecs... on and on. Instead of burning discs, I save the HD footage I shoot to an iTunes library and play it via my Apple TV, very nicely.


I AGREE with a lot of what you say, but have you ever heard of packet loss?

Have you ever had your internet connection throttle down?

And lastly, have you ever lost internet service for an hour or two or more?

Hey, I stream movies plus TV shows from Netflix and still rent from Blockbuster Online for Blu-ray.

I have fewer than 20 movies on HDDs. For movies, I prefer to stream or rent a disc. Both methods have flaws.
 
I AGREE with a lot of what you say, but have you ever heard of packet loss?

Have you ever had your internet connection throttle down?

Trying to view a live stream has its issues, just like viewing HD cable, satellite, or OTA broadcasts. Broadcast is good for immediate gratification, I think.

I was trying to compare viewing a downloaded video to BR, for those who claim BR is pristine or the best it can get. That's a Sony marketing thing.

To me, internet streaming is a broadcast. Nothing wrong with that; some of us want to watch sporting events or news in real time. But my on-demand movie downloads "look" much better to me than streamed video. The download protocol corrects for packet loss and throttling, as well as other transmission effects.
 
Oh my god! This has gotten completely out of hand.

That it has.

In reading through the dialog, it appears that a couple of points got missed.

First, any "A vs B" comparison that's not done blind is, unfortunately, suspect: we can't tell if the claimed differences are true, or if they are because the viewer knows which one is which and is introducing a Placebo Effect.

How can you tell me that I can't see a difference when I'm looking at them right here and one is blurry relative to the other one?!

Second, I think that there's a difference between the question of the human eyeball being able to resolve a 1080 vs 720 image (screen size X at distance Y of resolution Z), and the quality of an image at resolution Z ... ie, stuff like a "blurry" image that was introduced by data compression.

Of these two factors, however, I suspect that the Placebo Effect is more dominant here. I've done enough work on human research to know better than to pass over the obvious! :)


-hh
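For anyone who wants to run the blind comparison -hh describes, the core of it is just randomizing which side each source appears on and scoring the guesses. A bare-bones sketch; the image display is left to a callback you supply, and it must show the frames without revealing their file names:

```python
import random

def blind_trial(pairs: list[tuple[str, str]], show) -> float:
    """Score a blind A/B test over (itunes_frame, bluray_frame) screenshot
    pairs. Each pair is shown in a random left/right order via the supplied
    show(left_path, right_path) callback, which must display the images
    without revealing their file names. Returns the fraction of correct calls."""
    correct = 0
    for itunes_file, bluray_file in pairs:
        sides = [("iTunes", itunes_file), ("Blu-ray", bluray_file)]
        random.shuffle(sides)
        left, right = sides
        show(left[1], right[1])
        guess = input("Which side is the Blu-ray? [left/right] ").strip().lower()
        chosen = left if guess == "left" else right
        correct += chosen[0] == "Blu-ray"
    return correct / len(pairs)

# score = blind_trial(pairs, show=my_side_by_side_viewer)  # viewer supplied by you
# Scores near 0.5 over many pairs mean the viewer is effectively guessing.
```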
 
You forget that the info travels a very long path from the Apple server to your computer; countless tiny distortions occur in between, from A to H. The path from the Blu-ray in your home to your TV is far shorter, with much less chance of digital jitter.

What the...

Digital jitter doesn't exist. That's one of the features of a digital signal: there is no jitter.

Signal distortions don't distort a digital signal. Jitter only applies to an analog signal, and iTunes isn't (obviously) analog. Packet loss also doesn't affect digital signals. Digital compressed signals are all or nothing: either you're getting it in the original quality, or you don't get it at all.

What is this? The dark ages? This is basic video stuff. Also iTunes 1080p is downloadable, not just streaming.
 
What the...

Digital jitter doesn't exist. That's one of the features of a digital signal: there is no jitter.

Signal distortions don't distort a digital signal. Jitter only applies to an analog signal, and iTunes isn't (obviously) analog. Packet loss also doesn't affect digital signals. Digital compressed signals are all or nothing: either you're getting it in the original quality, or you don't get it at all.

What is this? The dark ages? This is basic video stuff. Also iTunes 1080p is downloadable, not just streaming.

You can believe that a download is a perfect, 100 percent copy that matches what was sent from a server in Apple land to your computer, with never any corruption of data. How do you prove that? You don't.

I can tell you it is not perfect, and I can't prove that either. To prove it is perfect, you need direct access to Apple's server copy. You need to do a bit-by-bit dump of the entire movie. You then need to do the same with your copy on your HDD.

If they are perfect copies, you would only be correct in that the info is exactly the same.

That still does not mean much, since the second part of my original post means the viewer's own eyesight comes into question. Thirdly, I would argue that the best Blu-ray player does just that: play Blu-rays, while a computer is doing a lot of other tasks.
 
You can believe that a download is a perfect, 100 percent copy that matches what was sent from a server in Apple land to your computer, with never any corruption of data. How do you prove that? You don't.

Yes, you can. If the frame isn't perfect, then it can't be decompressed, and you won't see anything on the screen at all.

I can tell you it is not perfect, and I can't prove that either.

You can't prove it because you're wrong.

You don't need access to Apple's data. This is how H.264 works. If the data doesn't come through perfectly, you won't see anything at all.

Learn how H.264 works please. H.264 can only produce either a perfect signal, or no signal at all. There is no middle ground, and no allowance for distortion. If there is any distortion, H.264 just breaks and you get no picture at all.

What you're claiming is like saying that if a zip file gets "distorted", then your Word document comes out in a different color.
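The zip analogy does illustrate the point about detection: archive formats carry per-file checksums, so a flipped byte is reported as corruption rather than silently changing the content. A small self-contained demonstration using Python's zipfile module:

```python
import io
import zipfile

# Build a small zip in memory (stored, not compressed, so the payload's
# byte offsets are easy to reason about), then flip one byte in transit.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_STORED) as z:
    z.writestr("doc.txt", b"hello" * 1000)

data = bytearray(buf.getvalue())
data[2500] ^= 0xFF  # offset 2500 sits inside doc.txt's stored bytes

try:
    with zipfile.ZipFile(io.BytesIO(bytes(data))) as z:
        z.read("doc.txt")   # read() verifies the per-file CRC-32
    print("read OK")
except zipfile.BadZipFile as err:
    print("corruption detected:", err)  # -> Bad CRC-32 for file 'doc.txt'
```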

Again, digital signals do not allow distortion.

Here, have some basic reading:
http://en.wikipedia.org/wiki/Digital#Properties_of_digital_information

Errors: Disturbances (noise) in analog communications invariably introduce some, generally small deviation or error between the intended and actual communication. Disturbances in a digital communication do not result in errors unless the disturbance is so large as to result in a symbol being misinterpreted as another symbol or disturb the sequence of symbols. It is therefore generally possible to have an entirely error-free digital communication. Further, techniques such as check codes may be used to detect errors and guarantee error-free communications through redundancy or retransmission. Errors in digital communications can take the form of substitution errors in which a symbol is replaced by another symbol, or insertion/deletion errors in which an extra incorrect symbol is inserted into or deleted from a digital message. Uncorrected errors in digital communications have unpredictable and generally large impact on the information content of the communication.
 
That it has.

In reading through the dialog, it appears that a couple of points got missed.

First, any "A vs B" comparison that's not done blind is, unfortunately, suspect: we can't tell if the claimed differences are true, or if they are because the viewer knows which one is which and is introducing a Placebo Effect.



Second, I think that there's a difference between the question of the human eyeball being able to resolve a 1080 vs 720 image (screen size X at distance Y of resolution Z), and the quality of an image at resolution Z ... ie, stuff like a "blurry" image that was introduced by data compression.

Of these two factors, however, I suspect that the Placebo Effect is more dominant here. I've done enough work on human research to know better than to pass over the obvious! :)


-hh

I'm offering $10,000 to the first person who can catch me confused over which is which on a 30" ACD. You're welcome to have me over to your house (as long as you have a 30" ACD), and put the iTunes 1080p and Blu-Ray versions of a film side by side in a blind test for me. If I get even 1 out of 100 screenshot comparisons wrong, you get $10,000.

Before I get hot-headed again, I'd like to simply ask: has anybody compared them side by side on a 30" ACD and is still claiming that the difference is unnoticeable?
 
Well, some people here are indeed debating that there is no difference at all; that's what elicited my agitated reaction.
Nope. Reading is fundamental. No one here is debating that there's no difference at all. All along, I've simply said that your assessment that iTunes 1080p vs. Blu-ray 1080p is similar to SD vs. HD is just absolutely absurd.

I'm offering $10,000 to the first person who can catch me confused over which is which on a 30" ACD. You're welcome to have me over to your house (as long as you have a 30" ACD), and put the iTunes 1080p and Blu-Ray versions of a film side by side in a blind test for me. If I get even 1 out of 100 screenshot comparisons wrong, you get $10,000.

Before I get hot-headed again, I'd like to simply ask: has anybody compared them side by side on a 30" ACD and is still claiming that the difference is unnoticeable?
You're literally comparing one movie: The Shawshank Redemption. Try comparing the 1080p Blu-ray vs. the 1080p iTunes version of a Pixar film, or of something that isn't a vintage period piece with intentional grain, shot on film and made to look like it takes place in the 1940s. If you do that, you may want to lower your bet to $10 rather than $10,000, because even in a pointless bet like this the difference will be far less obvious, even with the placebo effect you're ignoring.
 
That it has.

In reading through the dialog, it appears that a couple of points got missed.

First, any "A vs B" comparison that's not done blind is, unfortunately, suspect: we can't tell if the claimed differences are true, or if they are because the viewer knows which one is which and is introducing a Placebo Effect.



Second, I think that there's a difference between the question of the human eyeball being able to resolve a 1080 vs 720 image (screen size X at distance Y of resolution Z), and the quality of an image at resolution Z ... ie, stuff like a "blurry" image that was introduced by data compression.

Of these two factors, however, I suspect that the Placebo Effect is more dominant here. I've done enough work on human research to know better than to pass over the obvious! :)


-hh

That may be why the video pros have developed objective tool sets that take all the placebo and wishful-thinking effects out. These tools are very effective at determining video quality in terms of resolution, color accuracy, and ability to present black, and at isolating the effects one component or another may have. It is well known that certain distortions provide pleasant effects for some individuals; Bose is well noted for exploiting the audio version of that phenomenon.
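PSNR is about the simplest of those objective metrics. A minimal sketch with NumPy, assuming two same-sized 8-bit frame grabs to compare:

```python
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray) -> float:
    """Peak signal-to-noise ratio between two 8-bit images, in dB.
    Higher means closer to the reference; identical frames give infinity."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10 * np.log10(255.0 ** 2 / mse)

# e.g. compare a Blu-ray frame grab against the matching iTunes frame:
# print(psnr(bluray_frame, itunes_frame))
```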

The bottom line here is that digital video is digital video; a 50GB master file is the same bits wherever it's stored. But a 50GB master digital movie is going to look a little different when compressed onto a 20GB BR disc than when compressed to 5GB for streaming or 1GB for an iPod. So perhaps you should be thinking about downloading that 50GB master to your computer. That way you can view the video without any compression artifacts. There are a lot of things that can go wrong in how your monitor converts the frames for display, however.

A minor point is that there are bit jitter and multipath effects in over-the-air digital video. But as mentioned, when these effects get to the point where the receiver cannot recover the bits, the receiver will simply drop the frame; there is either good video or no video (there is a lot of redundancy and error correction behind the scenes). Now, deep in the bowels of your monitor there may be some bit jitter that affects what you see on the screen as the monitor converts the digital signal to pixels and light/color. That monitor-introduced jitter would be the same regardless of the quality of the digital video, however.
 
I'm offering $10,000 to the first person who can catch me confused over which is which on a 30" ACD. You're welcome to have me over to your house (as long as you have a 30" ACD), and put the iTunes 1080p and Blu-Ray versions of a film side by side in a blind test for me. If I get even 1 out of 100 screenshot comparisons wrong, you get $10,000.

Before I get hot-headed again, I'd like to simply ask: has anybody compared them side by side on a 30" ACD and is still claiming that the difference is unnoticeable?

You need to be more specific than saying 1080p, as that's just the resolution. You need to at least also talk about frame rate and bit depth.

I think iTunes 1080p video is currently not all that good and shows some compression artifacts, but most of the DirecTV downloads are pretty good. There was a time just a bit ago when 720p was good enough for Apple. Hopefully they will keep improving, where BR may have reached its limit. A trained eye can tell BR video by the artifacts introduced by BR disc encoding; it is unique. I've made a few $$ myself that way :).

I like to use the Transformers movie to demonstrate video and audio capability. That scene with a narrow electrical wire overhead against a light sky will be jagged or nonexistent on a 30-inch ACD or 60-inch Kuro, depending on the pressing of the BR disc (BR quality varies). Now, when I play the same scene from an HD DVD, the wire is smooth and seamless. I love to show that to BR fanatics and watch them sweat, make up excuses, rationalize... they just can't believe it :)

Anyway, a master video can be distributed by many methods, each of which will have its effects. BR will have some artifacts; a 1080p video file sitting on a server somewhere may not. How does Netflix video look?

IMHO, BR will be dead in 5 years, replaced by online distributors of higher-quality downloads for the fanatics, and cheap ones for the not-so-critical viewer...
 
I've just received my first shipment of Blu-rays from Amazon, and I've taken peeks at Attack of the Clones, The Passion of the Christ, and Michael Clayton.

Remember how confident I was, based only on the appearance of the Shawshank Blu-ray, that there was no way the iTunes version could look as good? I feel the same way about all the Blu-rays I've looked at so far (I've also watched The Duchess and Elizabeth within the last few days).

I don't think the pristine look of the Blu-ray, and the clearly noticeable difference from iTunes 1080p, applies only to films like Shawshank that have intentional grain.
 
You do know Ars has already done a direct comparison, right?

http://arstechnica.com/apple/2012/03/the-ars-itunes-1080p-vs-blu-ray-shootout/

Considering how big those pictures were blown up, you'd need a big display to tell the difference.

Good commentary on the comparison too. Even has nice direct comparison screenshots.
http://arstechnica.com/civis/viewtopic.php?f=2&t=1169565

Like the Ars article notes, the only reason I ever buy Blu-ray is when I want the surround sound. If you don't have surround sound, what's the point?

BRD is a DRM'd format anyway. It's not like you've bought a disc that will even necessarily play forever...
 