Sorry, but I can't support this anymore. I just plugged a new HDMI cord into my Apple TV because my old cord didn't support HDR. The idea that all HDMI cords are equal is a myth.

Did you bother to read what I wrote? That’s exactly what I said.

I’ll help you by quoting myself: “As long as it’s the latest specs, a $10 cable works just as well as an $85 one.”
 
I leave mine on 4K SDR until I play a 4K HDR movie. When I enable that on the Apple TV and then set my projector to HDR mode, I am extremely happy with the picture quality.
I tried doing that; there is a huge difference in brightness.
My HT lights were off, I was playing with the menu on the Apple TV 4K, and when I decided to turn off the HDR option, oh boy, the gray menu almost blinded* me.
(*Exaggerating for audience entertainment, lol)
Do you think turning off HDR is a must for non-HDR content?
4K streaming at 20 to 25 Mbps is about the same rate as SDR Blu-ray discs. That's why 4K streaming looks like Blu-ray. UHD discs are roughly twice the bitrate of SDR Blu-ray.
That's what I was saying: I think they look like physical Blu-ray discs, and maybe a little better, which is a good thing, but it does not compare to physical UHD HDR Blu-rays.
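For a rough sense of scale, here is a back-of-the-envelope comparison; the bitrates are the ballpark figures mentioned above, not measurements:

```python
# Rough data usage for a 2-hour movie at a few illustrative video bitrates.
MOVIE_HOURS = 2
BITRATES_MBPS = {
    "4K stream (reported ballpark)": 25,
    "SDR Blu-ray disc (typical)": 25,
    "UHD Blu-ray disc (typical)": 50,
}

for label, mbps in BITRATES_MBPS.items():
    gigabytes = mbps * MOVIE_HOURS * 3600 / 8 / 1000  # Mbit -> GB
    print(f"{label} at {mbps} Mbps: ~{gigabytes:.0f} GB per movie")
```

A 25 Mbps stream and an SDR disc both land around 22 GB for two hours, while a UHD disc at double the rate is around 45 GB, which lines up with the "streaming looks like Blu-ray, not like UHD Blu-ray" impression.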
 
Setting up the Apple TV 4K with your TV can require some doing; my TV, for example, didn't want to display HDR in 'computer mode', and then you have to turn off all the built-in processing garbage on the TV. It took me half an hour of futzing with it.

Apple recommends this cable to pair with the box, and I can attest that a cable can work on an older TV and then fail when connected to a different TV. It's the Belkin HDMI cable rated for 4K, $29.95 on apple.com.
https://www.apple.com/shop/product/HLL52ZM/A/belkin-ultra-high-speed-hdmi-cable-2m

I wouldn't pay more than $10 for an HDMI cable. The only thing you need is an HDMI 2.0 cable, preferably gold plated and reasonably thick.
 
Do you think turning off HDR is a must for non-HDR content?

No, not at all.
Each brand of TV adjusts the backlight brightness and contrast in different ways for HDR content.
So if the picture quality is great for you in HDR mode, then leave it. You're lucky.
 
Yes, the 2016 8000-series 65" seemed to be the luck of the draw when it came to the vendor, banding, light bleed, and so on. The 60" bleeds light like a stuck pig.

Next year will be a good year to purchase a new set. I have been a Samsung fan, but I will be open to an alternative.

It is very confusing.

I was an audio/video snob. One thing I realized: I was so worried about perfection that I forgot how to enjoy watching a movie.

I had all high-end amps, processors, TVs, and speakers.

Now I have an Xbox S and an Apple TV, with a Bose soundbar and sub. I enjoy watching TV again.

You were never really a true audio snob if you downgraded to a Bose. :)
 
You were never really a true audio snob if you downgraded to a Bose. :)

Bose has its niche.
I have one of the first Wave radios; it has to be 20+ years old, and it is still fully functional.

I had a Klipsch Reference 7 7.x setup driven by Parasound Halo amps, multiple SVS Ultra subs, and a few different processors.

Now when I turn on the TV or watch a movie, I am not fumbling around with different things.

Sure, I miss the quality of some good music or other audio, but I sure love the convenience.
 
Well said; there's a vast, vast difference between well made and more expensive. Personally, I'm using the same inexpensive cable I bought donkey's years ago, and I'm fussy.

Digital output either gets to your TV or it doesn't; there is no further quality difference beyond that.
 
From what I've seen of my UHD discs, sometimes the difference between the UHD and the Blu-ray is minimal. HDR makes the most difference in deep colors; otherwise the improvement is sometimes small.
 
Digital output either gets to your TV or it doesn't; there is no further quality difference beyond that.
That is the point. Low-quality cables that do not meet the HDMI 2.0 spec for bandwidth are not able to consistently transfer the digital signal. This results in data loss that can cause artifacts in your image and sound. Even if the old cable works, sound may pop as the stream falls out of sync, and your image can have banding or other visual anomalies.
This is not to say that $85 cables are justified; those are far overpriced. But your original cheap cables from 10 years ago may not be fit for the task with today's equipment.
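To put rough numbers on that, here is a minimal sketch using the commonly quoted nominal HDMI link rates and the standard 4K60 timing (10-bit HDR actually fits by using chroma subsampling, which this glosses over):

```python
# Back-of-the-envelope: TMDS bandwidth needed for 4K60 vs. what common cables carry.
# Uses the standard CTA-861 4K60 timing (4400 x 2250 total pixels incl. blanking);
# figures are nominal, and real-world margins vary.

def tmds_gbps(total_h, total_v, fps, bits_per_component):
    pixel_clock = total_h * total_v * fps            # pixels per second
    # 3 TMDS data channels; 8b/10b coding sends 10 line bits per 8 data bits
    return pixel_clock * 3 * bits_per_component * 10 / 8 / 1e9

needed = tmds_gbps(4400, 2250, 60, 8)                # 4K60, RGB, 8 bits/component
cables = {
    "HDMI 1.4 'High Speed' cable (10.2 Gbps)": 10.2,
    "HDMI 2.0 'Premium High Speed' cable (18 Gbps)": 18.0,
}

print(f"4K60 RGB 8-bit needs ~{needed:.1f} Gbps of TMDS bandwidth")
for cable, capacity in cables.items():
    print(f"{cable}: {'enough' if capacity >= needed else 'not enough'}")
```

An older cable that only reliably carries around 10 Gbps will often still sync at 1080p or 4K30, which is why it can seem fine until you ask it for 4K60 HDR.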
 
If you want to get the exact media info for the files you are streaming, you can do the following.

If you have a Mac updated to High Sierra with Xcode 9, follow the steps on this page under the "Steps to set up Apple TV" area: https://stackoverflow.com/questions/...-9-with-ios-11

Then, once you've finished that, go to the Settings app on the Apple TV, and there will be a new Developer section. In there you can turn on the playback HUD towards the bottom.

Turning this option on in the developer settings will get you information like this displayed on all the screens where a video is playing: https://imgur.com/uXNT0Mm

That is a screenshot of Wonder Woman in 4K from iTunes. These are most definitely true 4K/HDR files, and at a significantly higher bitrate than most other 4K streaming services.
Looks like Apple has some tricks up its sleeve with this iTunes Store streaming.
What I have noticed so far:
1) First of all, I would not be surprised if they use their homebrewed HTTP Live Streaming protocol as the backbone. It definitely seems to scale the picture according to available bandwidth (see the playlist-parsing sketch below).
2) It will stream different content based on the HDR or SDR setting in the Audio & Video settings.
3) It seems to use some unknown codec designations: qdh1 for HDR/DV content, qhvc for SDR content, and qac3 for surround audio. These could designate DRM-wrapped versions of the codecs.
4) HDR content seems to arrive in Dolby Vision format and is locally translated into HDR10 if the display does not support DV (like mine).
I draw these conclusions from my tests with the developer-mode media info display.
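If you want to poke at that adaptive laddering yourself, here is a minimal sketch that lists the variants an HLS master playlist advertises. The URL is a placeholder, not a real iTunes endpoint (those sit behind auth/DRM), so point it at any publicly reachable HLS master playlist:

```python
# Minimal sketch: list the variant streams an HLS master playlist advertises.
import re
import urllib.request

MASTER_URL = "https://example.com/path/to/master.m3u8"  # hypothetical placeholder

with urllib.request.urlopen(MASTER_URL) as resp:
    playlist = resp.read().decode("utf-8")

# Each variant is an #EXT-X-STREAM-INF attribute line followed by its media playlist URI.
for attrs, uri in re.findall(r'#EXT-X-STREAM-INF:(.+)\r?\n(.+)', playlist):
    bandwidth = re.search(r'(?<!-)BANDWIDTH=(\d+)', attrs)   # required by the HLS spec
    resolution = re.search(r'RESOLUTION=([^,\s]+)', attrs)
    codecs = re.search(r'CODECS="([^"]+)"', attrs)
    print(f"{int(bandwidth.group(1)) / 1e6:5.1f} Mbps  "
          f"{resolution.group(1) if resolution else '?':>9}  "
          f"{codecs.group(1) if codecs else '?'}  ->  {uri.strip()}")
```

Each #EXT-X-STREAM-INF entry is one rung of the bitrate ladder; the player picks a rung (and re-picks mid-stream) based on measured throughput, which would explain the different incoming frame sizes and bitrates people are seeing.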

Here is the stream info for a purchased movie when the aTV is in 4K HDR mode:
IMG_2466.jpg
You can see that the incoming video is in Dolby Vision with the qdh1 codec. Its size is 2230x928, which is expanded to the UHD display size (3840x1597, 2.4:1 aspect ratio).

Here is the same stream, now with my aTV in 4K SDR mode:
IMG_2467.jpg
Everything is almost unchanged, except that the video now comes in SDR rendering, with the qhvc codec.
Obviously, video size is downgraded in proportion to network bandwidth. My 55.11 Mbps is less than half of JWort93's 124.95 Mbps, and my incoming frame size is smaller than his. Accordingly, my video bitrate is only about a third of his (9 Mbps vs 25 Mbps).

Just for fun, same data for 2 clips from my local hard drive, served via iTunes Home Sharing.

Regular FHD movie (1080p)
IMG_2469.jpeg
You can see standard codecs - avc1 and ac3 - SDR rendering; the incoming 1920x800 frame is upscaled to the display size of 3842x1607 (2.4:1 aspect ratio).

My own 4K HDR10 file in HEVC:
IMG_2468.jpg
Still standard codecs, hvc1 and aac, with HDR10 rendering; the source frame is unscaled, so the displayed frame has the same size of 3840x2160.
Important to note: since the incoming video range is HDR10 but the outgoing range is SDR (aTV video output was set to 4K SDR), all chroma is lost and my telly displays a black-and-white image!
Switching the aTV video output to 4K HDR yields a proper HDR rendering of the same movie.
 
My 55.11 Mbps is less than half of JWort93's 124.95 Mbps, and my incoming frame size is smaller than his. Accordingly, my video bitrate is only about a third of his (9 Mbps vs 25 Mbps).


A bitrate of 8 to 9 Mbps for HDR is pretty bad.
 
Thanks for posting this info.

FYI, they are not converting DV to HDR10. Rather, there is an HDR base layer in their DV encode which can be rendered as HDR10, or so it seems, based on the fact that Apple uses the 10-bit ST 2084 PQ single-layer profile: "HEVC Dolby Vision (Profile 5)/HDR10 (Main 10 profile) up to 2160p".

https://www.apple.com/apple-tv-4k/specs/

Looks like Apple has some tricks up its sleeve with this iTunes Store streaming. [...]

Just got my ATV 4K. My TV is set up as 4K HDR 60Hz, and I've attached the streaming file info for my purchased 4K HDR10 movies. I'm not an expert, but am I watching my movies in real, true 4K? Thanks.
 
This is kind of an update to an earlier post I made.

The ATV is a massive disappointment. It is just fundamentally broken. The problem with it not being able to automatically switch resolution is even more disastrous than I initially thought. I totally take back everything that I said about early reviewers.

If I have my mode set to Dolby Vision and I run across a movie that's in 4K HDR, it will either hang and not play at all or play that content in Dolby Vision. This tells me the box isn't actually streaming in proper HDR at all, much less 4K, because I swear these streams are 1080p and my TV is just upscaling them. As I've said before, I usually CAN tell the difference.

Thinking about it logically, how can this box pull the appropriate stream from the ATV web servers if it has no way to automatically DETECT what content is actually streaming? Now I know what the catch is to the free 4K/HDR upgrades: it's not delivering the proper streams at all. It's almost as if it's a "placebo" effect.

I look at the same 4K movie on my Shield (which I hate) or my Vudu app on my TV, and it's a night-and-day difference.

I'm returning my ATV 4K and will just use my ATV 4 until this garbage is fixed.

You expected true 4K picture quality? You obviously don’t understand how Apple works. If they provided true 4K, how would they make you buy the next Apple TV?
 
FYI, they are not converting DV to HDR10. Rather, there is an HDR base layer in their DV encode which can be rendered as HDR10, or so it seems, based on the fact that Apple uses the 10-bit ST 2084 PQ single-layer profile: "HEVC Dolby Vision (Profile 5)/HDR10 (Main 10 profile) up to 2160p".
You are right, I guess. First, not every 4K HDR movie in the Store is labeled with DV; some are just HDR, so I assume those are encoded in plain HDR10. Second, an HDR10/DV dual layer seems to be what the BDA uses as well. At least the Despicable Me UHD BD uses the exact same technique for HDR10 compatibility with optional DV enhancement.
You expected true 4K picture quality? You obviously don’t understand how Apple works. If they provided true 4K, how would they make you buy the next Apple TV?
That is a random tangent, imho. How does a new device affect their Store content, or my distance to their closest distribution hub (which is not even theirs but Akamai's)?
The current device already provides true 4K in every way if you feed it true 4K content. Attributing the streaming issues to the device is simply unjust.
If they really do add Atmos support sometime in the future, it will be one helluva device.
 
Thanks, I have learned something.
 
Second, an HDR10/DV dual layer seems to be what the BDA uses as well. At least the Despicable Me UHD BD uses the exact same technique for HDR10 compatibility with optional DV enhancement.

Apple uses a different DV profile than UHD Blu-ray, but both seem to contain a non-DV base layer.
 
Obviously, video size is downgraded in proportion to network bandwidth. My 55.11 Mbps is less than half of JWort93's 124.95 Mbps, and my incoming frame size is smaller than his. Accordingly, my video bitrate is only about a third of his (9 Mbps vs 25 Mbps).

Did you do a speed test on your connection? I don't understand why a 55 Mbps connection is struggling to max out a stream, when my 67 Mbps connection has no problems achieving full bitrates in Netflix. Unless Apple isn't able to cope at certain times of the day, depending on demand? They are worrying results, though, as 9 Mbps is woeful.

Edit - I would be interested to see if the results differ with quick-start disabled.
 
I have no issues at all, and actually think Apple's approach is the best given the inconsistent performance of many displays and TVs when it comes to mode switching.

Even if your ATV 4K is permanently set to HDR, movies mastered in SDR will correctly be displayed, as the HDR spec allows for communication of the Dynamic Range and Mastering infoframe associated with the content to your display/TV.

Think of it this way. HDR is a superset of SDR, so the ATV can probably still correctly communicate SDR luminance levels to the display even when in "HDR" mode. There is not necessarily a conversion to HDR.
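To make that concrete, here is a small sketch of where SDR luminance levels land on the PQ (ST 2084) curve that HDR10 signals use. It is only an illustration of the "superset" point, not a claim about how the ATV's pipeline is implemented; the constants come from the ST 2084 definition:

```python
# Where do SDR luminance levels land inside a 10-bit PQ (ST 2084) signal?
# Constants from the ST 2084 inverse EOTF definition.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Absolute luminance in cd/m^2 -> normalized PQ signal in 0..1."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for label, nits in [("near-black (0.1 nit)", 0.1),
                    ("SDR reference white (100 nits)", 100),
                    ("bright HDR highlight (1000 nits)", 1000)]:
    signal = pq_encode(nits)
    code = round(64 + signal * (940 - 64))   # limited-range 10-bit code value
    print(f"{label}: PQ signal {signal:.3f}, 10-bit code ~{code}")
```

SDR's 0-100 nit range maps into roughly the lower half of the PQ code range, so an HDR container has room to carry SDR levels; whether a given TV then renders those levels at the same brightness it would use in SDR mode is exactly what people are disputing in this thread.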

If Apple yields to your complaints and begins supporting dynamic mode switching, you would be the first to complain, "why is the screen blank for 10 seconds?"
Regarding "the conversion of frame rate from 24Hz to 60Hz", note that most modern TVs actually convert to 120Hz. Either the ATV does it or your TV will do it. I'm fine with the ATV doing it for me.
 
OK, I have done my own testing - quick-start is the culprit here.

With quick-start disabled, the source doesn't produce a Dolby Vision signal, is capped at 1080p, and the bitrate is limited to 4.5 Mbps.
With quick-start enabled, the source *does* produce a Dolby Vision signal at 4K resolution, and the bitrate is 21.65 Mbps.

This is on the same connection, so quick-start is broken big-time.
 
Even if your ATV 4K is permanently set to HDR, movies mastered in SDR will correctly be displayed [...] HDR is a superset of SDR, so the ATV can probably still correctly communicate SDR luminance levels to the display even when in "HDR" mode [...] Regarding "the conversion of frame rate from 24Hz to 60Hz", note that most modern TVs actually convert to 120Hz.

Most of this post is incorrect.

1) Displaying SDR films in HDR mode clearly results in an incorrect image. This is blatantly obvious to see and has been highlighted in a number of reviews.

2) Outputting 1080p films at 2160p uses the Apple TV's scaler, which over-sharpens the image. This results in a poorer-quality image. It's very noticeable, though it may not be on some TVs whose built-in scaler is just as poor.

3) The Apple TV can only output HDR at a maximum of 60Hz, so it's not converting to 120Hz. Even if your TV supports 120Hz, it may not be able to accurately extract a 24fps image from a 60Hz signal. Many TVs will just double the 60Hz signal, and the judder will be very apparent (see the cadence sketch after this list).

4) Why would there be any need for people's screens to go blank for 10 seconds? My £100 Kodi box goes blank for about 1 second when it switches resolution, frame rate, or from SDR to HDR. And besides, nobody complains that their Blu-ray player is broken when it switches to the native resolution and frame rate.
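For anyone wondering why point 3 matters, here is a small sketch of the 3:2 pulldown cadence used to fit 24fps into a 60Hz signal; it only illustrates the timing, not any particular device's processing:

```python
# 3:2 pulldown: fitting 24 film frames into 60 display refreshes per second.
# Frames are alternately held for 3 and 2 refreshes (50 ms, then ~33 ms),
# and that uneven hold time is the judder you notice on slow pans.
REFRESH_HZ = 60
cadence = [3, 2] * 12                 # 24 frames -> 60 refreshes
assert sum(cadence) == REFRESH_HZ

for frame, holds in enumerate(cadence[:6], start=1):
    print(f"film frame {frame}: held for {holds} refreshes "
          f"({holds * 1000 / REFRESH_HZ:.1f} ms)")
```

A 120Hz set that properly detects the cadence can show every film frame for exactly 5 refreshes, which is why native 24Hz output or good cadence detection looks smoother.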
 
Exactly. I'd contend that the scaling isn't *that* big a deal, but anyone arguing that leaving HDR mode enabled has no effect on SDR image quality is deluded. It completely blows out the contrast on my set, destroying white detail. There is a reason why calibrators calibrate SDR and HDR separately and why sets have independent settings for HDR modes: because they are so different!
 