
Penlocky
macrumors member, Original poster
Feb 7, 2017
Hey everyone!

It is known that Netflix 4K support on PCs requires a Kaby Lake processor, and that it only works in the Edge browser on Windows 10.

http://www.theverge.com/2016/11/21/13703152/netflix-4k-pc-windows-support

All in all, this is because Kaby Lake has HDCP 2.2 DRM hardware inside the CPU, while the 2016 MacBook Pros run Skylake, which has no HDCP 2.2.

However, the AMD GPUs in the 15" models have HDCP 2.2, according to AMD.

Any idea whether this means the 15" MBPs might one day get Netflix 4K support without Kaby Lake, or will these machines never be able to because of their Skylake chips? That would make the 2016 15" MacBook Pro obsolete quickly and not future-proof for 4K.

Has anyone tried installing Windows on a 2016 15" MacBook Pro, opening the Edge browser, and streaming Netflix 4K?

I would want to return it if it cannot do 4K with HDCP 2.2 protection.

For people saying "the MacBook Pro 15" screen is not enough for 4K!": I mean plugging the MacBook into an HDCP 2.2-compliant 4K screen.

Thank you very much for answering, everyone!
[Attachment: Screen Shot 2017-02-07 at 11.49.50 PM.png]
The issue isn't HDCP support. Kaby Lake has hardware DRM, which allows the processor to decrypt the DRM'd movie in hardware. This is much more secure than software DRM like FairPlay or older versions of Microsoft PlayReady.

You will need to wait for Kaby Lake if you want to play 4K DRM content.
 
Yeah, Kaby Lake has built-in DRM hardware.
However, the AMD Radeon Pro 460 also has built-in DRM hardware, while Skylake does not. Would that mean 4K DRM content could still be played on the AMD 460 with its built-in DRM hardware?
 
Is it even possible to force GPU rendering of video, from either macOS or Boot Camp Windows?

I tried a 4K UHD sample video; it didn't play at all in macOS. It did, however, play just fine in Windows, although the fans went crazy and the computer got rather hot before reaching the one-minute mark in the video.

I have the 15" 2.9 GHz AMD 460 setup, btw.
 
Yeah, I just got the same 15" AMD 460 setup. We have the same laptop.

Testing the video, I also got lag even with automatic graphics switching turned off. I think you might be correct that it still tries to use the iGPU. Maybe it needs a forcing program like you mentioned. This is tempting me to wait for the Kaby Lake model. =P

Have you tried the Jellyfish videos? http://www.jell.yfish.us
I tried the 120 Mbps 4K UHD HEVC 10-bit Main10 profile video, and the video does not even move with automatic graphics switching off. Could you try that one on Windows and tell me how it goes? I was using VLC Media Player 2.4.4 on macOS Sierra.

Thank you!
With normal H.264 8-bit 1080p video, CPU usage maxes out around 10% and energy impact peaks around 50 in Activity Monitor. With 4K content, CPU usage maxes out around 50% and energy impact around 450.

From YouTube videos testing Kaby Lake vs. Skylake, Kaby Lake processes 4K at under 10% load, like an older processor handling 1080p content. So waiting for Kaby Lake would give a very, very future-proof computer that could last 10 years and still do a fine job playing 4K when it becomes mainstream. The Skylake models, on the other hand, might leave the 2016 MacBook Pro obsolete within five years, in my opinion.

What do you think?
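
For anyone who wants to reproduce those load numbers without eyeballing Activity Monitor, here is a minimal sketch that samples CPU load while a video plays in another window. It assumes the third-party psutil package is installed; the 30-second window is arbitrary.

[CODE]
# Minimal sketch: sample overall CPU load once a second while a video
# plays in another window, then report the average and the peak.
# Assumes the third-party psutil package is installed (pip install psutil).
import psutil

samples = []
for _ in range(30):                      # 30 one-second samples
    samples.append(psutil.cpu_percent(interval=1.0))

print(f"avg CPU:  {sum(samples) / len(samples):.1f}%")
print(f"peak CPU: {max(samples):.1f}%")
[/CODE]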
 
Hey,

I've downloaded and tested that video on MacOS and Windows 10 now.

macOS (+ VLC): no playback, just a still image.
Windows 10 (+ MPC): the video plays just fine, very fluid and with no stutter. The fans started to kick in during the last few seconds, though. Not feasible for a whole 100-minute movie.

Regarding Skylake vs. Kaby Lake: if I had an OK computer today, I would probably wait for KL. I got this one because I needed an upgrade from my MBA.

You're right in saying that KL would be more future-proof for 4K content. So if you plan on viewing a lot of 4K content on your computer, KL would be the right choice, imo.

That being said, I'm happy with this computer and will probably get a new MBP in 3 years. By then I'll have a 4K TV and there will most likely be 4K movies out. So I'm not in dire need of a 4K-capable computer ATM :)
 
Thank you for trying that out! I really appreciate the testing.
That explains why it was unable to play in macOS: the OS tries to play it with the iGPU, even with the AMD GPU on.
The fans kicking in means this is like gaming on the laptop. Agreed, it wouldn't be suitable for watching 4K in bed in five years.
But damn, this computer is attractive!

While you are at it, do you have a Netflix 4K-capable account? If you try to stream Netflix 4K in the Edge browser on Windows, do you get the HDCP 2.2 error?

Was your MacBook Air not performing fine? Or did you want a 15" screen? Like me =P.
 

No worries!

I don't have Netflix 4K, so I unfortunately can't test that. But I'm guessing that the error would show, since it's Skylake.

Haha no, I wanted a Retina screen (muuuuch better than the MBA screen), and wanted to try 15" this time :) And yes, it is an attractive computer haha :)
 
Penlocky said: "Would that mean 4K DRM content could still be played on the AMD 460 with its built-in DRM hardware?"

Technically it could, but currently no, because Netflix 4K on a computer requires Microsoft PlayReady 3.0, which is currently supported by the Edge browser only (no Chrome, no Firefox, etc.).

AMD -> (breakpoint) -> PlayReady 3.0 -> Windows -> Edge -> Netflix 4K = fails at the breakpoint
NVIDIA -> (breakpoint) -> PlayReady 3.0 -> Windows -> Edge -> Netflix 4K = fails at the breakpoint
Kaby Lake -> PlayReady 3.0 -> Windows -> Edge -> Netflix 4K = OK

To date, Microsoft PlayReady 3.0 only supports the Kaby Lake CPU/GPU.
AMD and NVIDIA technically support it, but to make Netflix 4K work with the new AMD or NVIDIA chips, they need to be added to PlayReady 3.0's supported hardware first. This has not happened yet.

Keep in mind PlayReady 3.0 is Microsoft (Win10 + Edge) only, which sucks anyway.
Hell will freeze over before I install and use Windows just for Netflix 4K.
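
To make the "fails at the breakpoint" idea concrete, the chain above can be written out as a toy model. The support tables below are illustrative assumptions for early 2017, not real driver or API data.

[CODE]
# Toy model of the chain described above: Netflix 4K only plays if every
# link supports PlayReady 3.0; playback fails at the first broken link.
# The support tables are illustrative assumptions, not real driver data.
CHAIN = ["decode hardware", "PlayReady 3.0", "Windows 10", "Edge", "Netflix 4K"]

def first_breakpoint(supported):
    """Return the first unsupported link, or None if the whole chain holds."""
    for link in CHAIN:
        if not supported.get(link, False):
            return link
    return None

kaby_lake = {link: True for link in CHAIN}                       # fully enabled
skylake_amd460 = dict(kaby_lake, **{"decode hardware": False})   # not enabled yet

print(first_breakpoint(kaby_lake))       # -> None (plays)
print(first_breakpoint(skylake_amd460))  # -> 'decode hardware' (fails there)
[/CODE]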
 
Yes, you summed it up pretty nicely. Netflix 4K DRM support is pretty much a Windows thing. And NVIDIA as well as AMD have already done their homework regarding PlayReady 3.0, i.e. the drivers already support it. Netflix, however, has not enabled 4K on non-Kaby Lake NVIDIA or AMD configurations yet. It is up to Apple how they implement the DRM on their side, so the Kaby Lake situation on Windows is not necessarily what we will see on the Mac in the future.
 
dovvee said: "I don't have Netflix 4K, so I unfortunately can't test that. But I'm guessing that the error would show, since it's Skylake."
I think it would show the error too!
You can't go back to non-Retina now, I see. 15" is also a great screen size! I had the 13" MBP and have been in dire need of the 15" screen real estate. And I do not want to return it.

Thank you for all the testing! It shows me how impractical it is to use the dGPU for video on a laptop. On a desktop it is fine.
Quoting the PlayReady 3.0 breakdown above: "To date, Microsoft PlayReady 3.0 only supports the Kaby Lake CPU/GPU."
Thank you for your analysis.
So PlayReady 3.0 is another culprit. There is another problem, though: I heard the DisplayPort 1.2 that the MacBook Pro uses does not support HDCP 2.2? So even when Kaby Lake comes out, the computer might not be able to play to a 4K TV, because DisplayPort 1.2 would break the chain?
Quoting the reply above: "It is up to Apple how they implement the DRM on their side."
Probably not until iTunes offers 4K movies for purchase and the Apple TV supports 4K will Apple start to tackle this issue.
 
You are definitely right about the DisplayPort 1.2 limitation breaking the HDCP chain, so even Kaby Lake will not help there. The most that is supported for TB3 devices is two DP 1.2 streams in a row at 4K. There is theoretically the possibility of supporting HDCP via DisplayPort 1.3 and a USB-C bridge; not sure if that will ever be implemented, though. Also, maybe there is a loophole in Thunderbolt 3 that allows pushing everything through to the secondary graphics adapter (AMD 460), which theoretically could push DisplayPort to 1.4. But I rather doubt it.
Either way, not a big loss for me. I have enough 4K devices, and the Mac is a working machine.
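
For what it's worth, the "two DP 1.2 streams over TB3" figure checks out on the back of an envelope. The numbers below are public spec values (HBR2 at 5.4 Gbit/s per lane, 8b/10b line coding, a 40 Gbit/s Thunderbolt 3 link), not anything measured on this machine.

[CODE]
# Back-of-the-envelope check of the "two DP 1.2 streams over TB3" claim.
# Spec values only: DP 1.2 HBR2 runs 4 lanes at 5.4 Gbit/s each with
# 8b/10b coding, and Thunderbolt 3 provides a 40 Gbit/s link.
HBR2_LANE_GBPS = 5.4
LANES = 4
CODING_EFFICIENCY = 0.8          # 8b/10b: 8 payload bits per 10 line bits

dp12_payload = HBR2_LANE_GBPS * LANES * CODING_EFFICIENCY   # 17.28 Gbit/s
tb3_link = 40.0

print(f"one DP 1.2 stream : {dp12_payload:.2f} Gbit/s")
print(f"two DP 1.2 streams: {2 * dp12_payload:.2f} Gbit/s of {tb3_link} Gbit/s TB3")
[/CODE]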
 
@dovvee, when you tested the video on Windows, is there any tool to display GPU load? I do not understand why merely decoding video would make the machine run hot with the fans kicking in. I would expect it to use only around 10% of the AMD GPU's power.
Any theories on this?
Quoting the DisplayPort post above: "Either way, not a big loss for me. I have enough 4K devices, and the Mac is a working machine."
I would love to be as chill as you. I have this feeling that I would regret not waiting for Kaby Lake, because when I watch 1080p video on the 15" screen, it's a little blurry since it is not the native resolution. And if I want to watch 1440p or 4K on battery power, it will be impractical because the AMD GPU will drain the battery too quickly. Does it concern you that 1080p is not the native resolution of the 15"?
 

Dreadful drivers or simply a bad codec; probably the decoding fell back to the CPU. I have a PC rig at home with an NVIDIA 1070 for gaming and serious processing purposes, and decoding a video stream on that rig is a negligible load. Heck, modern GPUs have their own codec unit that is independent from the rest of the GPU, so they can encode/decode game streams on the fly. This is great if you want to play games on your TV while the PC itself is in another room.
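
One way to test that theory without a player in the loop is to decode the sample with ffmpeg directly, once in software and once with a hardware decoder, and compare. A sketch, assuming ffmpeg is on the PATH; the filename is a placeholder for whatever the downloaded jell.yfish.us sample is called locally.

[CODE]
# Sketch: decode the jellyfish sample twice, in software and with a
# hardware decoder, discarding the output, and compare wall-clock times.
# Assumes ffmpeg is on the PATH; the filename below is a placeholder.
import subprocess
import time

SAMPLE = "jellyfish-120-mbps-4k-uhd-hevc-10bit.mkv"

def timed_decode(extra_args):
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-v", "error", *extra_args, "-i", SAMPLE, "-f", "null", "-"],
        check=True,
    )
    return time.time() - start

print("software decode :", timed_decode([]))
# "d3d11va" on Windows; use "videotoolbox" on macOS or "dxva2" on older Windows.
print("hardware decode :", timed_decode(["-hwaccel", "d3d11va"]))
[/CODE]

If the hardware run is dramatically faster and cooler, the GPU's codec unit works and the player simply wasn't using it.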
 
Does the AMD Radeon Pro 460 have hardware-based decoding for HEVC Main10 and VP9? The websites I read stated something like "hardware acceleration", etc. I do not understand what they mean by that. Is it the same as Kaby Lake having a hardware decoder?

Really? You think it's a bad codec and driver? http://www.jell.yfish.us
The website above is where we took the file from. The file is a 120 Mbps HEVC Main10 4K video. A comparison video shows that Kaby Lake takes only 5% CPU power for that video while Skylake takes 60%. So this should mean that if the AMD 460 has a hardware decoder, it should run at under 5% load? Otherwise it's a bad driver. (Since Kaby Lake can decode it smoothly, it shouldn't be a bad codec.)
 
Yes, the Radeon 460 is perfectly capable of HEVC decoding (I assume VP9 as well). As for Kaby Lake, it is basically just Skylake with HEVC support added. So call me only mildly interested, since the Radeon, if triggered correctly, should do that as well.
Probably the Radeon was not triggered in your case and the Skylake CPU took over the task. Have you checked your processor load?
 
It was dovvee who tested on Windows, because currently macOS with VLC does not offload the video to the AMD GPU despite turning automatic graphics switching off. He/she noted the fans kick in at the end of a 30s video. I am waiting for a reply on whether he/she can view the CPU and GPU loads.

[Screenshot: decode-support table from the Radeon Polaris wiki]

[Screenshot: decode-support table for Kaby Lake]

Sorry for asking for clarification, but what is the difference between these two descriptors? Polaris has "decode acceleration" while Kaby Lake has "fixed function decode". I heard somewhere that "accelerated" means it is not native in hardware but software-based? Or am I wrong?

Thank you very much for sticking with me in determining these facts!
 
No, Polaris decodes HEVC perfectly fine on the GPU. "Acceleration" just means that most, if not all, of the work is done on the GPU.
It has been like that for years with H.264, and it is pretty much the same for HEVC.
On the ARM side of things, HEVC has been entirely offloaded to a separate unit for two years now; why should AMD be behind in that area?
If the processor shows a high load, then the reason is simply that the GPU was not used at all, for whatever reason (my guess is the codec).
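
A quick way to see which of those hardware decode paths a machine actually exposes is ffmpeg's built-in listing; a sketch, assuming ffmpeg is installed and on the PATH.

[CODE]
# Print the hardware acceleration methods the local ffmpeg build exposes,
# e.g. dxva2/d3d11va on Windows or videotoolbox on macOS.
# Assumes ffmpeg is on the PATH.
import subprocess

result = subprocess.run(["ffmpeg", "-hwaccels"],
                        capture_output=True, text=True, check=True)
print(result.stdout)
[/CODE]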
 
Okay, I've just spent more time than I should've on this haha.

I now know what the problem was with the 100% CPU and the loud fans: it was MPC not using HW acceleration.
After installing the K-Lite Codec Pack and manually setting HW acceleration to my GPU, everything runs fine.

CPU load was low, GPU load was low (a few tiny spikes though), and the fans didn't kick in.

In short: 4K 10-bit HDR videos run great, as long as HW decoding is set correctly :)

I hope this helps @Penlocky :)
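
As a sanity check before blaming drivers, it can also help to confirm what the player actually has to decode. A sketch using ffprobe (it ships with ffmpeg), with the same placeholder filename as in the earlier example.

[CODE]
# Sketch: confirm what the sample really asks the decoder to handle.
# For the jellyfish file, expect codec_name=hevc, profile=Main 10,
# pix_fmt=yuv420p10le. Assumes ffprobe is on the PATH; placeholder filename.
import subprocess

info = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=codec_name,profile,pix_fmt,width,height",
     "-of", "default=noprint_wrappers=1",
     "jellyfish-120-mbps-4k-uhd-hevc-10bit.mkv"],
    capture_output=True, text=True, check=True,
)
print(info.stdout)
[/CODE]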
 
Quoting the Polaris explanation above: "Polaris decodes HEVC perfectly fine on the GPU."
I see now! dovvee proved your point there. Thank you very much for your input; I now have enough info to decide.
Sounds like ARM has a GPU doing every graphics task for it. It is a bit of an inconvenience that we have to manage the GPU manually to get the desired codec running on it, though.
Wow! I am sorry for asking so much and so deep. Thank you for testing all of this!
It is surprising that we have to manage the GPU manually; otherwise the base software in macOS and Windows mismanages it. And macOS is also harder to manage, because Apple controls everything. (I'm no tech pro; maybe I am wrong.)

It makes me feel the machine is imperfect, and my OCD is going crazy!

Thank you very much @dovvee
 
No problem at all :) I'm just sad that it doesn't work in macOS. We'll just have to use Windows for 4K content, I guess.
 
Yeah, that means that if Apple doesn't update the driver, the GPU will not kick in, and the CPU will drain all that power. And we have no way of controlling HW acceleration ourselves. Kaby Lake =3.
I suspect that once Apple starts rolling out 4K content, it will start releasing driver updates and such.
 
Well, that might take a while, given that basically none of their devices have real 4K capabilities yet. The MacBook Pro does to some degree, but only with an external display, and then there is the HDCP 2.2 issue on those.
 

My OCD is going crazy over those issues. The feeling of wanting a perfect machine that can do it all.
That is true on your side, as DisplayPort won't be 1.4 for a while. Lastly, even 1080p videos might be encoded with H.265/HEVC in the future to save space at better quality.
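
That space-saving idea is easy to try today with ffmpeg, assuming it was built with libx265; "input.mp4" is just a placeholder name.

[CODE]
# Sketch: re-encode an H.264 1080p file to HEVC to save space.
# Assumes ffmpeg was built with libx265; "input.mp4" is a placeholder.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "input.mp4",
     "-c:v", "libx265", "-crf", "28",   # x265 CRF 28 ~ x264 CRF 23 visually
     "-c:a", "copy",                    # pass the audio through untouched
     "output_hevc.mp4"],
    check=True,
)
[/CODE]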
 
Have been following this (and the other Skylake vs. Kaby Lake threads) with interest. Would a fair summary be:
  • The current MBP's discrete GPU can decode/encode HEVC Main10 4K and VP9 fine, but uses more battery than a future Kaby Lake model would.
  • There's no HEVC Main10 4K support in macOS anyway, so the Skylake/Kaby Lake difference is irrelevant unless you dual-boot into Windows.
  • Unless they make other chipset changes, a future Kaby Lake MBP probably won't be able to output protected HEVC Main10 4K to a TV anyway, due to it only having DisplayPort 1.2.
Also, the internal MBP display isn't 4K (and even if it were, most people probably wouldn't see the difference at a typical viewing distance). If all that is right, I'm not seeing the major benefit in waiting for Kaby Lake. I get the future-proofing argument, but honestly, how many people who need an MBP routinely keep one for 5-10 years? I have a 2011 MBP now, but it's just been obsoleted (so no more macOS upgrades, I imagine), and had there been a chassis redesign earlier, I'd likely have already upgraded.
 