But the issue is over hardware support. It could be that Apple chooses not to make use of the GPU, but we won't know until High Sierra actually comes out. Hybrid decoding has been around since Broadwell, maybe? Which is why it says "All Macs" for software decode in that chart. Skylake is more than capable of encoding/decoding 10-bit; it's just that Kaby Lake is more efficient at it, so it will be better for battery life and CPU usage. And from a working perspective, you'll be able to encode stuff just fine on a lot of Macs.
Well, not really. That's not correct, unless you're talking about software decoding/encoding. Skylake has full 8-bit hardware decode, but only hybrid 10-bit decode. However, Apple seems to have chosen not to bother with hybrid decoding on Skylake: it's hardware decoding for 8-bit, or software for 10-bit.
For encoding, Skylake has only partial hardware support for 8-bit, and no hardware encoding at all for 10-bit; that has to be done in software.
Software decoding works fine, but it's very CPU intensive, and if you have to rely on it all the time you'll get loud fans and crappy battery life, as well as lousy multitasking.
In case you're not aware, the slide I posted above is Apple's own, indicating what they do and don't support. It's straight from the horse's mouth: it comes from one of the developer sessions at WWDC.
Real-time playback, I feel, is the concern people are having? Which is fair, but we really need to wait for the release to find that out. Technically there's no reason Skylake would have an issue with it, though, regardless of AMD support.
As mentioned, Apple does not support hardware decode of 10-bit HEVC on Skylake, period. This has already been confirmed by some users running the High Sierra beta. And one shouldn't expect this to change, considering that Apple has already flat out stated as much.
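For anyone who wants to poke at this on their own machine, here's a minimal sketch (assuming macOS 10.13 or later) that asks VideoToolbox whether hardware decode is available for a given codec. Caveat: this call only answers at the codec level, so it can't by itself distinguish 8-bit from 10-bit support.

```swift
// Minimal sketch, assuming macOS 10.13+ with the High Sierra SDK.
// VTIsHardwareDecodeSupported reports codec-level hardware decode support only;
// it does not distinguish 8-bit (Main) from 10-bit (Main 10) profiles, so on a
// Skylake Mac it can return true even though 10-bit HEVC still falls back to software.
import CoreMedia
import VideoToolbox

let hevcHW = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
let h264HW = VTIsHardwareDecodeSupported(kCMVideoCodecType_H264)
print("Hardware HEVC decode: \(hevcHW ? "yes" : "no")")
print("Hardware H.264 decode: \(h264HW ? "yes" : "no")")
```

A fuller check of the 10-bit case would presumably mean actually creating a decompression session against a Main 10 format description and reading back the kVTDecompressionPropertyKey_UsingHardwareAcceleratedVideoDecoder property, which is more involved than this sketch.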
For what it's worth, it doesn't bother me at all, so I'm not trying to defend anything. And as a question: are people really watching 10-bit 4K content on their MacBook Pros, which have 8-bit '3K' screens? If so, it seems a little OTT to me. I suppose you could output it to an external display, but why wouldn't you use a $100 Blu-ray player instead of faffing with cables?
It's a Pro laptop, after all. Why wouldn't you want to be able to import a 4K 10-bit file and scrub through it on your Pro video editing machine, for example?
Plus, if you happen to have a 4K 10-bit file, it's not as if you're going to transcode it to 8-bit 1080p first before you watch it.
BTW, these were the same arguments made when h.264 was first introduced by Apple. Now look at where we are with h.264. If your computer doesn't support h.264 hardware decode, it's a real PITA even just for surfing.
---
So in 2017, if the only thing keeping you from choosing a Kaby Lake machine over a Skylake machine is $180, that's not a very good reason to pass on it, unless you're on a really, really tight budget.