Bad news, it seems Fury lacks HDMI 2.0 after all. DP 1.2a is there, but HDMI is limited to 1.4a according to someone at AMD. That's a step back, I'd say.
How is DP 1.2a a step back? Did previous AMD cards have DP 1.3? No. HDMI 1.4a might be a bit of a step back since some AMD cards supported HDMI 1.4b. I suspect this card does 1.4b, but since "b" just adds 3D 1080p @ 120Hz (which isn't "hot" anymore), 1.4a is about as descriptive. But no AMD card ever had HDMI 2.0, so how can they be going backwards from a point they were never at?
[ from the informative tech specs aspect AMD's website blows chunks. ]
Technically, with the right firmware support, DP 1.2a can support an HDMI 2.0 converter.
"...
Can I connect a DisplayPort 1.3 device to a new television that supports HDMI 2.0?
... Some DisplayPort 1.2a systems will most likely be upgradable through firmware to support this new feature.
..."
http://www.vesa.org/faqs/
The missing nugget is related to copy protection (HDCP v2.2). There are likely OS stack hooks required for that which may still have been foggy as Fury went to design feature freeze. Similarly, it's not surprising that no DP 1.3 is present either.
Hard to believe that after all this time in the works it comes crippled, or at least not up to the latest standards.
It was bad enough that DP was also not ready...
Since these are hardware implementations (hooked into the basic circuit design), you don't get a standard implemented the month after it goes final. You sometimes get things like "pre-n" or "pre-ac" Wi-Fi because the difference there is in the firmware or driver and not hardcoded in the circuits.
The devices downstream have an easier time with HDCP v2.2 because they are basically just consumers. Decoding to throw on the screen is probably easier than encoding and transmitting. The transmit part is where you may lose control of the content. Similar with "pass-through" components which take HDMI in and send it right to HDMI out, like a switch. Not a point of origin.
I could see all options with 4GB, but that would mean Apple going back, at least on the D700 card, from 6GB to 4GB, which I believe is unlikely.
Unless they finally add support for CrossFire over PCIe. Two Nanos would add up to 8GB.
The top end "Nvidia killer" card is going to be the dual model in the standard card market.
But if the HBM design got them better uniformity, better thermal control, and packed into a smaller space... I can see Apple temporarily backsliding on top-end VRAM space. The fact that the core of the GPU is just a package now means the custom part of the board that Apple has to do is smaller. Plug in AMD's package and to a large extent you just have routing to the ports, to PCIe power, and power management to do. The GPU core and memory interconnects you get "for free". (Quite similar to CPU/GPU/northbridge/southbridge fusion with CPU packages. Plug in the package and lots of stuff is just done.)
I think 8-hi is possible with HBM1, but it just isn't affordable versus the mainstream 8GB GDDR5 cards. For "pro video" cards, 'affordable' isn't as much of a constraint. Like the 12-core CPU option that is priced in the "if you have to ask, you can't afford it" range... the 8GB option could be in the same boat. Not sure what AMD is doing in the "FirePro" space with the generation 1 HBM designs. They probably won't appear until early 2016. That is about the time the Mac Pro would see Xeon E5 1600 v4 availability as well.
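For what it's worth, the 8-hi capacity math works out like this (a rough sketch; the 2 Gb-per-die density and the 4-stack interposer layout are my assumptions about the HBM1 design, not something stated above):

```python
# Back-of-the-envelope HBM1 capacity math.
# Assumed figures: 2 Gb per DRAM die, 4 stacks on the interposer.
GBIT_PER_DIE = 2   # HBM1 die density, in gigabits (assumption)
STACKS = 4         # stacks around the GPU on the interposer (assumption)

def vram_gb(dies_per_stack):
    """Total VRAM in GB for a given stack height (4-hi, 8-hi, ...)."""
    return STACKS * dies_per_stack * GBIT_PER_DIE / 8  # 8 bits per byte

print(vram_gb(4))  # 4-hi stacks -> 4.0 GB (the shipping Fury config)
print(vram_gb(8))  # 8-hi stacks -> 8.0 GB (possible, just not cheap)
```

So an 8-hi part doubles capacity without touching the stack count or the bus; the cost is in yielding the taller stacks.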
HBM2 (denser memory chips) will crack the problem, but the timing for that and an updated Mac Pro are way out of sync.