I think it is a bit of a gimmick, but from a business point of view it's completely understandable: one incompatible component breaks the whole system's compatibility with the new OS.


If you want an open platform, it's Linux or Windows, as things look today...
 
Yes, but an official GTX 680 Mac Edition was sold…

Not by Apple; they only shipped mobile versions. The drivers for the 6 series will be around, but that's irrelevant because Kepler never shipped in a cMP. If the 4,1's default specs don't support the new Shader Model and other graphics features, the installer will simply say 'Your system is unsupported and installation cannot continue' (etc.).

And Nvidia will probably give up developing the web driver now that the cMP upgrade market is going to shrink. That's probably one reason there has been no progress on finalised Maxwell optimizations since last August, and why OpenCL bugs have been appearing. Nvidia saw this coming and didn't waste time trying to ship official Maxwell support; otherwise people would be buying 9 series/Quadro M6000 cards for 4,1 systems only to discover Apple is abandoning them anyway. That would have been very unfair.
 
What makes you say that? Cheaper, sure, but NVIDIA hasn't released a direct competitor at that price point yet. It's kind of sad that the 1070 has the same power draw (150W) as the RX 480 but much higher performance.

Much cheaper, and the specs and performance are way better based on the tests. The 4 GB version costs only $199.

 
 
Like I said in my post yesterday, if they drop the graphics driver for those models then by default the system won't display graphics during or after a vanilla installation. So Apple will streamline the Nvidia drivers and block systems that report 4,1 (probably even with the 5,1 upgrade). You would have to hack the installer to get around it and inject kexts if possible. Basically, a hackintosh.
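As a rough illustration of the kind of gate being described: a minimal sketch, assuming the installer simply compares the reported Mac model identifier (e.g. `MacPro5,1`) against a supported list. The list and function names here are hypothetical, not Apple's actual logic or data.

```python
# Hypothetical sketch of an installer compatibility gate keyed on the
# reported Mac model identifier. The supported set below is illustrative.
SUPPORTED_MODELS = {"MacPro5,1", "MacPro6,1", "iMac14,1"}

def can_install(model_identifier: str) -> bool:
    """Return True if the reported model is on the supported list."""
    return model_identifier in SUPPORTED_MODELS

def installer_message(model_identifier: str) -> str:
    if can_install(model_identifier):
        return "Installation can continue."
    return "Your system is unsupported and installation cannot continue."

# Under this model, a flashed 4,1 passes or fails purely on the
# identifier string its firmware reports:
print(installer_message("MacPro4,1"))  # blocked
print(installer_message("MacPro5,1"))  # allowed, since it reports 5,1
```

This is why a 4,1 flashed with 5,1 firmware could sail through such a check: the installer only ever sees the reported string.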

Why do you say so many things that aren't fact, but simply your opinion?

Ex-member MacVidCards has a 4,1 cMP flashed to a 5,1 and has successfully installed macOS Sierra on it.

Lou
 
Right, so NVIDIA will come out with their $200 card that will probably use 100W or less, and will likely be faster than the AMD card. AMD revealed their hand, and now NVIDIA can just make sure their product in that price segment is a clear winner. Meanwhile, they continue to dominate at the high end with the 1080 and 1070. I don't see how this is bad for NVIDIA at all; they clearly have a wide range of Pascal-based products coming out soon. The 1070 is not supposed to be a direct competitor to the RX 480; it's just sad that it uses the same power while delivering nearly double the performance. I know which architecture I'll be going with this generation.

The GTX 680 card works with the stock Apple drivers that shipped with the 2012/2013 laptops and iMacs, so there's no issue with that card not working. The rest of this doesn't even deserve a response, you are entitled to your opinions.
 
The GTX 1060 would have to reach 80% of the GTX 1070's performance to compete with Polaris. The 470 and 460 are the competitors to GP106.

Secondly, Pastrychef posted a slide from the May conference where AMD showed the P10 GPU. It clearly says that with a DUAL-GPU setup you get better framerate AND efficiency than a single GTX 1080. The rumored power consumption of top-end Polaris 10 is already around 90-100W. So no, Nvidia will not dominate in efficiency.
 
AMD claimed it to be, but at the expense of a PCIe slot... Also, based on the link netkas provided, the RX 480's performance will be on par with the GTX 980. At the price point they intend to bring it to market, they will probably have a hit on their hands unless Nvidia does some massive price cutting.

At $200, I'd be willing to take a risk on an RX 480 even with all the failures I've seen from AMD/ATI cards in the past. If I manage to squeeze two years of use out of a $200 card, I think it would be a bargain.
 
An AMD slide said they had better efficiency, so it must be true, right? They can match a 1080 @ 180W with 2 x 480 @ 300W total, so they must have better efficiency, right? Really not sure why you think they can compete on efficiency, NVIDIA has had a much more efficient architecture for several generations now while AMD just picked up some power improvements from the new process. We'll see when GP107 and GP106 get announced and released, but I just don't believe that NVIDIA will lose this generation when they have 150% or higher perf/watt of the AMD architecture.
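The arithmetic behind that efficiency point can be made explicit. A quick sketch using only the figures quoted in this thread (180W GTX 1080, 150W RX 480 board power, 90-100W claimed gaming draw); the "matching performance" assumption comes from AMD's slide, not from any measured test.

```python
# Perf/watt comparison using the numbers quoted in the thread.
# Performance is normalized to 1.0 for the setup AMD claims is equal
# (one GTX 1080 vs. two RX 480s).
perf = 1.0

gtx1080_watts = 180
dual_rx480_board = 2 * 150  # worst case: full rated board power per card

perf_per_watt_1080 = perf / gtx1080_watts
perf_per_watt_dual_480 = perf / dual_rx480_board

# At full board power, the 1080 comes out ~1.67x more efficient.
print(perf_per_watt_1080 / perf_per_watt_dual_480)

# Using the thread's optimistic 100W "actual gaming draw" per card instead,
# the gap narrows to ~1.11x.
dual_rx480_actual = 2 * 100
print(perf_per_watt_1080 / (perf / dual_rx480_actual))
```

So even under the most favorable power figure claimed for Polaris, the dual-480 setup doesn't beat the 1080 on perf/watt if the performance is merely equal; the slide's claim hinges on the dual setup actually delivering a higher framerate.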

Or just wait a month or two for the $200 GPU from NVIDIA? All signs point to them having a vastly superior architecture this generation.
 
The RX 480's 150W TDP is the maximum board power from the connector plus the PCIe slot. The GPU itself does not draw 150W; it draws 90-100W under gaming load. So yes, dual RX 480s might beat a single GTX 1080 on efficiency, price, and performance.
 
What does "2x RX 480" mean? Crossfire? If so, they should never use that in a benchmark table, even if it seems more efficient or whatever. It's totally dishonest. Crossfire is hit and miss at best.
 
Crossfire was the DX11 approach to dual-GPU setups. DX12, and more importantly Metal, will use Split Frame Rendering (it's part of D3D12) for dual-GPU setups: each GPU renders just half of the displayed scene, with the second GPU rendering the missing half.

In short, it allows your application to see both GPUs as one big GPU. It also allows two GPUs from different vendors to be used in a single application without clashing. This is handled on the software side.
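The SFR idea above can be sketched in a few lines. This is only a toy illustration of the work split; real SFR in D3D12/Vulkan/Metal drives separate command queues per adapter, which this deliberately glosses over.

```python
# Toy illustration of Split Frame Rendering: each "GPU" fills half of
# the frame, and the halves are composited into one image.
WIDTH, HEIGHT = 8, 4

def render_half(gpu_id: int, x_start: int, x_end: int) -> list:
    # Each GPU "renders" its assigned columns by writing its own id.
    return [[gpu_id for _ in range(x_start, x_end)] for _ in range(HEIGHT)]

left = render_half(0, 0, WIDTH // 2)        # GPU 0 renders the left half
right = render_half(1, WIDTH // 2, WIDTH)   # GPU 1 renders the right half

# Composite the two halves into the final frame row by row.
frame = [l + r for l, r in zip(left, right)]
for row in frame:
    print(row)
```

Because each GPU owns a fixed region of every frame, both work on the same frame at once, which is what distinguishes SFR from the round-robin (alternate frame) approach Crossfire used.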

About the benchmark. Yes, it was a bit dishonest.
 
True, I've also read that DX12 will split the frame across both GPUs instead of using the traditional round-robin. However, round-robin or SFR, will it be the same hit-and-miss experience in terms of per-application support as Crossfire was? Because if that's the case, I had a really awful experience with Crossfire on the nMP. It was nearly useless, in all honesty. So, given its very limited real-life usage and its failure ratio, I'd never include it in any benchmark, even if that benchmark manages to utilize it 100%.

FWIW, they could just put 10x RX480 against 5x 1080s and widen the performance/price difference even more. It would be almost equally unrealistic.
 
SFR is hardware agnostic; any application that uses D3D12 or Vulkan can use this feature.
 