Because it is an AMD-sponsored game, silly. It was made first and foremost for the consoles, which are AMD-optimized, and then ported to PC. And besides marketing blurbs, you aren't really bringing anything to any GPU discussion. You are trying to pass yourself off as an insider, but you never achieve anything more than displaying silly fanboyism.

If I displease you, you're free to put me on ignore. But keep in mind that I'll still contradict your propaganda.
Quantum Break is not an AMD-sponsored game. Nvidia cannot do wrong, cannot create bad hardware.
Why not ask Nvidia why they did not create better hardware for DX12? Why not ask Nvidia why the compute power of their GPUs is lacking so much? Why not ask Nvidia why their architecture is so bandwidth-starved that you will sooner or later be bottlenecked?

There is a slight difference, my friend, between our behavior. I have always wanted to understand what is going on with GPUs and their technicalities. You have always wanted to confirm your own expectations about both brands.

That is the very reason why you attack me rather than the message I bring. That is why you ignore every single link that explains why GPUs currently behave the way they do, dismissing them as written by anonymous people. You do not look for knowledge, but for confirmation of your biases. Try to disprove my arguments by bringing counter-arguments, not by downplaying them, because that only shows you are unable to discuss in a civilized manner.

If I wanted to confirm my expectations, I would do the same thing as you. But I don't. The funniest part is that in my analysis of the Pascal architecture in the Mac Pro 2016 thread I made an error. None of you who were in that thread spotted it or corrected it. Why? Because it confirmed your expectations. Nothing else. I spotted the error myself, and I understand it. I had fallen once again into Nvidia's Reality Distortion Field.

P.S. You did not contradict anyone's propaganda. You are only able to downplay my arguments, which are valid. If they are not, counter them, with an understanding of both architectures. Or maybe you can't, because everything I have posted was true?

Why aren't TITAN results on there? A 6GB 980 is certainly not Nvidia's best hardware.
That GTX 980 is not a GTX 980, but a GTX 980 Ti ;). It is a cut-down version of Nvidia's GM200, the chip the Titan X is based on. The 980 Ti has 2816 CUDA cores. The GTX 980 does not come with 6 GB of VRAM because it has only a 256-bit memory bus, which is why it is only available in 4 GB and 8 GB configurations. The GTX 980 also has 2048 CUDA cores, fewer than the GTX 980 Ti's ;).

I don't know why PCLab did not test the Titan X. Maybe they did not have it available at the time of testing?
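To put rough numbers on the bus-width point: each GDDR5 chip sits on a 32-bit channel, so the bus width fixes how many chips a card carries, and the common 0.5 GB and 1 GB chip densities then fix the possible VRAM configurations. A quick illustrative sketch (the chip-density assumption is mine, not from the post):

```c
/* Back-of-the-envelope sketch: bus width / 32 gives the number of GDDR5
 * chips (one chip per 32-bit channel); 0.5 GB or 1 GB per chip gives the
 * possible VRAM configurations. */
#include <stdio.h>

static void vram_configs(const char *gpu, int bus_width_bits) {
    int chips = bus_width_bits / 32;           /* one chip per 32-bit channel */
    printf("%s: %d-bit bus -> %d chips -> %d GB or %d GB VRAM\n",
           gpu, bus_width_bits, chips, chips / 2, chips);  /* 0.5 GB vs 1 GB chips */
}

int main(void) {
    vram_configs("GTX 980 (GM204)",    256);   /* -> 4 GB or 8 GB  */
    vram_configs("GTX 980 Ti (GM200)", 384);   /* -> 6 GB or 12 GB */
    return 0;
}
```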
 

Have you given any thought about consulting for your paranoia problem?

You're an AMD fanboy at best or an undeclared paid shill at worst. We get it, believe me, we do get it.
Your sources are, most of the time, blog posts or forum posts from AnandTech or other tech rumor mills. This is the main reason why I'm ignoring them. When they're not forum or blog posts, they're marketing blurbs from AMD...

Find a new hobby.
 
Then I have permission to completely ignore your point of view about them, because you refuse to read them, acknowledge them, or even understand them. Anyway, I have been proven right. Again. Thank you, tuxon.
 
I am thinking it is because AMD is ready for DirectX 12, which is almost the same thing as Metal on Apple's side.
Recently I tested Ashes of the Singularity, the first fundamentally DirectX 12 game, with my M395X. I found that my card gains around 15% under DirectX 12, which puts it almost 80% ahead of the GTX 970M in frame rate and about 10% below the 980M. We will likely see more performance gains under DirectX 12 from later driver updates. Therefore I don't think Apple will shift to Nvidia in the coming update.

AMD has had stronger hardware than Nvidia for some time now, especially in terms of GPU compute.

Apple knows the future lies with GPGPU work, and this is why they've gone with AMD so much recently. CUDA is a proprietary platform and limits their options. So while, yes, people with CUDA apps might not like it right now, OpenCL, as a vendor-agnostic API, is the way forward, together with the best hardware to run it (currently AMD; Intel GPUs can run it too, and as an open standard that may change).
 
More important in this context is HSA and the way Intel, Imagination, and AMD handle it. Nvidia handles it through CUDA and software. That is called software scheduling (static scheduling), and it puts load on the CPU, because it is the CPU that dispatches commands. Intel iGPUs do not need a hardware scheduler, because they have one integrated on the same die: the CPU itself. That is a slightly different situation from Nvidia's. AMD, starting with Fiji, has a hardware scheduler on its GPUs. It offloads the CPU and will, in the future, allow tasks to be executed completely out of order, although that is not possible with the current iterations of the GPUs.
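For what it's worth, here is a minimal sketch of how an application can express that kind of independent work to a compute API, using OpenCL's out-of-order queue flag. This is only my illustration: whether commands actually overlap or get reordered depends on the driver and on whether the GPU has a hardware scheduler, which is exactly the point above.

```c
/* Minimal, illustrative sketch: an out-of-order OpenCL queue plus events is
 * how an application *expresses* independent work; whether it actually runs
 * concurrently depends on the driver and the GPU's scheduler. */
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id plat;  cl_device_id dev;  cl_int err;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);

    /* Out-of-order queue: commands may be reordered unless linked by events. */
    cl_command_queue q = clCreateCommandQueue(
        ctx, dev, CL_QUEUE_OUT_OF_ORDER_EXEC_MODE_ENABLE, &err);

    /* Independent kernels would be enqueued here with no event dependencies,
     * leaving the runtime and hardware free to overlap them. */

    clReleaseCommandQueue(q);
    clReleaseContext(ctx);
    return 0;
}
```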
 
I think the more important migration is the migration from a proprietary API (CUDA) to OpenCL (AMD instead of NV is a side-effect).

That's the change. The fact that AMD hardware is being used at the moment is a side-effect of it running OpenCL better.

Apple want the flexibility to move, and CUDA is not flexible.

In the future, OpenCL allows them to run the same code on Intel, AMD, ARM, Nvidia, or whatever other hardware comes out that supports OpenCL.
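That portability argument is easy to see in OpenCL host code: the same program simply enumerates whatever platforms and devices the system exposes. A rough sketch of my own (not anyone's shipping code):

```c
/* Rough sketch: the same OpenCL host code enumerates whatever devices are
 * present -- AMD, Intel, Nvidia, or anything else with an OpenCL driver --
 * which is the portability argument being made above. */
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif
#include <stdio.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint nplat = 0;
    clGetPlatformIDs(8, platforms, &nplat);

    for (cl_uint p = 0; p < nplat; p++) {
        cl_device_id devs[16];
        cl_uint ndev = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 16, devs, &ndev);
        for (cl_uint d = 0; d < ndev; d++) {
            char name[256], vendor[256];
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME,   sizeof name,   name,   NULL);
            clGetDeviceInfo(devs[d], CL_DEVICE_VENDOR, sizeof vendor, vendor, NULL);
            printf("%s: %s\n", vendor, name);  /* same binary, any vendor */
        }
    }
    return 0;
}
```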
 
Not exactly. Compute APIs expose compute performance to applications. That's why comparing a 6.1 TFLOPs GPU (R9 390X) to a 4.2 TFLOPs GPU (GTX 980) is absolutely pointless from this point of view. It has nothing to do with which architecture is better suited to a particular API, but with how much compute power the hardware has.

It is the very reason why we see in DX12 that the 6.1 TFLOPs GPU (R9 390X) ties with the 6.1 TFLOPs GPU (GTX 980 Ti).

But overall you are correct ;). We still have to remember the HSA initiative and the part OpenCL 2.1 plays in it. We can mock Apple for being proprietary, but having the whole platform proprietary is a better solution than having one part of it that the platform owner has no control over. And Metal offers the possibility of being hardware-agnostic, so the best hardware will always win, regardless of whether it is Nvidia, AMD, Intel, Imagination, or... Apple themselves.
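For anyone wanting to sanity-check those TFLOPs figures: theoretical FP32 throughput is roughly shader cores × clock × 2 (one fused multiply-add per core per cycle). The clocks below are approximate boost clocks I picked myself, so the results land near, not exactly on, the numbers quoted above:

```c
/* Rough theoretical FP32 throughput: shader cores x clock (GHz) x 2 FLOPs
 * per cycle (one fused multiply-add). Clocks are approximate boost clocks
 * assumed for illustration, so results are ballpark figures only. */
#include <stdio.h>

static double tflops(int cores, double clock_ghz) {
    return cores * clock_ghz * 2.0 / 1000.0;   /* GFLOPs -> TFLOPs */
}

int main(void) {
    printf("R9 390X   : ~%.1f TFLOPs\n", tflops(2816, 1.05));  /* ~5.9 */
    printf("GTX 980   : ~%.1f TFLOPs\n", tflops(2048, 1.12));  /* ~4.6 */
    printf("GTX 980 Ti: ~%.1f TFLOPs\n", tflops(2816, 1.08));  /* ~6.1 */
    return 0;
}
```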
 

Nvidia has been on a roll promoting less flexible standards; the following are, or were at some point, examples: G-Sync, CUDA, Nvidia's Linux drivers, GameWorks, PhysX, etc. Apple did not jump on any of these bandwagons. At some point Apple also decided not to adopt Optimus. Since Apple develops its own ARM SoCs, it also does not turn to Nvidia for its Tegra line of products. Perhaps the relationship between Nvidia and Apple has been souring for a long time.
 
Apple isn't actually that proprietary. Sure, the OS is proprietary, but most of their core APIs are not, the storage generally is not, etc.

Well, I dunno if the relationship has soured; Nvidia just aren't making products in line with Apple's overall product strategy.

If NV hardware performed better on the open-standard software Apple wanted to promote, I'm sure they'd consider running it.

Also... I think Apple has been deliberately pushing AMD for the past few years in order to kill CUDA in favour of OpenCL.

Yes, I know some people have and love CUDA apps, but for the long-term strategy (OpenCL succeeds, all apps support it), CUDA on Apple needs to die. If they supplied NV hardware, no one would bother porting their apps; the software would remain proprietary and locked to Nvidia hardware, putting Apple at Nvidia's mercy.
 

Nvidia's public behavior is also not quite aligned with the public image that Apple wants to project. Nvidia behaves as if it wants to win at any cost, while Apple has a strong backbone and will not be forced into being reactionary. I recall that recently Apple cut off a dozen or so suppliers over conflict minerals after those suppliers would not or could not show that their profits were not funding militarized violence. Apple has also cut off suppliers over child labor practices and the like.
 
Why aren't TITAN results on there? A 6GB 980 is certainly not Nvidia's best hardware.

Nvidia's comparison chart doesn't show that much difference between them, and reviews have been mixed on which one is the better card. 12 GB of VRAM vs. 6 GB of VRAM doesn't determine real-world performance, as I don't know of any games that would actually use 6 GB of VRAM (there may be some). The only time you'd actually get close to using 12 GB is if you are doing 3D rendering in a CAD program.
 
The Titans are an odd case. They don't target the high-end professional CAD/3D/scientific market, which is more of a Quadro thing, and they're priced out of reach for the vast majority of gamers. I look at them as semi-pro GPUs for CAD/3D/scientific students, hobbyists, or mom-and-pop shops that don't need a Quadro but do need more than what a GTX 980 Ti can give them.
 
5K is... kinda pointless in a world where we haven't even got 4K gaming and video streaming to work yet. I could get a £600 5K monitor for my £700(ish) PC today if I wanted (Dell UP2715K), but what is the point? Why bother? Apple's 5K thing is nothing more than a gimmick in a world that isn't even fully prepared for 4K.

So let me ask you a question: where is all this 5K content hiding that makes a 5K iMac so damn desirable? And other than displaying stuff at 5K resolution, can the iMac play Fallout 4 at 5K or make some heavy edits to a 5K video file? I already know the answer is no.

You have an iPhone to consume content. There's also an iPad Mini, iPad Air, and iPad Pro to consume more content. You think the 5K iMac was built to consume more content and for people to game on?

People use their iMacs for actual work where a 5K monitor boosts productivity.

Link me to a £600 Dell 5K monitor. Oh, by the way, the Dell isn't even SST.

EDIT: I see that you linked to the UK site. Unfortunately, most of us can't get it at that price.

If I'm unable to ever understand it (because I play games, some great logic there), why bother explaining it to me? Even though I have acknowledged above that it is useful for photos. Or did I? Because according to you, I'll never understand it, since all I do with my life is play video games. :rolleyes:

Was going to leave it there, but sod it, I'll bite. Instead of buying a £600 5K monitor, let's just say I've spent £2K on a 5K iMac. It can do huge photos. Great. Now what else have you got for me? Because £2K is a lot of cash to spend on an i5 machine with mediocre graphics.

I already know the answer, and it's nothing. No 5K games, no 5K videos, no real 5K content. And by the time 5K or 4K becomes the norm, the hardware in that iMac will no doubt be dated and unable to handle it. Other than photos, 5K is a gimmick. A gimmick that Apple is pricing high and running on hardware that is okay at best.

But hell, all I do is play video games according to you, so what the hell do I know?

It's clear that you consume content, rather than produce it. That's why 5K doesn't make sense to you. I'll never understand why gamers write off Macs as pointless when Apple has never produced a desktop gaming machine. Build a gaming Windows machine instead.
 
It's clear that you consume content, rather than produce it. That's why 5K doesn't make sense to you.
Herp derp complete nonsense. I write books and aid in the coding of a few indie games. Also, it is clear you haven't read all my posts on the topic here. 5K, currently, is a gimmick, useful for nothing but photos at the present time. I'd love to see you attempt to make a 5K movie or make a 5K game on the iMac. With its underpowered hardware, it'd crumble.

Tell us all, what 5K content are you producing? And don't say photos.
 
I thought the point was that you can edit 4K video at 100% pixel-for-pixel while still being able to have room for the editing tool windows. Not to mention the fact that "retina" is simply better looking and easier on the eyes.

Apple offering 5K for the same price as the old iMacs is pretty amazing, IMO.
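The pixel-for-pixel point is simple arithmetic; a small sketch using the standard panel and frame sizes:

```c
/* Simple arithmetic behind the "edit 4K at 100% with room for tools" point:
 * a 5K panel (5120x2880) minus a 4K UHD frame (3840x2160) leaves a border
 * for palettes and timelines. Panel/frame sizes are the standard ones. */
#include <stdio.h>

int main(void) {
    int panel_w = 5120, panel_h = 2880;   /* 5K Retina iMac panel */
    int video_w = 3840, video_h = 2160;   /* 4K UHD footage       */
    printf("Leftover: %d px wide, %d px tall for tool windows\n",
           panel_w - video_w, panel_h - video_h);   /* 1280 x 720 */
    return 0;
}
```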
 
Apple offering 5K for the same price as the old iMacs is pretty amazing, IMO.
Indeed, and in fact a 5K Dell monitor costs as much as a complete iMac. I may not need 5K (or even 4K), but given what the 27" iMac provides, it's a great buy IMO.
 
On the topic of Nvidia: I don't know how a consumer/professional/prosumer/whatever can support Nvidia's proprietary standards. They're anti-competitive and just leave you at the mercy of Nvidia.
 

Derp derp, it seems like you game more than you do actual work. No one is creating 5K content, lol. They use the extra screen real estate when creating 4K content. Derp derp.
 
AMD is inferior and cheaper.........

AMD has the more advanced process and better power requirements/heat than NV.

Also, hitching your wagon too firmly to just one maker puts you at their mercy. Look at how CPU prices plummeted when AMD became competitive with Intel.

Both Nvidia and ATI/AMD have released crap. But for some reason, no one calls Nvidia on it until its replacement arrives.

AMD is fine for what the Mac is designed for, which isn't the latest FPS game.
 
There have been AMD "chips" in Apple computers for over 20 years (probably 30 if I researched it instead of just relying on what I've seen in person).

Let's not be surprised that not much has changed.
 
AMD is inferior and cheaper.........

AMD hardware is actually superior in many respects. NV hardware is better at DX11 and runs CUDA. DX11 is irrelevant moving forward, and CUDA is a proprietary closed standard that Apple do not want.

And before you go comparing the recent RX 480 to Titans and GTX 1080s: it is not intended to be a high-end card. In raw compute performance it slaughters the 1060 at a lower cost.
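Using the same rough cores × clock × 2 estimate as earlier in the thread (the boost clocks are approximate figures I assumed, not from the post):

```c
/* Rough FP32 compute estimate: cores x boost clock (GHz) x 2 FLOPs per cycle.
 * Clocks are approximate reference boost clocks, chosen for illustration. */
#include <stdio.h>

int main(void) {
    double rx480   = 2304 * 1.266 * 2.0 / 1000.0;  /* ~5.8 TFLOPs */
    double gtx1060 = 1280 * 1.708 * 2.0 / 1000.0;  /* ~4.4 TFLOPs */
    printf("RX 480  : ~%.1f TFLOPs\nGTX 1060: ~%.1f TFLOPs\n", rx480, gtx1060);
    return 0;
}
```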
 

Even before AMD bought ATI, Apple had favored ATI hardware over 3dfx and Nvidia. Apple started using the 3D Rage II and Rage Pro cards in the G3 Power Macs and the 20th Anniversary Mac. There were a couple of earlier models, like the 9600, that also used Twin Turbo cards, but that's about it. I think the first Nvidia board Apple offered was the GeForce 2 MX, and in many cases when you could get an Nvidia board, it was a BTO upgrade rather than standard (unless it was in the iMac or the like).
 