Have you tried FCPX with the latest NVIDIA web drivers? I saw a pretty massive improvement with those. Yes, NVIDIA still struggles with apps like LuxMark, but that's because the ray-tracing algorithm was tuned for the AMD architecture.

http://barefeats.com/gtx980ti.html

That shows a massive improvement in OpenCL performance, at least in the OceanWave benchmark.

Those still aren't stellar numbers compared to AMD, even against the downclocked Mac Pro cards. Nvidia has gotten better; I'm not sure it's good enough. The 980 looks similar to the D500; I haven't looked at D700 numbers, and those are old cards by now.
 
  • Like
Reactions: Tucom
I am also very grateful to Nvidia for the web drivers and for supporting cards not available in actual Macs.
 
Content creators want the latest and greatest Quadros for CUDA and the Mercury engine,
with solid application-specific drivers and support, not two-generation-old AMD leftovers
stuffed in goofy trash cans that get updated slower than Apple Cinema Displays.
 
  • Like
Reactions: tuxon86
Lol, wouldn't it be quite the coup if Apple buys Autodesk and makes it OSX-only overnight?
I would think that it wouldn't be a "coup" to cut off a big chunk of your customer base with no upgrade path. Sounds like a great way to "Osborne" your market share.

If Apple had competitive workstations, it might not be a huge negative - but the MP6,1 is an expensive "weird puppy" compared to most professional systems.
 
  • Like
Reactions: tuxon86
Lol, wouldn't it be quite the coup if Apple buys Autodesk and makes it OSX-only overnight?
heh.. i'd be more worried about autodesk trying to buy osx




Content creators want the latest and greatest Quadros for CUDA and the Mercury engine,
with solid application-specific drivers and support, not two-generation-old AMD leftovers
stuffed in goofy trash cans that get updated slower than Apple Cinema Displays.


do you have a citation for that? or even some anecdotal evidence?

http://www.nvidia.com/object/cuda-in-action.html




 
Content creators want the latest and greatest Quadros for CUDA and the Mercury engine,
with solid application-specific drivers and support, not two-generation-old AMD leftovers
stuffed in goofy trash cans that get updated slower than Apple Cinema Displays.

I want decently priced workstation graphics cards without having to rent software applications. So I get the best of both worlds.
 
do you have a citation for that? or even some anecdotal evidence?


Why, oh why, oh why, would anyone try to present rational evidence to you?

Just a few nights ago you claimed that Tamper Resistant Security Torx Screws were (I'm not making this up, an actual word for word quote of yours) "the most user friendly standard screws available."

So, rather than admit "whoops, I was wrong" you went careening off into lala land and tried to show that since one could buy Tamper Resistant Security Torx drivers on Amazon, they weren't really meant to keep users out. So, by your "logic", the door locks on a car are also not meant to be considered impediments to entry, since you can buy a Slim Jim on Amazon as well.

http://www.amazon.com/Lockout-Opene...=1444891920&sr=8-1&keywords=slim+jim+car+door

The point here being that no matter what evidence might be brought forth to prove that computer users want powerful GPUs that work well, you would refuse to believe it and deny its value, since you have already formed an opinion and closed your mind to anything not in line with your pre-judged conclusion.

So you can't be surprised if people don't bother to offer up real evidence; you have made it more than clear that you won't abide by the rules of logical discussion and polite debate.

When the 2014 Mac Mini came out many people were furious. Why? Because Apple had changed the bottom of the machine in a peculiar way. From the outside it looked the same, but they had removed the "twist to open" bottom that allowed past RAM and HD upgrades, and replaced it with a metal plate held in place by, you guessed it, "the most user friendly standard screws available." (Tamper Resistant Security Torx Screws)

Odd they were so upset about it. Perhaps you should explain their error to them? And Apple won. There are several people in that forum advising against the trouble and bother of upgrading the HD or adding an SSD. They just give up and connect an external USB drive, losing TRIM support and LOADS of speed, and adding clutter to their desk. All because they are afraid of "the most user friendly standard screws available."

If you want people to try and have actual constructive debates with you, you have to be able to admit when you are in error and keep your mind open.
 
Last edited:
  • Like
Reactions: avkdm and tuxon86
Koyoot had a clear and concise explanation for why Apple didn't go with nVidia in the now-closed "Will there be a new MacPro in 2015" thread.

https://forums.macrumors.com/thread...ac-pro-in-2015.1840458/page-120#post-22069344
https://forums.macrumors.com/thread...ac-pro-in-2015.1840458/page-120#post-22069894
https://forums.macrumors.com/thread...ac-pro-in-2015.1840458/page-120#post-22070547

I can't vouch for the veracity of it all, but it sounds plausible enough to my layman's ears.
Is he saying that AMD is great?
 
Is he saying that AMD is great?

Asynchronous compute is the best thing since sliced bread. It will solve global warming, world hunger, etc.

He's a big fan of AMD. Good thing, though; they could use one or two. While their cards are slower and generate loads more heat, they are more expensive and keep several old-school coal-fired power plants running.
 
Asynchronous compute is the best thing since sliced bread. It will solve global warming, world hunger, etc.

He's a big fan of AMD. Good thing, though; they could use one or two. While their cards are slower and generate loads more heat, they are more expensive and keep several old-school coal-fired power plants running.
Since Nvidia is no longer useful according to "Apple's standard," will you be moving to PC or remain with the cMP?
 
AMD's days at Apple are numbered.

Those 105°C iMacs will continue to die early deaths; AppleCare costs and end-user reports on forums will be ugly. Hopefully Apple has the good sense to use the best product for their design goals. (Nvidia, obviously)

They gave up on CUDA; now they are quietly distancing themselves from AMD's only remaining advantage, OpenCL.

Apple will either accept where the professional workspace is, or leave it entirely. Perhaps their goal has been to become the darlings of Wedding Videographer/Caterers everywhere. They can print menus and brochures in Apple software of some sort, shoot on their iPhone, and then edit the Wondrous Event in FCPX/iMovie.

Why shoot for the top when there are so many more paying customers in the middle?
 
  • Like
Reactions: stevekr
Is he saying that AMD is great?
What I am saying is that I would rather have hardware with, well... hardware support for features, not only software support. Everything that benefits the Nvidia architecture comes from software of their own making: CUDA, an HSA layer that will be built through CUDA, and game and application performance that comes from the driver work. Nvidia made an architecture that is simple to understand and program for, but by itself the silicon is not very capable.

AMD is on the other side. Asynchronous compute - hardware support. HSA - hardware support. Virtualization of applications - hardware level. The thing is: software is not using all of these capabilities. Yet. The future is long, however, and low-level APIs have changed a lot in developers' minds.

P.S. If I am a fan of any brand - yes, I am an Nvidia fan. I've always had an Nvidia GPU in my computer. But as you can see, I am also a fan of engineering and of the capabilities of electronic hardware, even when they are not yet used to 100%. And that is what I appreciate much more than any brand in the world.
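
To make "asynchronous compute - hardware support" concrete: at the API level it just means feeding the GPU from more than one command queue and letting independent work overlap. Here is a minimal OpenCL sketch of the idea (my own illustration, assuming an OpenCL SDK is installed; kernelA/kernelB are placeholders, and error checks are omitted). Whether the two queues actually run concurrently is up to the hardware scheduler - GCN chips have dedicated Asynchronous Compute Engines for exactly this, while other architectures may simply serialize them:

[CODE]
/* Sketch: two independent OpenCL command queues on one GPU.
 * Nothing orders q1 against q2, so a driver/hardware combo with
 * async compute MAY execute both streams of work at once instead
 * of back to back. (kernelA/kernelB are hypothetical placeholders.) */
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);

    /* Two independent command queues on the same device. */
    cl_command_queue q1 = clCreateCommandQueue(ctx, dev, 0, NULL);
    cl_command_queue q2 = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* ... build a program, then:
     *   clEnqueueNDRangeKernel(q1, kernelA, ...);  // one stream of work
     *   clEnqueueNDRangeKernel(q2, kernelB, ...);  // independent work
     * With no events linking the queues, overlap is allowed. */

    clFinish(q1);   /* wait for both streams to drain */
    clFinish(q2);
    clReleaseCommandQueue(q1);
    clReleaseCommandQueue(q2);
    clReleaseContext(ctx);
    return 0;
}
[/CODE]

If an app like FCPX keeps several independent streams of work in flight like this, it would go some way toward explaining why the Tahiti cards punch above their weight there; that is my reading of the claim, not something measured here.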
 
What I am saying is that I would rather have hardware with, well... hardware support for features, not only software support. Everything that benefits the Nvidia architecture comes from software of their own making: CUDA, an HSA layer that will be built through CUDA, and game and application performance that comes from the driver work. Nvidia made an architecture that is simple to understand and program for, but by itself the silicon is not very capable.

AMD is on the other side. Asynchronous compute - hardware support. HSA - hardware support. Virtualization of applications - hardware level. The thing is: software is not using all of these capabilities. Yet. The future is long, however, and low-level APIs have changed a lot in developers' minds.

P.S. If I am a fan of any brand - yes, I am an Nvidia fan. I've always had an Nvidia GPU in my computer. But as you can see, I am also a fan of engineering and of the capabilities of electronic hardware, even when they are not yet used to 100%. And that is what I appreciate much more than any brand in the world.


RIGHT NOW, as in today, October 15, 2015, Nvidia GPUs are an order of magnitude better than the AMD chips used in current Apple products. Especially for the things that matter in skinny (artificially compacted for style considerations) and/or battery-powered computers. This encompasses the entire Apple line.

So, they are instead offering the inferior/multiple-generations-old/hot-running/power-sucking GPUs that AMD couldn't give away to a low-rent PC builder. Slap a name on them that nobody recognizes and you can charge 5x what they bring as PC cards. The nMP's "D700s" sound much more powerful than "7970" or its second name, "R9 280X". Yep, the third renaming really did the trick.

I noticed last night that my nMP can't just run standard AMD Windows drivers. If you try, they usually won't install. Instead, there are "special" Windows drivers, available from Apple via Boot Camp or from AMD. (Looks like they're on the 2nd version now... woot!) These drivers give the wink to Apple and call device ID "6798" a "FirePro D700", while all the standard AMD or FirePro drivers call that device ID a "Radeon 7970".

So to maintain the image of having a FirePro, they have to hack their own drivers. With the cMP, you just used standard drivers in Windows; it didn't require a "wink & nod" driver to call a Quadro a Quadro.

And all of this "AMD drivers and cards enable all these great features coming next year" is remarkably similar to a Ponzi Scheme.

Apple started choosing AMD a couple of years ago. We still haven't seen any reason other than "cheaper". When Godot brings us those AMD driver improvements that make all these years of suffering worthwhile, I'll be interested.
 
And those old AMD GPUs from the Mac Pro are much faster in Final Cut Pro X than Nvidia GPUs from the same segment.

If Grenada were in the Mac Pro, it would wipe out the Titan X in that same application, purely thanks to the asynchronous compute that FCPX uses.
 
And those old AMD GPUs from the Mac Pro are much faster in Final Cut Pro X than Nvidia GPUs from the same segment.

If Grenada were in the Mac Pro, it would wipe out the Titan X in that same application, purely thanks to the asynchronous compute that FCPX uses.


"Same Segment"?

Does that mean the old GTX 680 from when the nMP was announced? See, that's the beauty: those old cMPs aren't goofing around with old 680s or any of the tired old GPUs of the nMP era. They have moved past that old, tired architecture.
 
I noticed last night that my nMP can't just run standard AMD Windows drivers. If you do, they usually won't install. Instead, they have "Special" windows drivers, available from Apple in Bootcamp or from AMD.

Typically, all workstation graphics cards use specialized drivers (FirePro, Quadro). That's one of the main differences from the PC desktop versions.

These drivers give the wink to Apple and call device id "6798" a "Fire Pro D700" since all the standard AMD or FirePro drivers call that device id a "Radeon 7970".

Makes sense they would use a different device identifier. The form factor & connector are different, with no fan installed.

So to maintain the image of having a FIrePro they have to hack their own drivers.

Nothing to maintain; that's because they are in fact FirePro cards, as named directly on the AMD website.
 
It means that the R9 280X is around 2 times faster (at the moment) in FCPX than the GTX 970.

And the R9 280X uses the same old Tahiti core that is in the nMP's D700.

Congrats to NVIDIA for updating the drivers for FCPX. The problem is that the GTX 980 Ti still gets eaten in FCPX by a dual-Tahiti config, regardless of whether we are talking about a cMP, a Hackintosh, or an nMP.

It is quite funny that people with Hacks and an R9 290X are getting slightly better results in the 4K test in FCPX than people with a GTX 980 Ti:
http://www.fcp.co/forum/hardware/18250-brucex-try-this-new-final-cut-pro-x-benchmark?start=280#62292
http://www.fcp.co/forum/hardware/18250-brucex-try-this-new-final-cut-pro-x-benchmark?start=280#62305
http://www.fcp.co/forum/hardware/18250-brucex-try-this-new-final-cut-pro-x-benchmark?start=300#63706
http://www.fcp.co/forum/hardware/18250-brucex-try-this-new-final-cut-pro-x-benchmark?start=300#66918
 
Last edited:
Have you tried FCPX with the latest NVIDIA web drivers? I saw a pretty massive improvement with those.

Won't really matter much if it occurs after the bake-off has taken place. It is like coming out with a new performance-tuning configuration for your NASCAR entry the day after the Daytona 500. Nice job... just not going to win the race that already happened.

The other disconnect not mentioned so far is resolution scaling (Apple's "Retina") and the dual-DisplayPort coupling for the 4K (since oversized from the 4K norm) and 5K iMacs. A better, smoother (to Apple, at the driver level) merged virtual display (e.g., Eyefinity) and HiDPI scaling align well with a major Apple agenda that is not really aimed at the latest, greatest gaming pixel drag-racing scores.

Nvidia is losing out across the Mac lineup because of multiple factors.

Intel's GPUs are substantially better, so Nvidia is being squeezed out wherever internal space and/or cost is an issue in Mac design bake-offs (e.g., the 21.5" iMac lineup is now iGPU only).

There is also Nvidia's trailing adoption attitude toward OpenCL (only working on it well after CUDA, to give CUDA an additional competitive advantage). Meanwhile AMD, Intel, and Imagination Tech all move largely in step with the OpenCL spec updates. An "OpenCL has to lose" attitude is not going to help in bake-offs.

It also probably hasn't helped that Nvidia has been running around suing ARM+GPU SoC implementers:
http://arstechnica.com/gadgets/2014/09/nvidia-sues-samsung-qualcomm-seeks-to-block-galaxy-s5-note-4/
....
http://anandtech.com/show/9713/samsungnvidia-case-update-us-itc-finds-samsung-gpus-noninfringing

When you run around suing your potential customers over the other GPUs they happen to use.... somehow you probably don't make the Christmas card list at the end of the year. (And it doesn't win bonus points in design bake-offs either.)
[ I bet U of Wisc isn't going to get any extra Apple discount considerations any time soon either.
http://arstechnica.com/tech-policy/...nt-damage-claim-from-university-of-wisconsin/ ]

P.S. It doesn't mean that Nvidia will never win a design bake-off in the future. But the notion that they "have to win" is fundamentally flawed. There are fewer slots and more competition: it is a three-way race now, not two, for fewer design bake-offs. So they probably are not going to win as many as in years past.
 
Last edited:
I would not rule out Apple developing PowerVR for Macs.

There may be a "break in case of emergency" sidecar R&D project to test feasibility, but.....

Far more likely, Apple is developing PowerVR to compete with Macs/OS X, not to be used in them.... see the recent front-page story with a quote from Schiller:

"... The job of the iPad should be to be so powerful and capable that you never need a notebook. Like, Why do I need a notebook? I can add a keyboard! I can do all these things! The job of the notebook is to make it so you never need a desktop, right? It’s been doing this for a decade. So that leaves the poor desktop at the end of the line, What’s its job?” ..."
https://www.macrumors.com/2015/10/13/apples-input-design-lab-reveals/

Apple is coming after low-end Macs with iPads and iOS.... not trying to limbo OS X down into the iOS (and iPad) space. Their strategy is outlined rather plainly above.


The lower-to-upper end of the OS X laptops is trying for the space that Apple desktops used to sit in. PowerVR is not particularly optimized to jump up into the desktop space at all.

Apple is aiming desktops at being more powerful desktops (not desk-side, in a rack, or in a machine room... actually sitting on a desk along with the other things needed on a desk). As long as Intel and AMD are not both failing to deliver toward that goal (and Windows is still 7-9 times bigger than OS X, and iOS is bigger still), there is no real huge motivation for trying to shoehorn PowerVR into that solution.
 
  • Like
Reactions: koyoot
Obviously, I'm not talking about PowerVR in its current form. What I'm saying is that I believe there's a good possibility Apple is working on more powerful iterations of PowerVR that could at least start off in lower-end Macs.

Knowing how much Apple likes thin devices and how hot AMD GPUs run, something has to give. PowerVR has made significant improvements since it first appeared in an iPhone. Now Apple is even opening up the Apple TV to apps and games. I don't think it will be long before a version of the Apple TV could challenge Xboxes and PlayStations in console gaming. Then, how long before PowerVR begins appearing in Macs?
 
Makes sense they would use a different device identifier. The form factor & connector are different, with no fan installed.
If I recall correctly, it's the other way around:

Apple's FirePro D700 has the same device ID as a PC HD 7970.
Real FirePros have a unique device ID.
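
For anyone who wants to verify what their own driver reports, a quick OpenCL query shows the name string and vendor ID the driver exposes. A minimal sketch (mine, not from the thread; assumes an OpenCL SDK, error checks omitted). Note the PCI device ID itself (e.g., 6798) isn't available through a standard CL query, so for that you'd still check Device Manager on Windows or the IORegistry on OS X:

[CODE]
/* Print what the OpenCL driver claims each GPU is.
 * CL_DEVICE_NAME is whatever string the driver chooses to report
 * ("FirePro D700" vs "Radeon HD 7970" is purely a driver decision);
 * CL_DEVICE_VENDOR_ID is typically the PCI vendor ID
 * (0x1002 = AMD, 0x10de = Nvidia). */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id plat;
    cl_device_id devs[8];
    cl_uint n = 0;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 8, devs, &n);

    for (cl_uint i = 0; i < n; i++) {
        char name[256];
        cl_uint vendor = 0;
        clGetDeviceInfo(devs[i], CL_DEVICE_NAME, sizeof name, name, NULL);
        clGetDeviceInfo(devs[i], CL_DEVICE_VENDOR_ID, sizeof vendor, &vendor, NULL);
        printf("GPU %u: %s (vendor 0x%04x)\n", i, name, vendor);
    }
    return 0;
}
[/CODE]

On a nMP under the Apple/AMD Boot Camp drivers you would presumably see the "FirePro D700" name string; under stock Radeon drivers the same silicon would report its Radeon name, which is the whole point being argued above.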
 
  • Like
Reactions: tuxon86
Why, oh why, oh why, would anyone try to present rational evidence to you?

Just a few nights ago you claimed that Tamper Resistant Security Torx Screws were (I'm not making this up, an actual word for word quote of yours) "the most user friendly standard screws available."

So, rather than admit "whoops, I was wrong" you went careening off into lala land and tried to show that since one could buy Tamper Resistant Security Torx drivers on Amazon, they weren't really meant to keep users out. So, by your "logic", the door locks on a car are also not meant to be considered impediments to entry, since you can buy a Slim Jim on Amazon as well.

you using amazon as an example of what i said shows that you're just reaching.
you can buy those things at walmart.. you can buy them at a grocery store..


it's 2015.. not 1970.
that a screw was rare and named 'security' 40 years ago doesn't mean much today..
50 years ago, the torx head was a 'security' screwhead.. times change.. people realized the torx head offers plenty more advantages beyond security and the type has become standard.

(sidenote- in the same way i don't think you know what 'user friendly' means.. i also don't think you know what 'standard' means)

torx has another version of screw that's been used as their security head since the 90s(?).. it's non-standard and you can't walk into any old store to buy the tools to use them..
likewise, and i don't understand why you don't recognize this point-- apple themselves have designed a security screw.. it's used on nearly every one of their products.. not on mac pros


get off whatever it is you're on.. every single site / repair guide talks of how easy & friendly the mac pro is to work on compared to anything else apple is making.. (as well as easier than cmp)
why are you continuing to fight against that?
why so hard to realize or admit to yourself "hmm. maybe it isn't so hard to work on" ??
instead, you're continuing to argue that this thing is on lockdown by the evil overseers.
it's just nuts.


The point here being that no matter what evidence might be brought forth to prove that computer users want powerful GPUs that work well, you would refuse to believe it and deny it's value since you have already formed an opinion, and closed your mind to anything not in line with your pre-judged conclusion.
ridiculous.. bubba said in #28:
"Content creators want the latest and greatest Quadros for CUDA and the Mercury engine"

it's either a straight lie.
or he's sipping on your koolaid is all.
all your 'proof' about gpu this&that comes from gaming benchmarks.. you should really make that more clear because it could be seen as misleading the public as to why they should consider buying gpus from you.

So you can't be surprised if people don't bother to offer up real evidence
lol.
if you had any real evidence, you'd spray it in a heartbeat.. quit kidding yourself.

you have made it more than clear that you won't abide by the rules of logical discussion and polite debate.
real rich dude.
you've been told countless times by members here how rude you are.. do you just ignore those posts?


When the 2014 Mac Mini came out many people were furious. Why? Because Apple had changed the bottom of the machine in a peculiar way. From the outside it looked the same, but they had removed the "twist to open" bottom that allowed past RAM and HD upgrades, and replaced it with a metal plate held in place by, you guessed it, "the most user friendly standard screws available." (Tamper Resistant Security Torx Screws)

when the nmp came out many people were furious. Why? Because apple put a freaking access latch on the shell and made it incredibly easy for people to get inside.. /s

what apple did to the mini is irrelevant unless you're talking about all the other apple products as well.
mac pro is different than the rest.



If you want people to try and have actual constructive debates with you, you have to be able to admit when you are in error and keep your mind open.

ok.. i get it now.
it's mvc comedy hour.
great
 
Last edited:
The cores in FirePro GPUs are pretty much the same. They can differ by process, which defines the device ID. It's the same story as Grenada/Hawaii: Grenada is the same core as Hawaii on a new TSMC process. The thing is, Hawaii directly on that new process would not work unless it were tuned a little to adapt to it.


P.S. Guys, don't say that AMD GPUs are hot. The iMac 5K GPU would hit 105 degrees regardless of whether there were a 125W AMD GPU or an Nvidia one inside. It's the design of the computer that defines how hot the GPU gets. The brand of the GPU has nothing to do with it.
 
  • Like
Reactions: linuxcooldude