But it could be a mistake. They list the Mac version as having DirectX 11 support, but that doesn't exist on a Mac.

Nvidia tends to do that, but technically it's not irrelevant, as the card would do DirectX 11 under Boot Camp.
 
OpenCL support is on the press release page. It is odd. Maybe it's only with 10.8; hope 10.7.4 will be supported, and for that matter 10.6.8, but probably not. Nvidia reserves the right to change anything, and maybe not even release it. Looking forward to updated consumer drivers for the 670/680. Or just maybe an official Apple card to pay too much for. But hey, PCIe 3.0 and a boot screen!
 
But it could be a mistake. They list the Mac version as having DirectX 11 support, but that doesn't exist on a Mac.

Maybe. But the DirectX 11 listing isn't really a mistake. It's a reference to using the card under Boot Camp (it should probably have an asterisk next to it, since Boot Camp isn't mentioned on that page). Again, I wouldn't expect a full range of support outside of the standard Boot Camp + Windows configuration either.

----------

OpenCL support is on the press release page. It is odd.

Yes. But you would think that a product that has gone through extended validation and configuration testing would have a correct specs page.


Or just maybe an official Apple card to pay too much for.

The other interesting thing about the press release is that Apple isn't one of the sellers, or at least one of the system integrators. (Perhaps it will show up in the online store from the appropriate distributor in the appropriate country.)
 
All I'm saying is that I don't think too much should be read into those spec sheets. Why state DirectX for the Mac card but not 3D Vision as well, then?
 
This is simply not true. The GT 640 is a GK107-based card. The GTX 680 and Quadro K5000 are GK104-based cards.

My source states differently. The K5000 is GK107-based, whereas the 680 is GK104. The source could be wrong, but have you guys thought of checking how the GT 640/630 (GDDR5 only, please) performs in OpenCL/CUDA, and then interpolating its performance up to the number of CUDA cores the K5000 has? I think Nvidia is using the GK107 architecture for the low-end efficient cards and the pro cards, and GK104 for gaming without GPGPU performance, so there's no chance of "making" a 680 perform better than a Quadro card.
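For what it's worth, here's that interpolation idea as a rough Python sketch. The core counts are from the public spec sheets; the linear scaling is a naive assumption of mine, since clocks and memory bandwidth differ between the cards, so treat the result as an upper bound at best:

```python
# Naive linear scaling of GPGPU throughput by CUDA core count.
# Assumes perfect scaling, which real workloads won't achieve.

GT640_CORES = 384    # GK107 (GDDR5 variant)
K5000_CORES = 1536   # Quadro K5000

def interpolate_throughput(gt640_score, target_cores=K5000_CORES,
                           base_cores=GT640_CORES):
    """Estimate a target card's score from a GT 640 benchmark score."""
    return gt640_score * (target_cores / base_cores)

# e.g. if the GT 640 scored 200 in some OpenCL benchmark, a
# core-count-only estimate for the K5000 would be 4x that:
print(interpolate_throughput(200))  # 800.0
```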
 
My source states differently. The K5000 is GK107-based, whereas the 680 is GK104. The source could be wrong, but have you guys thought of checking how the GT 640/630 (GDDR5 only, please) performs in OpenCL/CUDA, and then interpolating its performance up to the number of CUDA cores the K5000 has? I think Nvidia is using the GK107 architecture for the low-end efficient cards and the pro cards, and GK104 for gaming without GPGPU performance, so there's no chance of "making" a 680 perform better than a Quadro card.


Not on Windows. The Quadro K5000 is GK104, but die harvested. So it's slower and has fewer functional units. It's the flagship Quadro at the moment.

The GK107 is even slower at compute than GK104 (it's the chip used in the GeForce GT 640). Until GK110 comes out later this year, GK104 will be Nvidia's fastest GPU.
 
My source states differently. The K5000 is GK107-based, whereas the 680 is GK104. The source could be wrong, but have you guys thought of checking how the GT 640/630 (GDDR5 only, please) performs in OpenCL/CUDA, and then interpolating its performance up to the number of CUDA cores the K5000 has? I think Nvidia is using the GK107 architecture for the low-end efficient cards and the pro cards, and GK104 for gaming without GPGPU performance, so there's no chance of "making" a 680 perform better than a Quadro card.

Your sources are just wrong.

http://www.anandtech.com/show/6140/...erbased-quadro-k5000-secondgeneration-maximus

The Quadro K5000 is the Kepler-based successor to NVIDIA’s existing Fermi-based Quadro 5000 video card. Based on the same GK104 GPU as NVIDIA’s Tesla K10 and GeForce GTX 680, the card is in many ways the near-obligatory workstation version of NVIDIA’s existing consumer products.

The Quadro and GeForce markets are different, and both can be served by a GK104-based GPU.
 
Tesla C2050 and C2075 have had support in the OS since 10.6.8

A C2075 is for all intents and purposes a Quadro 6000 with fewer output ports.

I imagine the same will hold true of Kepler cards. Someone just needs to try it out.
 
GPGPU performance is cut down on Kepler cards in a much more permanent and effective way than in past architectures.

It is quite likely that the CUDA and OpenCL performance of the GTX cards will ALWAYS be much lower than that of the Quadro cards. The GTX 570 and GTX 580 do not suffer this same fate; they are able to SLAUGHTER their Fermi-based Quadro counterparts.

Have a look at CS6 performance: Engadget claims "twice as fast in CS6" compared to the Q4000, but this is already true of the GTX 570, and even the Quadro 6000 we sent Barefeats gives the Q4000 a good trouncing.

http://www.barefeats.com/rogue01.html
I have a very compute-heavy OpenCL project.

My AMD 5870 blows the Quadro 4000 completely out of the water; big surprise. And that was with my code tuned to the architecture nV uses in the 330m in my laptop!

But, I'm still getting a much lower percentage of theoretical max FLOPs with the 5870 than I am with the 330m....

i.e. 330m LuxMark about 60; 5870 LuxMark about 600, but I get a bit over 4x the throughput on the 5870 (not 10x). And that's with heavy tuning, incl. float4 vectorization to take advantage of the 5870's VLIW.
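To put rough numbers on that gap, here's a quick back-of-the-envelope in Python, using the LuxMark scores above and treating the LuxMark ratio as a proxy for the expected speedup (which is itself an assumption, since LuxMark's kernel won't match mine):

```python
def achieved_vs_expected(benchmark_speedup, observed_speedup):
    """Fraction of the benchmark-implied speedup actually realised."""
    return observed_speedup / benchmark_speedup

# LuxMark: 330m ~60, 5870 ~600 -> implied 10x speedup.
# Observed throughput gain on my own kernel: ~4x.
print(achieved_vs_expected(600 / 60, 4.0))  # 0.4
```

So the 5870 is delivering only about 40% of the speedup the benchmark scores would suggest, which is what I mean by getting a lower percentage of theoretical max FLOPs.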

MVC, I've followed your "Break through the gloom!" thread, and am about to buy something, probably nV, to increase throughput. If the 570/580 Luxes out at about 1000, and I get a higher percentage of theo-flops, then that will be the ticket to the level of performance I've been seeking. I don't see a 570/580 on your site, and don't care about boot screen, and might wanna stick with Lion for now, but I may buy from you after all. Good things on the horizon....
 
I have a very compute-heavy OpenCL project.

My AMD 5870 blows the Quadro 4000 completely out of the water; big surprise. And that was with my code tuned to the architecture nV uses in the 330m in my laptop!

AMD has at times shown better numbers on OpenCL-based processes under OSX. In terms of the Quadro 4000, it doesn't technically support OpenCL under OSX, although there may be a workaround; it's limited to CUDA and whatever OpenGL revision is available. This is something I've mentioned before. You know what you need, as you've researched the options prior to dropping that much on a card. Unfortunately, under OSX, details aren't always readily printed in the specifications when you go to order one. It's a big choice under OSX, as you typically pay just as much for a card with fewer features available, so you want to know exactly how it benefits you prior to placing an order.


Spot on. Just sometimes they get ruled out by people that haven't seen the difference and can't afford to investigate the benefits properly.


I agree there too. I think it's the kind of thing that is researched, as they often provide sizable gains, yet it's not an absolute rule. There are a lot of weird ideas about GPUs that circulate. People get confused about what is or isn't accelerated or supported by them at times.
 
In terms of the Quadro 4000, it doesn't technically support OpenCL under OSX, although there may be a workaround. It's limited to CUDA and whatever OpenGL revision. This is something I've mentioned before. ...................... There are a lot of weird ideas about GPUs that circulate. People get confused on what is or isn't accelerated or supported by them at times.

You are creating a "weird idea" yourself in saying that OpenCL doesn't work on the Q4000. 100% working in ML; no hacks, cracks, or edits needed.
 
You are creating a "weird idea" yourself in saying that OpenCL doesn't work on the Q4000. 100% working in ML; no hacks, cracks, or edits needed.

Huh? It wasn't officially supported under Lion. That's cool that they addressed that. Also my bad. I didn't know they finally addressed that.
 
Huh? It wasn't officially supported under Lion. That's cool that they addressed that. Also my bad. I didn't know they finally addressed that.
I may have made a mistake here too. I may have mis-remembered the number on the Quadro card at work I tried it on (I'm not there now). But we had grant money to spend and absolutely maxed out a Mac Pro; the video card was at least two grand, though that was several years ago.
 
So, in a nutshell, this is the only way to get a 'blessed' GeForce 680, LOL. It's technically the same guts inside, minus a few additional bits with regard to colour output and frame buffer, right?

So would one expect 680 performance from this card for everything most people use GPUs for?
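If the "same guts, lower clocks" view upthread is right, a crude estimate is just a clock ratio. This sketch assumes both cards expose 1536 CUDA cores and uses the commonly cited base clocks (~700 MHz for the K5000, ~1006 MHz for the GTX 680), ignoring boost behaviour, so treat it as a ballpark only:

```python
# Rough throughput comparison for two cards with the same core count,
# where only the core clock differs. Clocks are approximate base clocks.
GTX680_CLOCK_MHZ = 1006
K5000_CLOCK_MHZ = 700   # approximate

def relative_throughput(clock_a, clock_b):
    """Raw shader throughput of card A relative to card B, same cores."""
    return clock_a / clock_b

ratio = relative_throughput(K5000_CLOCK_MHZ, GTX680_CLOCK_MHZ)
print(f"K5000 ~ {ratio:.0%} of a GTX 680 on raw shader throughput")
```

By that yardstick you'd see roughly 70% of a 680, before any driver-level differences come into play.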
 
Tesla C2050 and C2075 have had support in the OS since 10.6.8

A C2075 is for all intents and purposes a Quadro 6000 with fewer output ports.

I imagine the same will hold true of Kepler cards. Someone just needs to try it out.

There's support, and there's Support. My impression has been that none of the Tesla cards are officially backed by NVIDIA. For my purposes, that's kind of a problem.

I'm all for putting a hot-rod, non-official video card in my home machine (and indeed will probably do so soon), but for work, it needs officialness.
 