GTX 1080 compute performance: http://www.overclock.net/t/1600173/compubench-gtx-1080-and-gtx-980ti-compute-performance
You won't see that on the cMP. Older-generation Nvidia cards score higher in Windows benchmarks by as much as 30%. Pascal now has proper support for Asynchronous Compute, but without a true native driver you won't have that feature on OS X.
Interesting.... But that's not the impressive stuff. Found a nice new video about how Nvidia develops and tests their GPUs. Each architecture is first emulated entirely in software running on massive servers. The servers themselves are then connected to PCIe slots as an "emulated graphics card" in various test-bed systems, including Windows and Linux PCs, and of course they have cMPs in the lab too.
That explains the ridiculously high cost of video cards, and the fact that they never seem to truly go on sale until years down the road!
I've been in the Nvidia EDA labs. About a hundred racks, full of Dell 2U servers packed 18 per rack. Next to the lab are a pair of power/cooling buildings. Redundant. Each building could power and cool the lab by itself. And by "power and cool", that means each building had diesel backup generators to power the building, A/C and the lab.
These are the computers used to design the chips. The video is about the labs that emulate the design for verification.
No, those are explained by "people are happy to pay".
Interesting that they have Mac Pros there; I wonder if it was a GP/P1xx series card they were testing in it? I assume they'd use them for OS X, though of course they'd work perfectly well for Linux or Windows.
Lends credibility to the hope that they'll release a new OS X web driver to support Pascal cards, possibly coinciding with the release of OS X 10.12.
Good news (potentially) for Mac Pro 2009/2010/2012 owners, though confusing, given Apple's recent shift towards Nvidia. Perhaps they're always developing options for Apple to look into, to tempt them away from their AMD exclusivity.
Apple's recent shift towards Nvidia? Did I miss some news?
Most likely the system is there to support the GPUs they have officially licensed for the Mac. Beyond Kepler, progress will be slow and can only come if Apple still has an open door for them. I'm stating the obvious.
I will be getting a Pascal GPU to test in the cMP within the next month and will report back all benchmark tests (if it works), though I stress that I will likely go for the 1070 as it strikes a great balance for me. Though I'm an Nvidia guy, I'm under no illusions about Pascal. It has great performance due to its high clock speed and efficiency, but clock for clock it is slower than the Fury cards. So if Polaris delivers higher clocks and efficiency, then AMD could win the crown this year.
If the 1070 doesn't work in the cMP, it goes in my Skylake PC anyway and I will test it against the 680 in the cMP just for fun.
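For anyone planning the same experiment, it's worth confirming that OS X's OpenCL stack can even see the card before bothering with full LuxMark/CompuBench runs. Below is only a minimal sketch of that check; the file name and structure are my own illustration, not anything from the thread, and it assumes the card enumerates at all under whatever web driver is installed. Compile on OS X with: clang list_gpus.c -framework OpenCL -o list_gpus

/*
 * list_gpus.c -- hypothetical sanity check: ask OpenCL which GPUs
 * OS X exposes, and print their names and compute-unit counts.
 */
#include <stdio.h>
#include <OpenCL/opencl.h>

int main(void) {
    cl_platform_id platform;
    cl_uint num_platforms = 0;

    /* OS X exposes a single Apple OpenCL platform. */
    if (clGetPlatformIDs(1, &platform, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
        printf("No OpenCL platform found\n");
        return 1;
    }

    cl_device_id devices[8];
    cl_uint num_devices = 0;

    /* Ask only for GPU-class devices; an unrecognised card simply won't be listed. */
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &num_devices) != CL_SUCCESS || num_devices == 0) {
        printf("No GPU devices reported by OpenCL\n");
        return 1;
    }

    for (cl_uint i = 0; i < num_devices; i++) {
        char name[256] = {0};
        cl_uint compute_units = 0;
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        clGetDeviceInfo(devices[i], CL_DEVICE_MAX_COMPUTE_UNITS, sizeof(compute_units), &compute_units, NULL);
        printf("GPU %u: %s (%u compute units)\n", i, name, compute_units);
    }
    return 0;
}

If the 1070 shows up there with a sensible name and compute-unit count, the usual benchmarks should at least launch; if it doesn't, no amount of benchmarking will help.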
All we need to do now is convince an OEM to actually make a new Mac Edition; EVGA, for example. I've emailed them before about the 980, and recently emailed them about the 1080, and their response was typically vague and non-committal, but still.
Won't happen at all unless Apple listens and brings back a system with PCIe slots.
Which means "Won't happen".
Unlikely, certainly. Impossible, not so sure. Don't count eGPU out. A new Thunderbolt Cinema Display could have an embedded eGPU for laptops/Mac minis to benefit from if bought in tandem. eGPU support would give people without available PCIe slots room to manoeuvre, which in turn would require a Mac EFI, and could also possibly work with Razer's eGPU enclosure, as an example.
Sorry, meant AMD; corrected my post.
I'm not so sure it's that rigid. If you're Nvidia, you're making things like the 1080/1070 anyway; why not test them on OS X, provided you have the hardware? Admittedly, the pre-2013 model's form factor is now consigned to history, but there's still Thunderbolt 3 for eGPU use, which OS X may well wish to leverage in the near future.
In the very interesting video you provided, they clearly stated that area was a lab where they tested new hardware designs; Kepler has been and gone, so why would they be testing that in 2016?
Because Kepler is still being supported in driver updates and many cards are still under warranty. I have no doubt they would be testing newer architectures, but they don't appear to have assigned much budget or manpower to Mac driver development, as they haven't fixed some simple but crucial OpenCL and GUI bugs for many months now.
I think you missed my point: why would you have Mac Pros in a room to test new hardware, when officially the only Nvidia Mac card is Kepler? It's been around since 2012; I'm sure they'd have some 680s lying around.
I think it's more likely those machines were testing Pascal on OS X. Obviously I could be wrong, but I've no idea why you'd bother wasting so much computing power emulating a product that was finished and released 4 years ago.
What's the market for a $2500 display with average specs?
If I get it right, Soy means Nvidia is testing the new hardware: since they have to test Kepler anyway and the Mac Pro is already there, why not? But they won't really spend a lot of resources building the required software.
So what he said doesn't contradict what you said (both agree that Nvidia tests the new hardware on the Mac Pro).
IMO, whether or not they release anything for OS X now, it makes sense to test the system. Nvidia is a GPU manufacturer and Apple is one of their potential customers. It makes perfect sense to build and test the new hardware on the OS X side, and to use the data in the next business meeting/negotiation.
If they have zero data on how Pascal GPUs perform in OS X, how can they get the contract from Apple?
From a business point of view, Apple is a big potential partner. Why not collect the useful data and go after this potential contract? The testing facility is there anyway.
However, that doesn't mean they will release anything for OS X users (even though I personally believe the coming web driver will support Pascal cards).
Yeah, there has to be one branch of driver development that is always for testing; the other branch supports officially released and licensed products. But the web driver has to be ported from the Windows code base, so unless it makes a sudden big jump from version 346 to 368 (the Windows driver with official Pascal support), it's a long wait, if ever.
If you watch the video again, he mentions testing on PC and Linux before finally, and somewhat reluctantly, mentioning the Mac as an afterthought. It's not that hopeful.
Hope you'll admit it this time when we have Pascal running on Macs before Polaris.