As with the first iteration of the 378 driver, yes, it's 10.12.4 only. I'm also puzzled by your disdain for Sierra, though.

Lou

Sorry, I didn't mean to start a flame war...
I don't have disdain for Sierra; you're misreading me. It's just that I prefer El Capitan, as it's less locked down by all the "security features" and there's still some room for improvement. Also, Sierra does away with Metal on the cMP, where El Capitan still has it supported.

I use my Mac Pro as a business tool, and most of my apps are fully supported on El Capitan if I need to call their helpline. For instance, I'm a film editor: Avid has qualified Media Composer on 10.12.3, but not on 10.12.4, despite releasing a new update to the app AFTER the latest version of macOS came out.

In my line of work, most studios are still running El Capitan; some are even still on 10.8.5, considered the most stable workhorse before the iOS-ization of macOS. Yes, you can ask "why do I need a new Pascal card then, if I'm so conservative?" Well, I like to have choices. And that might involve sticking with a GTX 980 Ti.

Peace.
 
The App Store version of DaVinci Resolve is the same as the download version on their site and uses the same core code as the Windows version.

The Mac App Store version and the Studio version from Blackmagic are not the same. While the underlying core technology is of course the same, there are feature differences (which are also reflected in the price). They don't just give the macOS version away for half price.

Some of the differences relate to OFX plugins, and there is also no CUDA in the App Store version. For some reason that had escaped me.
 
Single slot 1070...
http://www.techspot.com/news/68947-galax-announces-world-first-single-slot-geforce-gtx.html


Interested at first... but the port selection is cheap... orz, why only one DisplayPort? Three DP and one HDMI would be better.
Sleep is OOB for virtually all recent mainboards. In the earlier days it required DSDT patching, but that's not necessary any more.


Yeah, most hardware from Z77 onward sleeps OOB and doesn't need a DSDT, but I'd still encourage everybody who builds a hackintosh to learn at least basic DSDT skills once their system is running. Learning DSDT has many advantages, from disabling unused devices and re-routing devices to building a better, cleaner device tree in your IOReg. Some advanced features can also be enabled, like renaming the MAC address or enabling auto-switching GPUs (between the iGPU and the discrete one). DSDT is also very flexible: you can create smaller, modular versions called SSDTs, usually for the GPU or other removable PCIe devices.

I started by extracting the vanilla DSDT from my 5,1, learning how to read ACPI tables, and comparing it with the DSDT from my motherboard. It was a mess: lots of weird reference names. Thanks to the vanilla 5,1 DSDT as a reference, I could patch and rename devices so they end up exactly like the DSDT from an Apple machine.

Stock DSDTs are dirty and always have at least a few errors. Consider learning this; it makes life easier if you want to build a really proper hackintosh that behaves like a genuine Mac.
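
If you're curious how mechanical the renaming step actually is, here's a toy sketch. It assumes you've already dumped your DSDT and decompiled it to DSDT.dsl with iasl, and the rename pairs (GFX0 to IGPU and so on) are just the usual examples people apply, so swap in whatever matches the vanilla 5,1 DSDT you use as a reference:

```python
#!/usr/bin/env python3
"""Toy DSDT patch helper: renames ACPI device labels in a decompiled DSDT.dsl.

Assumes the table has already been dumped and decompiled with iasl. The
rename pairs below are only the usual examples; pick the ones that match the
vanilla Mac Pro 5,1 DSDT used as a reference, and recompile afterwards.
"""
import re
import sys
from pathlib import Path

# Hypothetical rename map -- adjust to your own board and reference DSDT.
RENAMES = {
    "GFX0": "IGPU",   # integrated GPU is named IGPU on real Macs
    "HECI": "IMEI",   # Intel Management Engine interface
    "B0D3": "HDAU",   # HDMI audio device
}

def patch_dsl(src: Path, dst: Path) -> None:
    text = src.read_text(errors="replace")
    for old, new in RENAMES.items():
        # Whole-word replace so longer identifiers are left alone.
        text = re.sub(rf"\b{old}\b", new, text)
    dst.write_text(text)
    print(f"Wrote {dst}; recompile with: iasl {dst}")

if __name__ == "__main__":
    patch_dsl(Path(sys.argv[1]), Path(sys.argv[2]))
```

Run it as `python3 patch_dsl.py DSDT.dsl DSDT-patched.dsl`, then recompile and load the result however your bootloader expects.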
 
The Mac App Store version and the Studio version from Blackmagic are not the same. While the underlying core technology is of course the same, there are feature differences (which are also reflected in the price). They don't just give the macOS version away for half price.

Some of the differences relate to OFX plugins, and there is also no CUDA in the App Store version. For some reason that had escaped me.

And if you try the free version of Resolve? AFAIK one card and CUDA are supported.

Just found out that the free Resolve 12.5 doesn't support CUDA:
https://www.blackmagicdesign.com/products/davinciresolve/compare
 
Yes. That's where I made my CUDA benchmark. CUDA works perfectly. But my investment is in the Mac App Store version for full 4K support.
 
In that case, yes, I agree. When I said the App Store version, I was referring to the paid version for 500 bucks.
Just found out that the free Resolve 12.5 doesn't support CUDA: https://www.blackmagicdesign.com/products/davinciresolve/compare

Every DaVinci Resolve version supports CUDA rendering except, apparently, the $499 Mac App Store version.

The Mac App Store version is the full Studio version of Resolve with a few limitations in functionality (including CUDA) for half the price: $500 instead of $1,000.
 
In that case, yes, I agree. When I said the App Store version, I was referring to the paid version for 500 bucks.

Every DaVinci Resolve version supports CUDA rendering except, apparently, the $499 Mac App Store version.

The Mac App Store version is the full Studio version of Resolve with a few limitations in functionality (including CUDA) for half the price: $500 instead of $1,000.

That's sad because the free download version for Windows has the options attached.
 

Attachment: image.jpg (2 MB)
I downloaded driver 378.05.05.05f02 and threw a GTX 1070 SC into a Mac Pro 2010. It boots up fine and gets recognized in About This Mac. Are there any more steps I need to do? I saw a video on YouTube where someone went into Terminal to run an nv_disable=1 command. Am I supposed to do that? I didn't. I'm not seeing any improvement in Warcraft or Diablo 3, just the same low fps as the Mac Edition GTX 680 2GB I just took out. I couldn't find a step-by-step guide to installing the driver and a Pascal card. Cheers

Mac Pro 2010
Apple 27" 1440p
 
I downloaded driver 378.05.05.05f02 and threw a GTX 1070 SC into a Mac Pro 2010. It boots up fine and gets recognized in About This Mac. Are there any more steps I need to do? I saw a video on YouTube where someone went into Terminal to run an nv_disable=1 command. Am I supposed to do that? I didn't. I'm not seeing any improvement in Warcraft or Diablo 3, just the same low fps as the Mac Edition GTX 680 2GB I just took out. I couldn't find a step-by-step guide to installing the driver and a Pascal card. Cheers

Mac Pro 2010
Apple 27" 1440p

No need for Terminal commands on a real Mac.

The driver is only optimised for Kepler, so you will often see Kepler-like performance in OpenGL despite the clock speed being almost double.
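
No boot-args needed, but if you want to confirm from Terminal that the web driver (and not the stock Apple driver) actually took over after the reboot, here's a rough sketch. The com.nvidia.web bundle-ID prefix is what I see on my own install, so treat it as an assumption and compare against your own kextstat output:

```python
#!/usr/bin/env python3
"""Check whether the NVIDIA web driver kexts are the ones actually loaded.

The "com.nvidia.web" bundle-ID prefix is an assumption based on my own
install; compare against your own `kextstat` output.
"""
import subprocess

def main() -> None:
    out = subprocess.run(["kextstat"], capture_output=True, text=True).stdout
    web = [line.strip() for line in out.splitlines() if "com.nvidia.web" in line]
    if web:
        print("NVIDIA web driver kexts loaded:")
        for line in web:
            print("  " + line)
    else:
        print("No web driver kexts found -- likely still on the stock Apple driver,")
        print("or 'NVIDIA Web Driver' isn't selected in the driver manager preference pane.")

if __name__ == "__main__":
    main()
```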
 
That's sad because the free download version for Windows has the options attached.

Yeah. To clarify, the free Mac App Store version has it too! =) It does get kind of confusing...

Blackmagic offers the Studio and the free versions. The free versions are available on both Win and Mac and they have limitations, but CUDA isn't one of them. But free Mac = free Win.

The $499 Mac App Store version (the one I have) is meant to be the "full" Studio version with full DCI 4K support, dual-GPU support, noise reduction and stuff like that. I knew that it wasn't 1:1 compared to the $995 Studio version, but I didn't know they had stripped CUDA away. I mean, it IS only half price, but still.
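
Side note: if anyone wants to sanity-check that the CUDA driver itself is installed and seeing the card, independently of which Resolve build exposes it, here's a rough ctypes sketch against the CUDA driver API. The dylib path is the usual CUDA-for-Mac install location and is an assumption, so adjust it if yours differs:

```python
#!/usr/bin/env python3
"""Sanity check: is the CUDA driver installed and does it see a GPU?

Talks to the CUDA driver API (cuInit / cuDeviceGetCount) via ctypes. The
dylib path below is the usual CUDA-for-Mac install location -- an assumption,
adjust it if your install lives elsewhere.
"""
import ctypes

CUDA_DYLIB = "/usr/local/cuda/lib/libcuda.dylib"  # assumed default install path

def main() -> None:
    try:
        cuda = ctypes.CDLL(CUDA_DYLIB)
    except OSError:
        print("CUDA driver library not found -- is the CUDA driver installed?")
        return

    if cuda.cuInit(0) != 0:  # 0 == CUDA_SUCCESS
        print("cuInit failed -- driver present but not usable with this GPU/driver combo?")
        return

    count = ctypes.c_int(0)
    cuda.cuDeviceGetCount(ctypes.byref(count))
    print(f"CUDA devices visible to the driver: {count.value}")

if __name__ == "__main__":
    main()
```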
 
Yeah. To clarify, the free Mac App Store version has it too! =) It does get kind of confusing...

Blackmagic offers the Studio and the free versions. The free versions are available on both Win and Mac and they have limitations, but CUDA isn't one of them. But free Mac = free Win.

The $499 Mac App Store version (the one I have) is meant to be the "full" Studio version with full DCI 4K support, dual-GPU support, noise reduction and stuff like that. I knew that it wasn't 1:1 compared to the $995 Studio version, but I didn't know they had stripped CUDA away. I mean, it IS only half price, but still.

I see. That means the App Store version is specially tailored for the nMP and MBPs with Radeons only.
 
Yep.

I just wish they didn't remove stuff that's already in the free version when you upgrade to the half-step Studio version. I guess it's kind of problematic, since the free version is already very generous for what it can do.
 
No need for Terminal commands on a real Mac.

The driver is only optimised for Kepler, so you will often see Kepler-like performance in OpenGL despite the clock speed being almost double.

Oh we're back to this again? Games will be completely CPU limited on a 2010 Mac Pro unless you're running at 4K resolution with every setting at its maximum, and in some cases you'll be CPU limited even at that resolution. Do we really need to keep having this discussion? It's easily verifiable with an app like iStat Menus that shows GPU utilization, which will be very low in a CPU-limited game like WoW.
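
If you don't want to buy iStat Menus just to check this, a rough alternative is to read the PerformanceStatistics dictionary that most GPU drivers publish in the I/O Registry. The key names below are assumptions that vary by driver, so grep your own `ioreg -l` output first:

```python
#!/usr/bin/env python3
"""Rough GPU-utilization peek without iStat Menus.

Greps `ioreg -l` for the PerformanceStatistics dictionary that most GPU
drivers publish. The key names differ between drivers, so the two below are
assumptions -- inspect your own ioreg output to find the right one.
"""
import re
import subprocess

KEYS = ["Device Utilization %", "GPU Core Utilization"]  # assumed key names

def main() -> None:
    out = subprocess.run(["ioreg", "-l"], capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "PerformanceStatistics" not in line:
            continue
        for key in KEYS:
            match = re.search(rf'"{re.escape(key)}"=(\d+)', line)
            if match:
                print(f"{key}: {match.group(1)}")

if __name__ == "__main__":
    main()
```

Run it while the game is going; a pegged CPU core alongside a low number here is exactly the CPU-limited case described above.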
 
Oh we're back to this again? Games will be completely CPU limited on a 2010 Mac Pro unless you're running at 4K resolution with every setting at its maximum, and in some cases you'll be CPU limited even at that resolution. Do we really need to keep having this discussion? It's easily verifiable with an app like iStat Menus that shows GPU utilization, which will be very low in a CPU-limited game like WoW.

Excuse maker and forum sales boy. This isn't happening nearly as badly on Windows on a cMP, even in OpenGL.

You're now asking people to buy a 4K monitor to get around this slow performance. That's another big cost that isn't worth it if a professional is using an old machine and a graphics card running beta drivers.

Have you apologised to everyone you cajoled into buying an expensive card, only for them to find that they suffered bugs and slower-than-expected performance? This is a Pro forum; when you recommend upgrades, you had better make sure their professional work isn't impacted, otherwise I would recommend they start a class action to find out why they are being convinced to buy hardware upgrades with beta drivers, sometimes very overpriced upgrades because of the hacking required.

Or are you going to keep shutting me down with your excuses and conspiracy theories about me? What was it last week, that I had an agenda against Nvidia... and then 1000 readers laughed at you.

Did Maxwell support ever come out of beta after 2+ years? Didn't think so. No official Mac product or driver listing anywhere.
 
The driver is only optimised for Kepler, so you will often see Kepler-like performance in OpenGL despite the clock speed being almost double.

I don't know why you keep insisting on this when the opposite has been proven multiple times.

Nvidia Maxwell or Pascal GPUs perform perfectly fine in OS X in GPU-limited benchmarks. The performance difference compared to Windows in such a situation is less than 10%, which is not notable at all.

The major difference is that the CPU overhead in APIs and drivers is a lot higher in macOS, so benchmarks, games and apps are more likely to be CPU bound than in Windows.

This is absolutely not specific to Nvidia WebDrivers, though; the same thing happens with perfectly well-supported GPUs like Nvidia Kepler, AMD first-gen GCN, or newer AMD GPUs. Those GPUs just aren't as fast, so lots of people won't notice.
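
A quick way to apply that reasoning yourself: drop the rendering resolution and see whether the frame rate moves. Here's a trivial sketch of the rule of thumb, with an arbitrary 10% tolerance that's purely my assumption:

```python
def likely_bottleneck(fps_low_res: float, fps_high_res: float,
                      tolerance: float = 0.10) -> str:
    """Rule of thumb from the post above: if dropping the resolution barely
    changes the frame rate, the CPU (or API/driver overhead) is the limit,
    not the GPU. The 10% tolerance is an arbitrary assumption."""
    if fps_high_res <= 0:
        raise ValueError("fps must be positive")
    gain = (fps_low_res - fps_high_res) / fps_high_res
    return "CPU/API bound" if gain < tolerance else "GPU bound"

# Example: 62 fps at 1440p vs 60 fps at 4K -> almost no change, so CPU bound.
print(likely_bottleneck(fps_low_res=62, fps_high_res=60))
```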
 
I don't know why you keep insisting on this when the opposite has been proven multiple times.

Because it's a fact. You guys are still going to a Kepler driver download page and then, with blinders on, saying that you are not. LOL.


Answer this, is Maxwell out of beta yet after 2+ years and if so where is the Maxwell driver for Mac listing and official products?

Remember, you are impacting professional workflows with your recommendations. This is not a gamer forum. If professionals take your advice and then find slow performance and compute bugs they really should make an example of you.
 
You were talking about OpenGL performance, and that's what I responded to.

I can't comment on the quality/stability of pro apps in conjunction with the WebDrivers (and I'm pretty sure I never did) since my "professional workflow" is 100% Windows based.
 
Remember, you are impacting professional workflows with your recommendations. This is not a gamer forum. If professionals take your advice and then find slow performance and compute bugs they really should make an example of you.

Well, I would assume that "professionals" know that the GeForce line is an enthusiast-level offering from nVidia, i.e. gamer hardware; those looking for professional-level graphics and support would want to look at the Quadro line.

nVidia hasn't offered a Quadro for the Mac since the K5000, and I don't see anything from nVidia that backs up your claim that their drivers for official Mac retail products are beta.

nVidia's Maxwell and Pascal drivers, which enable these PC cards to work under OS X, have never been a professional offering, nor has there ever been any sort of claim of support, on any level, from nVidia.

While I'm less than thrilled with the "professional" support in the drivers for my Quadro card under macOS, nVidia never made any claims that it would have parity with the Windows or Linux drivers and support.

I don't really understand why macOS is such a dog-slow pig when it comes to some professional features of the Quadro line, but AMD's FirePro D300/D500/D700 isn't any faster under macOS at these features. So much for AMD's "professional"-level drivers for macOS.

While I admit that AMD has a decided edge in certain "Pro" applications under macOS that use OpenCL, other "Pro" apps use CUDA, and we all know that dog won't hunt on AMD cards.

Anyone using beta drivers with PC hardware under macOS does so fully understanding that nVidia makes no claims that any software or hardware combination used would receive any real level of support from nVidia.

I didn't see anyone here claiming they would offer professional-level support for any nVidia product. MVC does offer some level of support for his flashed nVidia products, but anyone who buys from him should understand that he has no official relationship with nVidia and no access to the driver codebase, and thus is unable to fix bugs in nVidia's drivers or work with Apple to better optimize support for features like, but not limited to, OpenCL.
 
Because it's a fact. You guys are still going to a Kepler driver download page and then, with blinders on, saying that you are not. LOL.


Answer this, is Maxwell out of beta yet after 2+ years and if so where is the Maxwell driver for Mac listing and official products?

Remember, you are impacting professional workflows with your recommendations. This is not a gamer forum. If professionals take your advice and then find slow performance and compute bugs they really should make an example of you.

Please show me where I told people to go out and buy a 1080 Ti. I don't think I've ever recommended that; in fact, my consistent message has been "you are likely going to be CPU limited in all games unless you're at 4K resolution", which sounds more like a reason not to buy one. Someone posted that WoW got no faster, you posted your usual drivel about how the drivers are a beta Kepler compatibility mode or whatever, and so I responded that WoW will be CPU limited and it has nothing to do with the drivers or GPU.

There are plenty of "pro" usage cases that could benefit from a 1080 Ti:

http://barefeats.com/cmp_pascal.html

because those are actually GPU limited, even in a cMP. Again, to be clear since apparently you have trouble comprehending these things, I'm not actually suggesting people go out and buy such a card. There are no official products and the drivers aren't out of beta because there's no official product that Apple is selling that can use these cards. So, if you're willing to live with beta drivers, by all means use these cards if they're a good fit for the applications you care about. Personally, I've been using a GeForce TITAN X (GM200) and now a GTX 1080 (GP104) without any major issues and no bugs in the games I play, so it's working well for me. What are the alternatives? AMD has no cards that can compete with these high-end GPUs.
I don't really understand why macOS is such a dog-slow pig when it comes to some professional features of the Quadro line, but AMD's FirePro D300/D500/D700 isn't any faster under macOS at these features. So much for AMD's "professional"-level drivers for macOS.

For reference, this is 100% Apple's fault. On macOS, they control the OpenGL framework, and the drivers plug into that at a lower level. On Windows and Linux, the drivers provide the OpenGL API themselves, which means the drivers are in full control of the entire stack. NVIDIA in particular has always had extremely good OpenGL drivers on Windows and Linux, so it's always been frustrating that Apple wouldn't just let them ship the same driver as they do on the other platforms (like they used to back in the OS 9 days).

As more apps transition to Metal, this problem of Apple's OpenGL implementation having a massive amount of CPU overhead will go away, and we'll be more likely to find cases that actually stress the GPU. It's taking a long time for this to happen of course.
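
As an aside, it's easy to check which GPUs macOS will actually hand out as Metal devices. Here's a small sketch using PyObjC's Metal bindings; the pyobjc-framework-Metal package is an assumption on my part, and the same information shows up in System Information as well:

```python
#!/usr/bin/env python3
"""List the GPUs macOS is willing to expose as Metal devices.

Uses PyObjC's Metal bindings -- assumption: `pip install pyobjc-framework-Metal`
has been run. If a card shows up here, Metal apps can at least create a device
on it; how well the driver then performs is a separate question.
"""
import Metal  # provided by pyobjc-framework-Metal

def main() -> None:
    devices = Metal.MTLCopyAllDevices()
    if not devices:
        print("No Metal devices reported by the OS.")
        return
    for dev in devices:
        print(dev.name())

if __name__ == "__main__":
    main()
```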
 
For reference, this is 100% Apple's fault. On macOS, they control the OpenGL framework, and the drivers plug into that at a lower level. On Windows and Linux, the drivers provide the OpenGL API themselves, which means the drivers are in full control of the entire stack. NVIDIA in particular has always had extremely good OpenGL drivers on Windows and Linux, so it's always been frustrating that Apple wouldn't just let them ship the same driver as they do on the other platforms (like they used to back in the OS 9 days).

As more apps transition to Metal, this problem of Apple's OpenGL implementation having a massive amount of CPU overhead will go away, and we'll be more likely to find cases that actually stress the GPU. It's taking a long time for this to happen of course.

Thanks for that. I often wanted to accuse Apple of being the source of the trouble with the Quadro/FireGL lack of OpenGL performance under macOS, but never had any real ammunition for the argument.

I can understand Apple's perspective to some extent: most of the applications that can take advantage of the Quadro/FirePro feature set don't run on macOS. But if developers are going to port these apps, Apple is going to have to show that it can offer "Pro"-level performance on Quadro/FirePro hardware first.
 
Personally, I've been using a GeForce TITAN X (GM200) and now a GTX 1080 (GP104) without any major issues and no bugs in the games I play, so it's working well for me.

What is your experience with this? Do you feel a performance difference between these cards?

I've been contemplating a switch from my GTX Titan X (Maxwell), but am awaiting input from folks who've made the switch to a 1080 or 1080 Ti. I'm primarily interested in its compute rendering performance in Adobe CC (primarily Premiere Pro and Media Encoder), and the occasional game of League of Legends (which plays mostly at 60 fps with all settings on max with my Titan X on my DCI 4K monitor... although fps currently drop to around 40 in any intense battle scene).
 
[QUOTE="SoyCapitanSoyCapitan, post: 24520183, member: 976766"This is not a gamer forum.[/QUOTE]

One more thing: this is a forum discussing the Mac Pro. People use Mac Pros for all kinds of things, including playing games. So, I disagree with your assessment that this forum is for discussing professional topics only. This thread in particular is talking about the web drivers for Pascal, which again, people will be using to do many kinds of things including playing games.
 
GTX 1080 is working great on my machine except for this: In Google Chrome, YouTube videos look really washed out. It's really bizarre. Anyone else experiencing this? Anyone know of a fix?

EDIT: Here's a screenshot. Left is Chrome, right is Safari. It's subtle, but it's still really annoying. Chrome looks more milky.
osSZhcL.jpg
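
One thing I still want to try is relaunching Chrome forced to sRGB to rule out its color management. I'm assuming --force-color-profile is the right switch (the equivalent setting also lives in chrome://flags), so double-check that before relying on it:

```python
#!/usr/bin/env python3
"""Relaunch Chrome forced to sRGB, to test whether its color management is
what makes YouTube look washed out next to Safari.

Assumptions: Chrome is at the default /Applications path and honours the
--force-color-profile switch (the same setting appears in chrome://flags).
Quit Chrome first, otherwise the running instance ignores the flag.
"""
import subprocess

CHROME = "/Applications/Google Chrome.app/Contents/MacOS/Google Chrome"

subprocess.Popen([CHROME, "--force-color-profile=srgb"])
```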
 