Thank you, Synchro3, here's my iStat readout - does this read okay please? Never used this app or read any of these stats before so it's all a bit over my head. All this for a GPU renderer!

PCIe Boost A and PCIe Boost B are the numbers you should pay attention to. They should be quite evenly distributed.

And anything below 6A is within the official max.

Anything below 8A should work.

If it shows a constant 7.99A, the actual power draw may be above 95W on each mini 6-pin (7.99A is the maximum iStat will display).

With the PowerLink, you should be able to run LuxMark, FurMark, Unigine Valley, the CUDA-Z heavy load test, etc. without any issue.
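For anyone unsure how to read these numbers, here is a minimal Python sketch of the arithmetic behind the advice above. It assumes the boost rails are 12V and that 75W is the official per-connector limit for a mini 6-pin; the function names and the 8A rule of thumb are just illustrations, not an official tool:

```python
# Sketch: convert iStat's PCIe Boost amp readings to watts, assuming the
# 12V rail that the Mac Pro's mini 6-pin connectors supply.
RAIL_VOLTAGE = 12.0          # volts (assumption: boost rails are 12V)
OFFICIAL_LIMIT_W = 75.0      # official max per mini 6-pin connector
ISTAT_DISPLAY_CAP_A = 7.99   # iStat's maximum displayed current

def boost_watts(amps: float) -> float:
    """Approximate power draw on one boost rail from its current reading."""
    return amps * RAIL_VOLTAGE

def check_boost(amps: float) -> str:
    watts = boost_watts(amps)
    if amps >= ISTAT_DISPLAY_CAP_A:
        # The readout is pegged, so the real draw could be anything above this.
        return f"reading pegged at {amps}A: actual draw is {watts:.0f}W or more"
    if watts <= OFFICIAL_LIMIT_W:
        return f"{watts:.0f}W: within the official {OFFICIAL_LIMIT_W:.0f}W limit"
    return f"{watts:.0f}W: over spec, but usually still workable below 8A"

print(check_boost(5.5))   # comfortably within the official max
print(check_boost(7.2))   # over the 75W spec
print(check_boost(7.99))  # pegged: real draw may exceed ~96W
```

At 12V, the 6.25A mark corresponds exactly to the 75W official limit, which is why readings below 6A are comfortably in spec.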
 
Thank you h9826790, I just checked PCIe Boost A and PCIe Boost B and they're both almost identical now. Great.

I'm working on some renders today in Redshift so fingers crossed for a stable system. Honestly, this would be the best Christmas present for me if I can run this Mac without crashing!
 
Found this, this morning:

[Attached screenshot: TinyGrab Screen Shot 1-3-19, 9.16.38 AM.png]


Don't know if it's spam or the real thing. Probably the former... dunno.

Lou
 
Shouldn't be a surprise. We know from previous experience that to get approval Nvidia has to meet Apple's Metal spec and guidelines. The most important feature Mac users are waiting for is eGPU support without using hack scripts.

Know from previous what, exactly? Where is NVIDIA not following the Metal spec or guidelines?
 

I have two mini 6-pin to 6-pin cables connected to a 6-pin to 8+6-pin adapter on my Gigabyte R9 280X. iStat shows a power draw of 7.2A on Boost B only and 0A on Boost A, but a current clamp across both cables coming from the mini ports shows 3.6A from each boost.
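As a sketch of the sanity check being described here (illustrative Python only, not anything iStat provides; the 10% tolerance is an arbitrary assumption), you can compare the software readout against the clamp measurements by checking whether the two cables carry roughly equal current:

```python
# Sketch: sanity-check boost readings, assuming current should split
# roughly evenly between the two mini 6-pin cables. The 10% tolerance
# here is an arbitrary illustrative choice.
def split_is_even(cable_a_amps: float, cable_b_amps: float,
                  tolerance: float = 0.10) -> bool:
    """True if the two cables carry roughly the same current."""
    total = cable_a_amps + cable_b_amps
    if total == 0:
        return True
    return abs(cable_a_amps - cable_b_amps) / total <= tolerance

# iStat reported 7.2A on Boost B and 0A on Boost A...
print(split_is_even(0.0, 7.2))   # badly skewed as reported
# ...but the clamp showed 3.6A on each cable:
print(split_is_even(3.6, 3.6))   # evenly split in reality
```

The totals agree (7.2A either way), which suggests the lopsided iStat readout is a sensor/reporting quirk rather than a real wiring imbalance.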
 

Any screen capture?

Anyway, please avoid non-Nvidia issues in this thread.

We can continue the discussion in one of those 280X threads.

Or you can start one specifically about the 280X power distribution issue.
 
Know from previous what, exactly? Where is NVIDIA not following the Metal spec or guidelines?

I think he means full support for Metal 2, which Soy and h982670 have mentioned many times.

Here's the Metal spec sheet.

https://developer.apple.com/metal/Metal-Feature-Set-Tables.pdf

If you look at the two columns on the right under Mojave, you will see Family 1 and Family 2. These are the Metal revisions; Family 2 GPUs support more Metal features than Family 1.

Radeon Vega GPUs in Mojave have Family 2_v1 enabled.

If Nvidia hasn't been able to meet the Family 2_v1 spec because of hardware limitations, then they will remain on Family 1_v3 and 1_v4. That would be a problem for developers and could be buggy, and even more of a problem if Nvidia partners aren't going to offer tech support to Mac users.

I have attached a random collection of screenies. Sad that Nvidia doesn't even support 10-bit color output.


On a related subject, there are two class-action lawsuits against Nvidia related to the company being dishonest about its supply chain and demand. The lawsuits will be looking at whether Nvidia used sock puppets to lie and encourage people to buy GPUs for mining or hackintoshes. Apparently this side of the demand caused a mess and hurt investors.
 

[Attachments: imageproxy.php.jpeg, post-1362934-0-62438600-1510061881.png, TinyGrab Screen Shot 9-29-18, 10.33.33 AM.png, MacBookPro-5.png]

Where has Apple (or NVIDIA) publicly stated that the reason there aren't Mojave web drivers is because NVIDIA hasn't implemented Metal 2 yet? If you are speculating that this is the reason, then great, please stop posting as if this is a well-known fact. Note that the drivers Apple ships for NVIDIA Kepler GPUs also don't support Metal 2, but Apple seems more than happy to ship those with their OS.

The whole point of advertising different feature levels is so applications know what the system supports. Applications have been targeting Metal 1_v3 and 1_v4 for quite some time. Why do you suddenly think this would be a problem? If an application decides it only wants to run on Family 2_v1, then it simply wouldn't launch on any NVIDIA GPU. If their hardware (especially older Maxwell and possibly Pascal) hardware doesn't support a feature in Family 2_v1, then I don't understand why this is a problem for anyone (Apple, application developers, etc). The hardware simply doesn't support the feature set, and thus why would anyone expect that the driver would advertise it? This is the same for the 6+ year old Kepler hardware from NVIDIA, it's not like they can wave a magic wand and add features to hardware that is 4 or 5 generations old.
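The feature-level logic described above can be sketched in a few lines of Python. The family names come from Apple's Metal feature set tables; the ordering assumption, that each later set is a superset of the earlier ones on macOS, matches how the tables are laid out, but treat this as an illustration rather than the actual Metal API:

```python
# Metal feature sets for macOS around Mojave, ordered least to most capable.
# Assumption: each later set is a superset of the earlier ones.
FAMILIES = ["macOS_GPUFamily1_v3", "macOS_GPUFamily1_v4", "macOS_GPUFamily2_v1"]

def can_run(app_requires: str, gpu_supports: str) -> bool:
    """An app launches only if the GPU's best feature set is at least
    as capable as the one the app targets."""
    return FAMILIES.index(gpu_supports) >= FAMILIES.index(app_requires)

# An app targeting Family 2_v1 simply won't run on a Family 1-only GPU:
print(can_run("macOS_GPUFamily2_v1", "macOS_GPUFamily1_v4"))  # False
# Apps targeting Family 1 run fine on Family 2 hardware:
print(can_run("macOS_GPUFamily1_v3", "macOS_GPUFamily2_v1"))  # True
```

In real Metal code the equivalent check is the `MTLDevice` feature set query, which is exactly how an app decides whether to launch or fall back.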

NVIDIA supports 10-bit color with their professional Quadro lineup.
 

NVIDIA supports 10-bit color with their professional Quadro lineup.

We are allowed to speculate about why there is a delay and why Nvidia needs Apple’s help. Why so defensive? I'm a shareholder and I don't get defensive like that.

Kepler won't support apps that use 2_v1 features. It's an old architecture that also doesn't support DirectX 12 on Windows. We are talking about new (and very expensive) cards, and they shouldn't need to target Metal 1. Hopefully they can still meet the Metal 2 spec with Apple's help.

We should not need to spend more than $2K to get 10-bit color from Nvidia. That’s indefensible. They could enable it on their mainstream cards.

The BIG issue is that Nvidia needs to list which cards it officially supports and end this perpetual beta, which you are trying to deny. I'm a shareholder; I need to stay educated about the companies I invest in. Those drivers are still not out of beta for Maxwell, Pascal, Turing, and Volta, and no board partners have been told they can state 'macOS support' on the retail box or been given the green light for Mac editions (if needed).

When Nvidia tells its board partners they can print 'macOS support' on the retail box, and the driver download page mentions the 9, 10, and 20 series, then it's out of beta. Then Nvidia and its board partners have to provide tech support.

That's good business for them. They will sell more GPUs, especially if there is plug-and-play GPU support and 10-bit color.
 

It would be really good business for Apple and Nvidia to resolve this finally. Apple should take control of the driver situation and roll it into macOS.

Nvidia really does look like they have a couple of online shills on some macOS-related forums. There were very active shills on InsanelyMac and crypto mining forums. You can recognize them because they won't be pro users, they won't be gamers, they will never discover or complain about bugs, they try to control discussions, and they try to upsell you to more expensive cards. If they try to upsell you to a flashed card or a Quadro, that's a red flag.
 

I'm not sure why you think I'm being defensive? I've politely asked that if you are posting speculation, particularly in this thread, that you clearly indicate that it is your theory or opinion rather than an obviously well-known fact. I really don't think that's too much to ask, because it avoids a lot of confusion. Maybe it's just me, because when I see these things posted as fact I assume that I've missed some news because I just don't follow these things as closely as I used to.

I haven't looked at the Metal 2 specs/features in detail, but I would imagine that Pascal+ could support it. That is, of course, assuming that Apple has provided all the necessary information to NVIDIA (internal frameworks and kexts, documentation, and so on). Nobody outside of Apple and NVIDIA knows the real reason why there hasn't been a Mojave web driver, and I've posted my theories in the past.

If you want 10-bit color on cheap consumer hardware, buy AMD. I've been saying this for years. NVIDIA believes this is a premium feature, and thus includes it only with their professional Quadro line. If you don't like that, buy an AMD card instead.

When have I ever tried to deny the beta status (at best) of the Maxwell+ GPUs? 6 years ago in the first version of my FAQ post, I even included information about how the cards should work but that they are not officially supported. I've been saying that ever since, even when NVIDIA officially recognized that PC cards would work on macOS with their web driver but that it was beta support. It sounds like you're asking for them to be officially supported like they are on Windows and Linux, and quite frankly I just don't see that happening anymore. One might assume NVIDIA was working towards having every card support the cMP and have a boot screen given the presence of such a UEFI on the Turing cards, which would be an important step towards all cards being officially supported under macOS. However, the lack of a Mojave web driver suggests that this is either no longer the case, or that we are at the very least a long way off from such a situation (perhaps because Apple doesn't want this). If you don't like this, then just go and buy an AMD card, which is very clearly the message we are getting from Apple right now in my opinion.

I also really don't think it's as simple as saying "oh we should just end the beta status, and provide full customer support on macOS". I would imagine there's a huge difference in the number of support staff NVIDIA has for Windows versus macOS. They'd either have to train or hire a ton of people to provide the same level of support as Windows. They'd likely have to include Hackintosh support, which opens up a whole new can of worms. It's been pretty clear to me that NVIDIA simply does not want to do this. For a long time, up until several years ago at least, you could basically just buy any Maxwell card and it'd just work. Then, Apple started revving their OS build numbers every couple of weeks with security updates, which meant you had to start updating the web driver more than once a quarter. It was around this time the relationship between the two companies started to go downhill, and the macOS driver quality has gotten worse and worse. My take on this has always been that NVIDIA simply does not care as much about macOS as it used to, and is devoting fewer resources to supporting it.

As I've been saying clearly for many years now, the end result is pretty simple: don't buy an NVIDIA GPU for use under macOS. Personally, games were always my focus and I've just switched to Windows 10 on my Hackintosh and haven't even booted macOS in a few years now.
It would be really good business for Apple and Nvidia to resolve this finally. Apple should take control of the driver situation and roll it into macOS.

Why do you think this? Apple already has a driver in macOS that supports the NVIDIA GPUs that Apple cares about (i.e. Kepler). I can't imagine Apple actually wants to make it easy for people to keep using their nearly-decade-old cMP machines, they would much rather sell them a new iMac Pro or perhaps the new modular Mac Pro when it comes out. They already have an eGPU solution that involves AMD cards. Why exactly does Apple care about NVIDIA GPUs at this point? Sure, there are vocal groups of people who post on internet forums like this one, but I'd imagine that's a tiny fraction of Apple's customer base.
 

I’m not going to read your giant empty paragraphs. It’s word soup. You have been playing all kinds of mental gymnastics games with people like me for 4-5 years. We report bugs, you deny them, we show you them, you brush them under the carpet, you kept recommending certain upgrades to pro users that were not suitable for them.

I have provided 4-5 years’ worth of real-world results, benchmarks, and facts. I have saved people from spending money they didn’t need to. I did all the upgrades for everyone so they could make an informed decision.
 

When have I denied bugs? I've repeatedly asked for a list of known issues with the most recent drivers, because many of the bugs you keep talking about have been fixed for a long time (or at least this was the case a few years ago, but you keep bringing these bugs up). If there are still outstanding bugs, then I'd actually like to document them as a FAQ in the first post of this thread, but you keep refusing to provide more information.

When have I recommended certain upgrades to pro users, other than telling everyone to just buy an AMD card for the last 3 or more years? And, before that, to just pick the card that works best for your usage case? For me, since I play games, that meant NVIDIA cards worked better in general, so that's what I bought for myself. For "pro" customers who want to buy cheap consumer-level cards for "pro" work, I've been saying you should buy AMD for as long as I can remember. AMD cards just seem to work better in macOS these days, especially for Apple apps like Final Cut Pro.

Anyway, I don't even use macOS myself anymore and am very happy playing games under Windows 10. It's kind of silly how much better everything runs under Windows, so I have no plans to ever return to macOS (especially given the current state of things between NVIDIA and Apple). I really don't think I've been unreasonable in my requests to you, but I guess we can just agree to disagree. At the end of the day I think we basically have the same position, which is that people shouldn't buy NVIDIA cards to run under macOS, and I don't really understand why you have to make this discussion so difficult.
 
urhm . . . lately, recent posts are missing?

. . . who is editing this Thread, and why does it ([seem-to-only] resolve (since 2019.01.06)) just the parle between Ag and Soy?!?

Is my browser history (iPhone/HS Safari) borked?

Regards, splifingate
 
There are plenty of active threads discussing AMD's recent GPUs that are more appropriate places to discuss those topics. This thread is really for people who need help getting their NVIDIA GPUs to work.
 
  • Like
Reactions: TheStork
This thread is really for people who need help getting their NVIDIA GPUs to work.
That would be me. *sigh*

What's the latest? Do I remember correctly that apparently NVidia is working with Apple on fulfilling the Metal 2 specs on Mojave? Or do I just go and get an AMD card? I quite seriously hoped to see the Mojave drivers on the 2nd of Jan.
 

You're waiting just like everyone else. Not entirely shocked it wasn't released on Jan 1 or 2. If some of the info posted above is to be believed, it seems as though Apple only recently shared some of the necessary OS-level requirements with NVIDIA. Does that mean they just started working on the driver, or that they just started working on a compliant driver? Time will tell...

If you NEED to be on Mojave, then you'll need an AMD card (like Sapphire Pulse RX580).

If you can stay on High Sierra (a viable option for many), your system and GPU will likely continue to work as they have with the NVIDIA web drivers.

The "Metal 2 spec" theory is just a floating conspiracy out there. The GTX 680 is a GPU officially supported for Mojave by Apple directly (without CUDA support or web drivers), but it does not support Metal 2.
 
The GTX 680 is an officially supported GPU for Mojave by Apple directly (without CUDA support or Web Drivers), but the GPU does not support Metal 2.
True, but the cMP is now vintage... so who knows.

My guess is there will be a new kid on the block this year, because the nMP will be vintage next year... or they will kill the whole line...
 

Just wondering, have they ever put a Mac on the vintage list while the model was still under AppleCare?

Apple is still selling the nMP (and AppleCare) for this model now.

If Apple puts the nMP on the vintage list next year, that means most places in the world will no longer receive hardware support for the nMP, even though the nMP itself may still be under AppleCare.
 
Maybe they put it on the vintage list when they stop selling it as new (assuming they stopped selling the cMP in 2013). Or do they just go by the year of manufacture?
 