
bsbeamer

macrumors 601
Sep 19, 2012
4,313
2,713
At worst, maybe the RTX series cards will eventually work with macOS via eGPU (when drivers are fully available) without the enable hacks? I have not checked in on eGPU since Mojave, but last I knew, TB1/TB2 and NVIDIA cards required hacks to enable, and AMD cards were the only officially supported option.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
At worst, maybe the RTX series cards will eventually work with macOS via eGPU (when drivers are fully available) without the enable hacks? I have not checked in on eGPU since Mojave, but last I knew, TB1/TB2 and NVIDIA cards required hacks to enable, and AMD cards were the only officially supported option.

I'm not sure eGPU supports boot screen at all (Macs are pretty picky about which Thunderbolt devices they'll reach out to pre-boot for security reasons), but anything on a Thunderbolt 3 Mac would be GOP, not UGA. So these changes don't apply to Thunderbolt 3 Macs.
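
For anyone unfamiliar with the split: pre-boot code looks for the newer Graphics Output Protocol first and only falls back to the old UGA draw protocol. A rough EDK II-style sketch of that standard pattern (this is just the fallback from the UEFI spec, not Apple's actual boot code):

[CODE]
/* Rough sketch (C, EDK II-style UEFI application): locate a graphics console
 * protocol the way generic pre-boot code does. GOP is the UEFI 2.x protocol;
 * UGA draw is the EFI 1.x-era protocol the old Mac Pro firmware talks to.
 * A card whose option ROM only publishes UGA helps on those older Macs but
 * means nothing to a Thunderbolt 3 / UEFI 2.x machine. */
#include <Uefi.h>
#include <Library/UefiLib.h>
#include <Library/UefiBootServicesTableLib.h>
#include <Protocol/GraphicsOutput.h>
#include <Protocol/UgaDraw.h>

EFI_STATUS
EFIAPI
UefiMain (IN EFI_HANDLE ImageHandle, IN EFI_SYSTEM_TABLE *SystemTable)
{
  EFI_GRAPHICS_OUTPUT_PROTOCOL  *Gop = NULL;
  EFI_UGA_DRAW_PROTOCOL         *Uga = NULL;
  EFI_STATUS                    Status;

  /* Preferred path: Graphics Output Protocol (UEFI 2.x, GOP option ROMs). */
  Status = gBS->LocateProtocol (&gEfiGraphicsOutputProtocolGuid, NULL, (VOID **)&Gop);
  if (!EFI_ERROR (Status)) {
    Print (L"GOP console: %dx%d\n",
           Gop->Mode->Info->HorizontalResolution,
           Gop->Mode->Info->VerticalResolution);
    return EFI_SUCCESS;
  }

  /* Fallback: UGA draw (EFI 1.x, what the classic Mac Pro boot picker uses). */
  Status = gBS->LocateProtocol (&gEfiUgaDrawProtocolGuid, NULL, (VOID **)&Uga);
  if (!EFI_ERROR (Status)) {
    UINT32 Width, Height, Depth, Refresh;
    Uga->GetMode (Uga, &Width, &Height, &Depth, &Refresh);
    Print (L"UGA console: %dx%d\n", Width, Height);
    return EFI_SUCCESS;
  }

  Print (L"No graphics console protocol published by this card's option ROM\n");
  return EFI_NOT_FOUND;
}
[/CODE]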

I think eGPU on Nvidia has also always been a driver issue and not a firmware issue. It's just that you can work around the driver issue in firmware. So in theory all previous Nvidia cards could work with eGPU as well.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
UEFI dropped UGA, ...

So I'm trying to figure out what other UGA boxes there are out there that can do an EFI boot of Windows, and I'm not coming up with anything. I don't think Windows requires full UEFI to boot, but even then it's odd timing to suddenly decide to start supporting EFI booting on some really old Windows boxes.

Doesn't have to be Windows. These cards are useful for more than just playing video games. Most of the AI training/inference software foundation is on Linux/Unix, so an older cluster of EFI servers would fit. Those, though, would also be getting close to decommission/vintage/obsolete status. Neither one is a huge chunk of a new market, but if bigger ROMs are cheaper now and the code was relatively inexpensive to add, it won't hurt.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Doesn't have to be Windows. These cards are useful for more than just playing video games. Most of the AI training/inference software foundation is on Linux/Unix, so an older cluster of EFI servers would fit. Those, though, would also be getting close to decommission/vintage/obsolete status. Neither one is a huge chunk of a new market, but if bigger ROMs are cheaper now and the code was relatively inexpensive to add, it won't hurt.

Yeah, I was thinking about that earlier. There could be some really old non-Windows cluster out there that they need UGA for.

It's still cool that the Mac Pro 5,1 doesn't need a "Mac edition" card for boot screens, just UGA support, even though UGA support is pretty rare.
 

bookemdano

macrumors 68000
Jul 29, 2011
1,514
846
Right, but he still would have been able to access the boot screen on the Mac Pro. We know the 2080 doesn't work in macOS without drivers, which we still don't have.

But is he on 138.0.0.0.0? CreatePro said pretty definitively that 138.0.0.0.0 was the key to this.

I guess he would be if he were running Mojave. Still, it is possible to boot Mojave with older firmware, so it might be worth verifying.
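
If anyone wants to double-check a machine without digging through System Information, the Boot ROM string is readable from the I/O Registry. A rough sketch in C, assuming (as on these machines) that the value sits in the "version" property of IODeviceTree:/rom, which is where system_profiler's "Boot ROM Version" comes from:

[CODE]
/* Rough sketch (macOS / C): print the Boot ROM version so you can confirm the
 * box really is on 138.0.0.0.0. Assumes the string lives in the "version"
 * property of IODeviceTree:/rom. Build with:
 *   cc bootrom.c -framework IOKit -framework CoreFoundation   (file name is arbitrary) */
#include <stdio.h>
#include <CoreFoundation/CoreFoundation.h>
#include <IOKit/IOKitLib.h>

int main(void)
{
    io_registry_entry_t rom =
        IORegistryEntryFromPath(kIOMasterPortDefault, "IODeviceTree:/rom");
    if (rom == IO_OBJECT_NULL) {
        fprintf(stderr, "IODeviceTree:/rom not found\n");
        return 1;
    }

    /* The property is raw bytes holding the version string. */
    CFDataRef ver = (CFDataRef)IORegistryEntryCreateCFProperty(
        rom, CFSTR("version"), kCFAllocatorDefault, 0);
    if (ver != NULL) {
        printf("Boot ROM version: %.*s\n",
               (int)CFDataGetLength(ver),
               (const char *)CFDataGetBytePtr(ver));
        CFRelease(ver);
    }
    IOObjectRelease(rom);
    return 0;
}
[/CODE]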
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
...
But this may be Nvidia making a point to Apple more than some grand alliance. Again, it’s extremely unlikely that a new Mac Pro would use the same boot screen standard as the 5,1. To me this is Nvidia keeping the pro Mac user scene churned up so they can continue trying to leverage Apple.

Polaris and Vega, combined with Mojave and Apple's blessing of those cards, are threatening to completely push them off the Mac.

Not just "scene churned up". Remember that Mojave is the end of the line for OpenGL and OpenCL. Nvidia has their proprietary CUDA realm to protect. Nvidia will have to get fully on board with Metal 2 (and follow ons) but these new GTX card bring in a whole another chunk of proprietary infrastructure.

Apple nudging people toward their "official Mojave card" list means incrementally less people on proprietary GPU stack. ( yes there are hacks to enable what is not on the official list, but some folks will move with the list. )

Tactically Apple could be making a blunder here. In the move off OpenGL/CL Apple is shifting people to look into revising their code. For some aspects developers will have Metal (Apple) versus CUDA (Nvidia) and no open standards option. That "churn" is primarily being driven by Apple. They have injected an inflection point for development for a number of software code bases.

If the Mac Pro community is all they have left, and they loose that, well then they have no leverage left with Apple.

I don't think that is all they have left but it is the subset community which is the deepest into the CUDA tarpit. It is in Nvidia's best interest to keep entrenched customers just as deeply in that state as they are now. As stated above, GTX is an opportunity to pull them even deeper into lock-in. That they can either stay on 5,1 or move (Win/Linux) basically lets them bite down even harder on the bait.

It would be kind of goofy to try to leverage Apple though. Nvidia could present themselves as an option (dependable component supplier ... it is also helpful in retention for Apple also. ). Playing some "deal with us or else" card with Apple is more the dubious. Even if Nvidia doesn't get into the next Mac Pro as a standard option if they can get into the empty slot of that Macs have access to that is a win. ( new Mac Pro with an empty standard slot and/or external PCI-e card enclosures (boxy eGPUs ) empty slots . ) There are multiple vectors here that simply just the Mac Pro path.
 

thornslack

macrumors 6502
Nov 16, 2013
410
165
I mean, NVIDIA makes the best GPUs. If Apple makes a pro machine that totally ignores that... well, it would be a little absurd.
 

Ludacrisvp

macrumors 6502a
May 14, 2008
797
363
I'll just hope for an affordable version of this card, $200 or less, like a 2050 or something, that can run 4+ displays and doesn't require SSE4.2.
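
Side note for anyone unsure whether a given cMP even has SSE4.2: the 3,1's Harpertown Xeons top out at SSE4.1, while the 4,1/5,1 Nehalem/Westmere chips do have 4.2. A quick check in C using the GCC/Clang CPUID helpers:

[CODE]
/* Tiny SSE4.2 check. Build and run with:  cc sse42.c && ./a.out  (any file name works) */
#include <stdio.h>
#include <cpuid.h>   /* GCC/Clang header exposing the CPUID intrinsic */

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        printf("CPUID leaf 1 not available\n");
        return 1;
    }
    /* SSE4.2 is bit 20 of ECX for CPUID leaf 1; bit_SSE4_2 comes from cpuid.h. */
    printf("SSE4.2: %s\n", (ecx & bit_SSE4_2) ? "yes" : "no");
    return 0;
}
[/CODE]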
 
Jul 4, 2015
4,487
2,551
Paris
I mean, NVIDIA makes the best GPUs. If Apple makes a pro machine that totally ignores that... well, it would be a little absurd.

No 10 bit color output. No HEVC decode in web drivers. OpenCL bugs in web drivers. 1080 was beaten in some OpenCL tasks by Vega.

Even cheap Radeon cards do 10 bit color output.
 

thornslack

macrumors 6502
Nov 16, 2013
410
165
No 10 bit color output. No HEVC decode in web drivers. OpenCL bugs in web drivers. 1080 was beaten in some OpenCL tasks by Vega.

Even cheap Radeon cards do 10 bit color output.

I would put those deficiencies on Apple more than NVIDIA. All of that stuff works in Windows. It's fairly amazing the green team has offered the continuing support they do, considering their last Apple-blessed product was the GTX 680.
 

Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
There is still no 10-bit output on GeForce on Win or Mac. This is the issue that remains, and it's a deal breaker for pros working with color. I'd skip the ridiculous 20-series prices and wait for 7nm Navi if you're thinking of upgrading that old cMP.

I'm no fanboi so that's my unbiased opinion, and I sold my AMD stock at $32. I still own NVDA stock.

Why are pros looking to use consumer gaming cards for their work? I don't understand. NVIDIA has a professional line of graphics cards that enable 10-bit output, at least under Windows.

As always, if you have a specific usage case and there is a brand of GPUs that works best for that usage case, then you should buy a GPU from that brand. If your usage case is consumer gaming cards for professional work that requires 10-bit color on macOS in a nearly decade-old computer, then you should absolutely buy an AMD GPU.
I think eGPU on Nvidia has also always been a driver issue and not a firmware issue. It's just that you can work around the driver issue in firmware. So in theory all previous Nvidia cards could work with eGPU as well.

My understanding is that the final switch is in an Apple-provided framework, i.e. last time I checked Apple was simply not allowing eGPU to work on an NVIDIA GPU. You can hack that framework to allow the driver to load. It has been a while since I looked into this, so perhaps that has changed.
 
Jul 4, 2015
4,487
2,551
Paris
I would put those deficiencies on Apple more than NVIDIA. All of that stuff works in Windows. It's fairly amazing the green team has offered the continuing support they do, considering their last Apple-blessed product was the GTX 680.

No 10 bit color with GeForce on Windows. Said this hundreds of times over three years on here.

Don't blame Apple for those OpenCL bugs in the web drivers. We documented, on this forum and on Tony86, exactly when they started happening. Radeons don't have these issues.
 

wreck_it_olaf

macrumors newbie
Oct 18, 2018
5
3
I've never heard of consumer PC boards using UGA, but some very old HP servers do use it.

I think the most plausible explanation is that the future Mac Pro or a Mac eGPU is a target platform for the RTX series, and they're doing their development on an existing Mac Pro since that is the last, best PCIe platform Apple has to offer.

Edit: I got this wrong, the cards were not flashed with an HP utility. I talked to the person who gave me the cards originally and they set me straight.
 

bsbeamer

macrumors 601
Sep 19, 2012
4,313
2,713
I think the most plausible explanation is that the future Mac Pro or a Mac eGPU is a target platform for the RTX series, and they're doing their development on an existing Mac Pro since that is the last, best PCIe platform Apple has to offer.

If developing for Mac eGPU usage, they would likely be testing with TB3 expansion units. There is market demand for non-AMD solutions here. Apparently the Blackmagic eGPU, which combines an RX 580 with a docking-station sort of setup, is a very popular product. An NVIDIA version of this would be a pipe dream for many MBP and iMac/iMac Pro users, especially those in post-production houses. Getting Adobe on board with an NVIDIA eGPU solution would also be an idea, since it seems they are working slightly more closely together these days.

If the MP7,1 includes PCIe slots of some kind, there is no (valid) reason GPU vendors should not make every effort on their end for off-the-shelf cards to be compatible. Getting an MP7,1 tower with slots (and internal power) to drive an RTX 2080 (FE or FE Ti) would be the best-case scenario for many. It would even make more powerful AMD GPUs an option. Not sure we'll see more than 2-3 PCIe slots, but an expansion chassis of some kind should likely be possible, which then opens the door to machine learning applications driven by multiple GPUs.

Expectations are high and Apple should know this. They could do a lot to explain their vision at this upcoming NYC event and get pros (back) on their side. At least offering NVIDIA options would satisfy many professionals. Guess we'll see whether they decide to be more transparent or not. I've already been doing 2019 Q2/Q3 budget planning for some clients, and something with the price tag that will accompany the MP7,1 will require advance budget planning. It would be best for Apple to get on that in some fashion at this event if they are targeting a Q1/Q2 2019 release...
 

Yahooligan

macrumors 6502a
Aug 7, 2011
965
114
Illinois
No 10 bit color with GeForce on Windows. Said this hundreds of times over three years on here.

Nothing like beating a dead horse. Kind of like how you've been told that if you want 10-bit support from NVIDIA then you need to run the workstation (Quadro) cards, not the consumer gaming cards. How many times do you need to be told that in order to stop your whining about no 10-bit support with GeForce cards? If you're a professional who needs 10-bit support then either use the Quadro cards or use those "cheap Radeon" cards. If you don't want to do either of those because you enjoy complaining about the same thing over and over, then don't expect to be taken seriously. There are solutions; you just seem more interested in complaining.
 

LightBulbFun

macrumors 68030
Nov 17, 2013
2,900
3,195
London UK
HP had a utility to flash supported cards to work with their Itanium systems, and those cards would show boot screens on a Mac Pro but were otherwise unusable. I tried an Itanium-firmware FireMV 2250 and one other ATI card, but there were utilities for both ATI and Nvidia.

Do you have links to these utilities, or any more info on it all?

I'm quite interested, since you say the Itanium cards gave boot screens on a MP, which means it must have been an EBC firmware setup just like the ATI/AMD Mac cards were (if it had been IA64 code it would not have worked in an AMD64 MP).

FireMV 2250s are pretty cheap on eBay, so it would be fun to get one, flash it, and play with it some :) Also, do you recall which NVIDIA cards were UGA-flashable?

If all the MP5,1 needs to display graphics output is a UGA-compatible card, and it does not need any IOReg/Apple-specific properties, then Apple should very much be able to implement a GOP driver and get graphics output from a Radeon RX card.

All in all, some pretty interesting developments :)

(Also, yeah, the MV2250 is an RV516 card and the drivers could not auto-init that, so even with boot screens it would not have had full functionality in OS X if the card had just UGA and no IOReg properties. Luckily, these days the drivers can init a card without IOReg properties :) )
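
For anyone who does end up with a ROM dump, you don't have to guess what is inside it: the PCI firmware spec headers carry a code type for each image and, for EFI images, a machine type (0x0EBC = EBC, 0x8664 = x64, 0x014C = IA32, 0x0200 = IA64). A rough parser sketch, offsets taken straight from the spec, error handling kept minimal:

[CODE]
/* Rough sketch: walk the images inside a PCI option ROM dump and report what
 * each one is, so you can tell a legacy VBIOS from an EFI driver and see which
 * machine type (EBC, x64, IA32, IA64) an EFI image targets. Offsets are from
 * the PCI Firmware / UEFI specs. Usage: ./romscan dump.rom */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>

static uint16_t rd16(const uint8_t *p) { return (uint16_t)(p[0] | (p[1] << 8)); }

int main(int argc, char **argv)
{
    if (argc != 2) { fprintf(stderr, "usage: %s rom-file\n", argv[0]); return 1; }

    FILE *f = fopen(argv[1], "rb");
    if (f == NULL) { perror("open"); return 1; }
    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    rewind(f);
    uint8_t *rom = malloc((size_t)size);
    if (rom == NULL || fread(rom, 1, (size_t)size, f) != (size_t)size) return 1;
    fclose(f);

    long off = 0;
    while (off + 0x1A <= size && rd16(rom + off) == 0xAA55) {
        const uint8_t *img     = rom + off;
        uint16_t      pcir_off = rd16(img + 0x18);            /* PCIR pointer   */
        if (off + pcir_off + 0x16 > size) break;
        const uint8_t *pcir    = img + pcir_off;
        if (memcmp(pcir, "PCIR", 4) != 0) break;

        uint16_t imglen   = rd16(pcir + 0x10);                 /* 512-byte units */
        uint8_t  codetype = pcir[0x14];                        /* 0 = x86 BIOS, 3 = EFI */
        uint8_t  last     = pcir[0x15] & 0x80;                 /* last-image flag */

        printf("image @ 0x%05lx: vendor %04x device %04x, ",
               off, rd16(pcir + 0x04), rd16(pcir + 0x06));
        if (codetype == 0x03) {
            uint16_t machine = rd16(img + 0x0A);               /* EfiMachineType */
            printf("EFI driver, machine 0x%04x (%s)\n", machine,
                   machine == 0x0EBC ? "EBC"  :
                   machine == 0x8664 ? "x64"  :
                   machine == 0x014C ? "IA32" :
                   machine == 0x0200 ? "IA64" : "other");
        } else {
            printf("legacy image, code type 0x%02x\n", codetype);
        }

        if (last || imglen == 0) break;
        off += (long)imglen * 512;
    }
    free(rom);
    return 0;
}
[/CODE]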
 

wreck_it_olaf

macrumors newbie
Oct 18, 2018
5
3
I would have to look; my recollection is that the application had to be run on an Itanium, and the only Itanium I have access to is the one I kept as a paperweight.

You could probably parse some device IDs out of the "UGA Console Driver", but the HP Integrity drivers don't appear to be publicly available.

I need to emphasize, however, that this was completely unusable (and the FireMV is Radeon X1600 vintage); I was just corroborating that HP did use UGA.
 

LightBulbFun

macrumors 68030
Nov 17, 2013
2,900
3,195
London UK
I would have to look; my recollection is that the application had to be run on an Itanium, and the only Itanium I have access to is the one I kept as a paperweight.

You could probably parse some device IDs out of the "UGA Console Driver", but the HP Integrity drivers don't appear to be publicly available.

I need to emphasize, however, that this was completely unusable (and the FireMV is Radeon X1600 vintage); I was just corroborating that HP did use UGA.

If the utility is a VBIOS flasher then I can just extract the VBIOS out of it and flash it manually via various means, so I ask again: is this utility available somewhere? (I even have an external SPI EEPROM programmer should I have to brute-force something.)

Or, secondly, do you know the HP part number for the Itanium-specific video card? (If I can't flash one, I could try to just find an Itanium one directly :) )

I don't care about it being useless; if you say it gives boot screens then it gives me something to play and hack about with :) (Plus, I do have some ideas for how I'd get one functioning properly in Mac OS X if I were so inclined, although being an RV516 card there are no drivers in 10.8 or later.)
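
(For reference, if a card ends up in a Linux box first, the easiest way I know to pull its ROM image without any vendor tool is the kernel's sysfs interface; a rough sketch below, with 0000:01:00.0 as a placeholder for whatever lspci reports for the card. Needs root.)

[CODE]
/* Rough sketch (Linux / C): dump a card's PCI expansion ROM (the VBIOS image)
 * through sysfs. 0000:01:00.0 is a placeholder; substitute the address that
 * lspci reports for the card. Run as root. */
#include <stdio.h>
#include <stdlib.h>

#define ROM_PATH "/sys/bus/pci/devices/0000:01:00.0/rom"

int main(void)
{
    /* Writing "1" to the rom file tells the kernel to expose the ROM contents. */
    FILE *ctl = fopen(ROM_PATH, "w");
    if (ctl == NULL) { perror("enable rom"); return 1; }
    fputs("1", ctl);
    fclose(ctl);

    FILE *rom = fopen(ROM_PATH, "rb");
    FILE *out = fopen("vbios.rom", "wb");
    if (rom == NULL || out == NULL) { perror("open"); return 1; }

    char buf[4096];
    size_t n, total = 0;
    while ((n = fread(buf, 1, sizeof buf, rom)) > 0) {
        fwrite(buf, 1, n, out);
        total += n;
    }
    printf("Dumped %zu bytes to vbios.rom\n", total);
    fclose(rom);
    fclose(out);

    /* Writing "0" hides the ROM again. */
    ctl = fopen(ROM_PATH, "w");
    if (ctl != NULL) { fputs("0", ctl); fclose(ctl); }
    return 0;
}
[/CODE]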
 

bsbeamer

macrumors 601
Sep 19, 2012
4,313
2,713
Nvidia's product is the RTX series, which in almost all implementations will be a PCIe card.

Nearly all supported eGPU expansion chassis units accept 2-slot PCIe GPUs, like the RTX. TB3 is just the connection protocol to the machine, since most Macs do not have PCIe slots. The missing driver is the only thing really preventing NVIDIA RTX GPUs from properly working with eGPU, with or without a hack. If they are waiting for Apple's approval, this theoretically could be part of the latest web driver...

https://support.apple.com/en-us/HT208544
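
Even with no NVIDIA driver loaded, the card should at least enumerate on the PCIe/Thunderbolt side, which you can verify from the I/O Registry. A rough IOKit sketch, just an illustration and not any kind of supported check (0x10de is NVIDIA's PCI vendor ID):

[CODE]
/* Rough sketch (macOS / C): list NVIDIA PCI devices from the I/O Registry,
 * which shows a card enumerating over Thunderbolt even when no graphics
 * driver attaches to it. Build with:
 *   cc pcicheck.c -framework IOKit -framework CoreFoundation   (file name is arbitrary) */
#include <stdio.h>
#include <CoreFoundation/CoreFoundation.h>
#include <IOKit/IOKitLib.h>

int main(void)
{
    io_iterator_t iter;
    if (IOServiceGetMatchingServices(kIOMasterPortDefault,
                                     IOServiceMatching("IOPCIDevice"),
                                     &iter) != KERN_SUCCESS)
        return 1;

    io_object_t dev;
    while ((dev = IOIteratorNext(iter)) != IO_OBJECT_NULL) {
        CFTypeRef prop = IORegistryEntryCreateCFProperty(dev, CFSTR("vendor-id"),
                                                         kCFAllocatorDefault, 0);
        if (prop != NULL && CFGetTypeID(prop) == CFDataGetTypeID()) {
            const UInt8 *b = CFDataGetBytePtr((CFDataRef)prop);
            UInt16 vendor = (UInt16)(b[0] | (b[1] << 8));   /* little-endian bytes */
            if (vendor == 0x10de) {                          /* NVIDIA */
                io_name_t name;
                IORegistryEntryGetName(dev, name);
                printf("NVIDIA PCI device present: %s\n", name);
            }
        }
        if (prop != NULL)
            CFRelease(prop);
        IOObjectRelease(dev);
    }
    IOObjectRelease(iter);
    return 0;
}
[/CODE]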
 

ActionableMango

macrumors G3
Sep 21, 2010
9,613
6,909
No 10 bit color with GeForce on Windows. Said this hundreds of times over three years on here.

Soy, I do believe you, but I am also confused.

I have a 10-bit monitor (actually it's an 8-bit panel with dithering, but it announces itself as a 10-bit monitor), and in the Windows GeForce drivers for my 1080 Ti the Output Color Depth parameter lets me select 10-bit color (the options are 8 bpc or 10 bpc). Furthermore, it also seems to support 10-bit color for HDR when connected to my TV.

I am certain you know your stuff. Certainly far more than I do, because you work with it and I'm just an enthusiast. But at the very least there seems to be some sort of 10-bit support. Is the GeForce software doing something like my Dell, where it is simply faking 10-bit color by using an 8-bit palette with dithering? Or is the 10-bit support only for something like HEVC video playback? I'm not sure how else to reconcile what I'm seeing with what you are saying, but perhaps you can elaborate.

Personally I leave mine on 8 bpc because I don't do any pro work and don't need 10-bit color for anything. My understanding is that if you select 10-bit color but use applications that don't support it (which is what I do), the output will actually be less color accurate. But I'm not sure that's the right thing to do either, and I haven't investigated.
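
One way I've tried to make sense of it (and I may have this wrong): the Output Color Depth toggle covers the display link and the fullscreen DirectX/HDR path, while what the color folks mean is an application asking the driver for a 10-bit drawing surface, e.g. a 30-bit OpenGL pixel format, which my understanding is GeForce drivers of this era don't expose while Quadro does. A rough Windows sketch of that request, attribute values taken from the WGL_ARB_pixel_format spec, minimal error handling:

[CODE]
/* Rough sketch (Windows / C): ask the driver whether it exposes a 30-bit
 * (10 bpc) RGBA OpenGL pixel format, which is the path pro apps use for
 * 10-bit color. The control panel's 10 bpc output setting is a separate
 * thing. Link against opengl32, gdi32 and user32. */
#include <windows.h>
#include <stdio.h>

/* Values from the WGL_ARB_pixel_format extension spec. */
#define WGL_DRAW_TO_WINDOW_ARB 0x2001
#define WGL_SUPPORT_OPENGL_ARB 0x2010
#define WGL_DOUBLE_BUFFER_ARB  0x2011
#define WGL_PIXEL_TYPE_ARB     0x2013
#define WGL_RED_BITS_ARB       0x2015
#define WGL_GREEN_BITS_ARB     0x2017
#define WGL_BLUE_BITS_ARB      0x2019
#define WGL_TYPE_RGBA_ARB      0x202B

typedef BOOL (WINAPI *PFNWGLCHOOSEPIXELFORMATARBPROC)(
    HDC, const int *, const FLOAT *, UINT, int *, UINT *);

int main(void)
{
    /* Bootstrap: throwaway window + legacy context so wglGetProcAddress works. */
    HWND wnd = CreateWindowA("STATIC", "", WS_POPUP, 0, 0, 16, 16,
                             NULL, NULL, NULL, NULL);
    HDC dc = GetDC(wnd);
    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd), 1,
        PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
        PFD_TYPE_RGBA, 32 };
    SetPixelFormat(dc, ChoosePixelFormat(dc, &pfd), &pfd);
    HGLRC rc = wglCreateContext(dc);
    wglMakeCurrent(dc, rc);

    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");

    /* Window-drawable, double-buffered RGBA with >= 10 bits per color channel. */
    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, 1,
        WGL_SUPPORT_OPENGL_ARB, 1,
        WGL_DOUBLE_BUFFER_ARB,  1,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_RED_BITS_ARB, 10, WGL_GREEN_BITS_ARB, 10, WGL_BLUE_BITS_ARB, 10,
        0
    };
    int format = 0;
    UINT count = 0;
    if (wglChoosePixelFormatARB &&
        wglChoosePixelFormatARB(dc, attribs, NULL, 1, &format, &count) && count > 0)
        printf("Driver offers a 10 bpc OpenGL pixel format (index %d)\n", format);
    else
        printf("No 10 bpc OpenGL pixel format exposed by this driver\n");

    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(rc);
    ReleaseDC(wnd, dc);
    DestroyWindow(wnd);
    return 0;
}
[/CODE]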
 

wreck_it_olaf

macrumors newbie
Oct 18, 2018
5
3
If the utility is a VBIOS flasher then I can just extract the VBIOS out of it and flash it manually via various means, so I ask again: is this utility available somewhere? (I even have an external SPI EEPROM programmer should I have to brute-force something.)

Or, secondly, do you know the HP part number for the Itanium-specific video card? (If I can't flash one, I could try to just find an Itanium one directly :) )

I don't care about it being useless; if you say it gives boot screens then it gives me something to play and hack about with :) (Plus, I do have some ideas for how I'd get one functioning properly in Mac OS X if I were so inclined, although being an RV516 card there are no drivers in 10.8 or later.)

I will try to get that information for you, or better yet I will try to get you the card. I cannot find the utilities on the HPE website, though I also don't have an Integrity support entitlement.

I just emphasize that it is useless because I don't want casual observers buying the cheap card and then getting pissed when it freezes on a white screen.

I don't think there were very many Itanium-specific video cards (and you think Mac video cards are a small market). The only one I actually recall was a weird card with a Mobility Radeon that also, for whatever reason, had two USB ports; everything else was flashed.
 