As has been commented by none other than Apple's current darling Matt Panzarino, the GTX 1080 was a pretty watershed product for content creation purposes. In terms of price, performance, heat, and power draw, AMD still doesn't have anything comparable. Look around; there are plenty of "I migrated to Windows to get access to GTX 1080s" posts from formerly Mac-based video producers.

That's gotta annoy Apple, when the absolute best they can get out of AMD for their most expensive desktop system is equivalent to the laptop GPUs their competitors can access.

I am not sure it's about envy, or about being annoyed that the other "camp" has something more powerful. But may I go back to speculation?

Maybe that's why they put the Vega in an iMac and called it Pro... sealed it and then delivered it. It could be a hint that Apple was trying to contractually fulfill their obligation with AMD. Thus, the iMac Pro was born. No one ever saw the iMac Pro coming, not even the rumor mill that is MacRumors. So it could be a kind of hint, a proof of sorts, that Apple put AMD's best in an iMac and then decided to just call it Pro: an interim product, and a product with which they contractually fulfilled whatever obligation they had with AMD. Two birds with one stone, sort of deal.

And then, of course, the 2018 MBPs were refreshed with yet the same Polaris GPUs, pointing or hinting at more of the same... meaning Apple is ready to move on, and ready to jump from the AMD ship, the same AMD to whom they "publicly-privately" spoke about why the Mac Pro has been neglected.

My theory is that by publicly blaming themselves, Apple was also privately speaking to AMD...

So I wouldn't be surprised, and it would be a welcome change, for Apple and the modular Mac Pro to sport not an AMD GPU but an NVIDIA Turing GPU. Not because of annoyance or envy, but because Apple is actually over it with AMD, as I tried to say in my prior post and in this post with the whole pointing-the-finger-at-themselves thing....
 
That's gotta annoy Apple, when the absolute best they can get out of AMD for their most expensive desktop system is equivalent to the laptop GPUs their competitors can access.



That is because Apple chose not to pay for AMD's workstation cards. Go take a look at the WX series of cards.
 
Apple's love affair with AMD is all about the cost per unit; AMD just flat-out cuts them a better price than nVidia does.

Those of us that have been around long enough remember how nVidia got its start in the Mac space: someone at ATI leaked, or let slip, an upcoming, yet-to-be-announced Apple product, stealing Steve Jobs's thunder at Macworld.

Steve was so pissed that he turned to nVidia for future products. At the time nVidia made no GPUs for the Mac, and got a sweet OEM deal out of the debacle.

Tim Cook is a supply chain master: whoever can give Apple a better price for a product that is good enough, that's who they will go with.

They are not trying to build products that last forever, or that can easily be upgraded to the point where their customers forgo buying new Macs. They just want them to last longer than the warranty, and hope the customer can still get some value selling their old unit on the resale market to offset the cost of their next Mac.

The nMP was the best thing to ever happen for cMP owners.
 
^^ Want to start this with: Apple makes money from iPhones/the iPhone ecosystem (App Store/iTunes etc.); I'd guess next is 13" laptops. So I really don't think "big/fat" GPUs are something Apple cares about; at most I think they may just make OS X more friendly to PC cards for eGPU?
(Is this just so devs can make apps for 2-5 years down the line? iPhones/13" laptops may have GPUs as fast as today's big GPUs by then.)

@DearthnVader Cost sounds really likely to me, and AMD seem to be moving to making lots of smaller GPUs for PS4/Xbox/Intel/Apple.
It's a market and it looks like they have a death grip on it :D. Also, I'd guess AMD will be more likely to make a GPU to fit what Apple asks for at a low cost than Nvidia, who is making all that sweet, sweet AI GPU money :cool:; under £10K they don't need to care.

I'd be amazed if we don't see one of the Intel/AMD CPU/GPU things in an Apple product some time too.
https://arstechnica.com/gadgets/201...intel-cpu-amd-gpu-nvidia-beating-performance/
that thing

edit
Also, Navi may be OK in the end; I saw somewhere online that most of the staff were pulled from Vega onto Navi for Sony, so it may end up as a good chip.
https://www.forbes.com/consent/?toU...nys-playstation-5-vega-suffered/#2d485f1424fd
"Allegedly, Koduri saw up to a massive 2/3 of his engineering team devoted exclusively to Navi against his wishes, which resulted in a final RX Vega product Koduri was displeased with"
 
That is because Apple chose not to pay for AMD's workstation cards. Go take a look at the WX series of cards.

Well yeah, but Apple's never going to put an actual workstation card in a machine. I mean, when you look at the iMac Pro, the price differential in upgrading from the standard Vega 56 to the Vega 64 is roughly the retail price of a Vega 64. Pay for two cards, and only take delivery of one.
 
Well yeah, but Apple's never going to put an actual workstation card in a machine. I mean, when you look at the iMac Pro, the price differential in upgrading from the standard Vega 56 to the Vega 64 is roughly the retail price of a Vega 64. Pay for two cards, and only take delivery of one.

Yet another example of how Apple has borked the use of the word "Pro". Cthulhu forbid having a video card with a 10-year warranty in their "Pro" box.
 
I don't know where the whole "Nvidia hates Apple / Apple hates Nvidia" thing started. But may I point to something palpable, something that was actually said, where "Apple might hate AMD"?

I think Apple might hate AMD, judging from Apple officially blaming themselves for designing themselves into a thermal corner with the trashcan Mac Pro.

So... wait, Apple blames themselves, not AMD.

I don't think you're completely wrong. Apple doesn't really like any outside company right now. Which is why I was talking about Apple doing their own GPUs. Apple could replace AMD either by buying them or the Radeon division and taking direct control, or licensing and/or continuing the A series GPUs and just doing things themselves.

But on the topic of AMD vs Nvidia: one will do anything Apple wants for not very much money, and the other won't do anything special for Apple at any price. Even if Apple isn't completely happy with AMD, which one do you think Apple likes more?
 
....
edit
Also, Navi may be OK in the end; I saw somewhere online that most of the staff were pulled from Vega onto Navi for Sony, so it may end up as a good chip.
https://www.forbes.com/consent/?toU...nys-playstation-5-vega-suffered/#2d485f1424fd
"Allegedly, Koduri saw up to a massive 2/3 of his engineering team devoted exclusively to Navi against his wishes, which resulted in a final RX Vega product Koduri was displeased with"

Pulled, or transitioned to the next generation? The article mentions Navi in the 'past tense'. It isn't even out yet, and probably won't be for at least 6 months, and yet many folks are 'gone'. Timing is what is missing from the article. It may simply have been time for them to move on according to a prior roadmap.

Vega for the entry-to-mid tier has some major problems. HBM2 costs haven't dropped (and performance hasn't increased) like folks expected. The very new WX 8200 only just gets the HBM2 data rates they expected in the first place, using some "gen 2" HBM2 chips.

So if Vega is not going to work out for the entry-to-mid tier and Polaris is only going to get a modest bump going from 14nm to 12nm, then AMD has a major hole in their lineup. They need a Polaris replacement badly. It looks like Navi will plug that hole first and Vega will be more gradually phased out (with an incremental shrink to 7nm to extend the time). There is a decent chance that the initial Navi is coupled to GDDR6, and that will help uncork some of the cost issues in the 2019-2020 timeframe. AMD can add a 'bigger' variant later with 'gen 3' HBM2 for the top end.

It could easily have been more of a Vega shrink (small tweaks targeted at the high end/margin, e.g. AI, but shrinking what they had) rather than Vega trying to serve 3-4 things/markets at the same time.

Sony (and probably also Microsoft and Apple) probably contributed some to the decision, but I think it is a bit overblown that Vega was "thrown under the bus". There's a decent chance that Koduri wanted to hold folks back from moving on so that more of the larger "pothole fixing" work got done, as opposed to moving on. Given that AMD was financially just trying to keep the lights on, the call that got made was probably the right one.

A healthier AMD would have more folks to do deeper pipelining, so it would be able to avoid the "rob Peter to pay Paul" trade-offs between generation efforts. They weren't healthy, so they had to make do with the fixed resources they had. If AMD manages to exit the cryptocurrency revenue bubble without stumbling, then they should be able to ramp up to doing broader, parallel efforts. (The crypto revenue bubble made it substantively safer to bet on shifting work to future archs, because folks were going to buy up everything no matter what as long as the bubble lasted.)


Stuff happens, even to Nvidia. That is why having no open slot doesn't make much sense. But that slot is for new, additional, optional compute. The primary boot display can have other criteria. Trying to put the boot display criteria on the 2nd slot is just as flawed as vice versa. Those should be decoupled, as they are different functions in the context of TB.

Thunderbolt's video + data is the ultimate evil? Chuckle; going to have to come up with some new rant, because that's what appears to be on these RTX cards: a VirtualLink port. "... Hardware support for USB Type-C™ and VirtualLink™(1), " [No, that doesn't mean TB is coming to standard video cards anytime soon. Three inputs are needed: PCI-e, DP, and GPIO. These cards could bleed some PCI-e bandwidth to a plain USB controller and fill the VirtualLink port.]
 
A lot of people seem to be explaining why Apple would never go with Nvidia GPUs again. That's not what OP asked for in the first post:

If Apple is really serious about professionals, these cards need to be supported in the next MacPro, so that means slots.

OP is asking for PCIe slots. We should be able to put whatever card we want in there, AMD or Nvidia. Or no card. Or multiple cards. Or pro cards. Or cheap cards. And to upgrade cards later.

Apple switching from locked-in-AMD to locked-in-Nvidia (let's say proprietary "D1080" boards) might make a few nMP-loving CUDA heads happy, but it isn't what most of us want.
 
A lot of people seem to be explaining why Apple would never go with Nvidia GPUs again. That's not what OP asked for in the first post:

OP is asking for PCIe slots. We should be able to put whatever card we want in there, AMD or Nvidia. Or no card. Or multiple cards. Or pro cards. Or cheap cards. And to upgrade cards later.

Apple switching from locked-in-AMD to locked-in-Nvidia (let's say proprietary "D1080" boards) might make a few nMP-loving CUDA heads happy, but it isn't what most of us want.

Sure, PCIe slots mean you could put anything you want in.

But aren't the drivers their own barrel of monkeys? Apple may not bundle the drivers, Nvidia may not make them, and even if they do, they may not be very good drivers (as most Nvidia driver releases these days have not been very good).

I'm not saying Nvidia couldn't do the drivers all on their own. They have enough access to do it. I'm just saying that the big reason Nvidia put so much effort into their Mac drivers was that Apple gave them a lot of money to do it. Now that Apple is not giving them money to write drivers, their drivers are kind of meh.

Maybe Nvidia will think they can make so much money from Mac Pro and eGPU users that they'll put in the effort. I dunno. But even in their Mac glory days they were taking a lot of money from Apple.
 
If there are PCIe slots in the 7,1 and NVIDIA continues to make web drivers for the Mac with support for the latest cards (or at least a subset of XX70 & XX80), then I see no reason not to be satisfied, at least in the short term.

Long term, this really depends on what the 7,1 looks like and whether the aftermarket GPU is only via eGPU or through other options (PCIe directly). If it's only through eGPU, it would likely require NVIDIA creating an installer to work around the current eGPU limitations, unless Apple decides to support additional cards (with vendor-provided drivers).
 
A lot of people seem to be explaining why Apple would never go with Nvidia GPUs again. That's not what OP asked for in the first post:
...
OP is asking for PCIe slots. We should be able to put whatever card we want in there, AMD or Nvidia. Or no card. Or multiple cards. Or pro cards. Or cheap cards. And to upgrade cards later.

There is a huge functional gap in the Mac space between the primary display GPU and the above. "Or no card" would leave the system GPU-less. macOS's primary core value-add is being a graphical user interface; no GUI means not aimed at the core Mac market. That "no card" option could be thrown onto the other slots (if present) with little to no impact on the core Mac value-add.

Where this discussion goes off into the weeds is in conflating what the primary GPU functionality constraints are with the rest of the lane budget and/or allocation to PCI-e standard slots (and often with commodity pricing...). The implicit core issue really boils down to control, not cards: scope to undo as much as possible of what Apple selected. In that context, what Apple will and won't pick is pretty much getting at the same core issue. (It is often used as a rationale for 'why' they want control.)

The OP asked for slots. The classic, generic PCI-e large card slot isn't the only slot that uses PCI-e.

Apple switching from locked-in-AMD to locked-in-Nvidia (let's say proprietary "D1080" boards) might make a few nMP-loving CUDA heads happy, but it isn't what most of us want.

If the industry came up with a slot that put 4 DisplayPort lanes of output into some connector to the card (either coupled to the PCI lanes attach point, or a card with two attach points), Apple would probably pick it up. The dogma that GPU cards have to look almost exactly like they did 15 years ago will probably keep that standard from coming about. (Unless Apple worked to establish it; if the audience is dogma-calcified it may not be worthwhile.)

There's dogma on Apple's side too, though: they don't have to put all the constraints of the primary GPU 'slot' onto the others. If Apple did do a 'custom' card, all of those specifics don't necessarily need to propagate to the other slots. Just because you have an M.2 slot shouldn't necessarily make all the classic standard slot form factors go away. PCI-e isn't a single physical form. Both sides need to ponder that a bit more deeply.

There are a fixed number of PCI-e lanes (and a fixed amount of bandwidth) and several things to allocate them to. Everybody isn't going to get everything they could possibly want. Find a balance and ship. Apple is the primary designer, though, so their input on what the balance is will matter. They take input, but the last and final call is theirs, so their viewpoints will matter.
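Just to make that lane-budget point concrete, here's a rough, purely illustrative sketch (assuming a CPU with 48 PCIe 3.0 lanes, roughly Xeon W class like the iMac Pro uses; every slot/controller allocation below is a made-up example number, not anything Apple has announced):

[CODE]
# Illustrative only: a hypothetical PCIe lane budget for a workstation-class Mac.
# Assumes a CPU with 48 PCIe 3.0 lanes; all allocations below are example numbers.
CPU_LANES = 48

allocation = {
    "primary GPU slot (x16)": 16,
    "second GPU/compute slot (x16)": 16,
    "Thunderbolt 3 controller (x4)": 4,
    "second Thunderbolt 3 controller (x4)": 4,
    "NVMe SSD (x4)": 4,
    "10GbE + misc I/O (x4)": 4,
}

used = sum(allocation.values())
print(f"Lanes used: {used} of {CPU_LANES}")
for name, lanes in allocation.items():
    print(f"  {name}: x{lanes}")

if used > CPU_LANES:
    print("Over budget: something has to give (fewer slots, narrower links, or a PCIe switch).")
else:
    print(f"Headroom: {CPU_LANES - used} lanes")
[/CODE]

With those example numbers the budget is already exactly spent, which is the point: adding anything else means narrowing a link, dropping a slot, or adding a switch somewhere.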
....
Long term, this really depends on what the 7,1 looks like and whether the aftermarket GPU is only via eGPU or through other options (PCIe directly). If it's only through eGPU, it would likely require NVIDIA creating an installer to work around the current eGPU limitations, unless Apple decides to support additional cards (with vendor-provided drivers).

Apple's initial supported card list is far more about GPU drivers they have already recently done as embeds and that are Metal-aligned than about vendor. I'd be very surprised if Apple tried to make access to eGPUs punitive. Right now they need to make it work and be solid (smaller scope over well-known drivers), and then the next iteration can work from that. Yet another macOS release with a bug fest isn't what they need here.


If Apple tries to choose external PCI-e enclosures as a huge 'punt' for the next Mac Pro, then the software issues around those cards don't really go away.
 
The dogma that GPU cards have to look almost exactly like they did 15 years ago

OBVIOUSLY the reason why people ask for PCIe slots is because that's the industry standard and that's how GPUs are packaged. NOBODY asking for PCIe cards has stated that they want them because they like the specific physical characteristics of an ancient design. To imply otherwise is both wrong and insulting.

If the industry standard was some sort of new AGP slot, we'd be asking for that. If the industry standard were some sort of new MXM socket, we'd be asking for that. If the industry standard were whatever the D300/500/700 connector is called, we'd be asking for that.

There is no dogma that GPU cards have to be packaged the same as always. We would love a modern design, provided that it is an industry standard that everyone uses.

I have a feeling this is the second or third time you've accused me of having this backward intent, so I'm saying for the record, knock it off. If you do it again to me, you'll just be a lying troll.
 
I don't think you're completely wrong. Apple doesn't really like any outside company right now.

Any company is a bit too broad. Companies that compete with a core competency that Apple has internally are more of an issue, I think. As Apple builds more of a full GPU + drivers stack, there will be less tolerance for a vendor telling Apple that it has to do what the vendor says.

Apple may grumble about Foxconn, but they really, really, really don't want to be in the business of actually making stuff. Some of their contractors are doing somersaults to make Apple happy; I suspect they like those. The ones that leak secrets like a sieve and are difficult to haggle with, then probably not.

Apple is screwing up a bit, though, by using the Scrooge McDuck money pit to hammer vendors over the head. Sometimes vendors don't like them because Apple isn't very likable.

Which is why I was talking about Apple doing their own GPUs. Apple could replace AMD either by buying them or the Radeon division and taking direct control, or licensing and/or continuing the A series GPUs and just doing things themselves.

For the laptop stuff perhaps. But at the top end there is no volume to get return on investment. Apple could put something on that Intel+AMD package solution that is stop-gapped now. I think Intel intends to fill that gap with their dGPU but if it failed Apple could perhaps push something in there.

However, buying up all of AMD, or even just Radeon (if AMD were somehow suicidal), wouldn't have the scale to be "just" a small-to-mid-range GPU component fill. Apple could flush a couple billion down the drain doing it, but I bet Buffett would be way more than pissed off. Probably the only way to make money back on buying AMD would be to largely do what AMD does now (better). If AMD were a large 'hole in the ground' as far as being a viable company, and there was a huge fire sale, maybe. But as a mostly functional company, it won't work.



But on the topic of AMD vs Nvidia: one will do anything Apple wants for not very much money, and the other won't do anything special for Apple at any price. Even if Apple isn't completely happy with AMD, which one do you think Apple likes more?

One of the problems here is that Apple would rather have 3 major supplier options. That way, even if they put one in the 'penalty box' for some violation, they still have more than one to seek competitive bids from. That's what is wrong with this whole "well, AMD isn't perfect either, so why not dump them too" line. You can't dump everybody. AMD may have tried and come up short, but Apple can't punish them for trying, and they were honest in updating Apple about problems/issues. Especially when there aren't many other options.

With AMD, Apple gets a 'two-fer' as a major supplier candidate. If Apple had pulled all their money out of AMD because AMD was forever 'doomed', Apple wouldn't have Intel competing as hard as they are now. Neither would Nvidia (even as a non-selection there is a competition component), which wouldn't be good for a lot of other companies either.

MS screwing up Windows for a while there really didn't help macOS grow better long term. Apple got some easy wins for a bit too long and appears to have gotten kind of lazy.
 
I just wish Apple would work with AMD about getting their GPU software together.

If I am running Windows on a WX series card, I can switch between compute video drivers and gaming video drivers. Why the hell can't I do that with OS X drivers?
 

Okay. I just used my own recollection, since I do consider myself a MacRumors regular. But I guess I was wrong, or not as in tune as I had imagined myself to be. Maybe it was in the iMac section, which is a section I don't usually peruse. Whatever the case may be... oopsie-daisy! But kudos to that OP who predicted an iMac Pro a year before it was even announced. Although the CPU that ignited his post and curiosity wasn't the CPU that the iMac Pro would eventually carry, he somehow knew that Apple was going to turn the iMac into a workstation-level computer. I guess it was in the air at the time? How did he put the two together?

Anyway, yeah, I still hope that the reason Apple decided to put AMD Vega in an iMac is some kind of hint that Apple is waiting for something more "special" to put in the 2019 Mac Pro 7,1. One can hope!

Although, I really don't know what benefit an Nvidia Turing GPU would bring, except its great thermal-power-performance ratio.

And, AMD time and again seems to under-deliver.
 
I don't know if anyone has caught on yet, but in nVidia's teaser video it does appear they are hinting the RTX 2080 could be available for the Mac. Notice the order of the chat? MAC-20-Eight-Tee?


 

Attachments
  • MacRTX2080 - Copy.jpg (138.6 KB)
Apple switching from locked-in-AMD to locked-in-Nvidia (let's say proprietary "D1080" boards) might make a few nMP-loving CUDA heads happy, but it isn't what most of us want.

I don't think it would make any of us happy tbh. Plus, it's probably far more likely we have a repeat of the Mac Pro D300 fiasco of faulty GPUs all over the place.

I don't know if anyone has caught on yet, but in nVidia's teaser video it does appear they are hinting the RTX 2080 could be available for the Mac. Notice the order of the chat? MAC-20-Eight-Tee?

I saw that too. I want to believe, which is why I'm holding off on buying a 1080 Ti on the slim chance that there is a Mac EFI RTX 2080...
 
I just wish Apple would work with AMD about getting their GPU software together.

If I am running Windows on a WX series card, I can switch between compute video drivers and gaming video drivers. Why the hell can't I do that with OS X drivers?

Because it's the same driver in MacOS and completely transparent to the end-user.
 
Because it's the same driver in MacOS and completely transparent to the end-user.

But they are different drivers in Windows: optimized compute drivers and optimized gaming drivers. I am a member of the "right tool for the job" camp, as opposed to the "one tool for every job" camp.
 
The closest to driver switching on the Mac side is the NVIDIA Web Driver, which basically lets you choose between the MacOS driver and their driver. Doubt AMD will ever make a Mac tool like this.
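For what it's worth, that Web Driver toggle was basically just an NVRAM flag plus a different set of kexts. Here's a rough sketch of how you might check which one is active on an older install; the nvda_drv NVRAM variable and the "web" marker in the web-driver kext names held for the releases I remember, but exact bundle names vary by driver version, so treat this as illustrative rather than definitive:

[CODE]
#!/usr/bin/env python3
"""Rough sketch: guess whether the NVIDIA Web Driver or the stock macOS driver
is active, by reading the nvda_drv NVRAM flag and scanning loaded kexts.
Illustrative only; kext bundle names differ between driver versions."""

import subprocess

def run(cmd):
    """Run a command and return its stdout, or '' if it can't be run."""
    try:
        return subprocess.run(cmd, capture_output=True, text=True, check=False).stdout
    except OSError:
        return ""

# The Web Driver preference pane historically toggled the nvda_drv NVRAM variable.
nvram_out = run(["nvram", "nvda_drv"])
web_flag_set = "nvda_drv" in nvram_out

# Loaded kexts: web-driver bundles typically carry a 'web' marker in their names.
kexts = run(["kextstat"]).lower()
nvidia_lines = [ln for ln in kexts.splitlines()
                if "nvda" in ln or "nvidia" in ln or "geforce" in ln]
web_kext_loaded = any("web" in ln for ln in nvidia_lines)

print(f"nvda_drv NVRAM flag set: {web_flag_set}")
print(f"NVIDIA kexts loaded:     {len(nvidia_lines)}")
print("Driver in use (best guess):",
      "NVIDIA Web Driver" if web_kext_loaded else "stock macOS NVIDIA driver (or none)")
[/CODE]

(Actually enabling or disabling the Web Driver was done through its own preference pane; poking NVRAM by hand is at your own risk.)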
 
The closest to driver switching on the Mac side is the NVIDIA Web Driver, which basically lets you choose between the MacOS driver and their driver. Doubt AMD will ever make a Mac tool like this.

You are probably right - why would Apple make professional drivers available for their "Pro" hardware :(.
 