
mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
****in ****in ****! Please let the rumors be false. I intend to buy new gear this fall and was expecting an MP by now, or at least at WWDC. Now I have horrid flashbacks of the waiting game from 2015 onwards….
Apple absolutely needs to at least address the Mac Pro by WWDC this year, or I don't know why I even wait.
Even if Apple announce a new MP at WWDC, they'll almost certainly let it sit for years with no updates, price reductions or communication, so the waiting game isn't going anywhere.

If you want a new laptop, though, I'm sure you'll have nothing to worry about - you can expect yearly updates.
 
  • Haha
Reactions: maikerukun

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
It really is pretty crappy that Apple can't put aside the beef and make it easy for AMD and NVIDIA to write drivers for macOS.
I think the 'beef' factor is overblown.

Apple don't have any beef with AMD (AFAIK) - it's just that ASi doesn't support PCIe GPUs. If Apple ever intended it to do so, they could have used the last few years to refine the software stack for it, by enabling support for TB eGPUs (which are essentially the same thing). The only Mac that could support a new AMD GPU at this point would be the 2019 Mac Pro, and they hardly want to prolong its lifespan / raise the bar for its successor.

Apple probably wouldn't deal with Nvidia because a) as the market leader, they won't give Apple the volume discounts AMD will, and b) they won't agree / can't be trusted to keep CUDA off macOS, which would be unwanted competition for Metal. And as ASi doesn't support PCIe GPUs, it's moot now anyway.
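
For anyone wondering what Apple is actually protecting there: Metal is now the only first-party GPU compute path on macOS, and CUDA is Nvidia's proprietary equivalent. Below is a minimal sketch (my own illustration, not anything from Apple's docs) of dispatching a compute kernel through Metal from Swift; the kernel name "add_arrays" is hypothetical and assumed to be compiled into the app's default .metal library.

Code:
import Metal

// Minimal sketch: run a GPU compute kernel via Metal, the first-party
// compute path on macOS. "add_arrays" is a hypothetical kernel assumed
// to exist in the app's default .metal library.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),
      let addFn = library.makeFunction(name: "add_arrays"),
      let pipeline = try? device.makeComputePipelineState(function: addFn) else {
    fatalError("Metal setup failed")
}

let n = 1024
let bytes = n * MemoryLayout<Float>.stride
let a = device.makeBuffer(length: bytes, options: .storageModeShared)!
let b = device.makeBuffer(length: bytes, options: .storageModeShared)!
let out = device.makeBuffer(length: bytes, options: .storageModeShared)!

let cmd = queue.makeCommandBuffer()!
let enc = cmd.makeComputeCommandEncoder()!
enc.setComputePipelineState(pipeline)
enc.setBuffer(a, offset: 0, index: 0)
enc.setBuffer(b, offset: 0, index: 1)
enc.setBuffer(out, offset: 0, index: 2)
// 1024 threads split into 16 threadgroups of 64.
enc.dispatchThreadgroups(MTLSize(width: n / 64, height: 1, depth: 1),
                         threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()

A CUDA port of those same few lines would only run where Nvidia's driver and runtime are installed, which is exactly the kind of platform dependency Apple doesn't want taking root on macOS.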
 
Last edited:

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
I think the 'beef' factor is overblown.

Apple don't have any beef with AMD (AFAIK) - it's just that ASi doesn't support PCIe GPUs. If Apple ever intended it to do so, they could have used the last few years to refine the software stack for it, by enabling support for TB eGPUs (which are essentially the same thing). The only Mac that could support a new AMD GPU at this point would be the 2019 Mac Pro, and they hardly want to prolong its lifespan / raise the bar for its successor.

Apple probably wouldn't deal with Nvidia because a) as the market leader, they won't give Apple the volume discounts AMD will, and b) they won't agree / can't be trusted to keep CUDA off macOS, which would be unwanted competition for Metal. And as ASi doesn't support PCIe GPUs, it's moot now anyway.

No. Nvidia screwed up. Apple stopped signing their "halt and catch fire on new macOS version" GPU drivers for macOS on Intel. That has little to do with macOS on Apple Silicon. Nvidia did a variation of "embrace, extend, extinguish" on OpenCL (an initial Apple creation). Nvidia didn't enthusiastically support Metal. Nvidia stuck Apple with the whole bill for some faulty iGPUs.

Nvidia leveraged their "market leader" status to act like a lousy component subcontractor that one had to deal with no matter what. Apple walked away. They don't deeply need Nvidia financially (the overwhelming majority of Macs don't need dGPUs, and the rest of the product line certainly doesn't). Nvidia doesn't need Apple. (EVGA quit the graphics card biz. Nvidia isn't on many vendors' "you are a great partner" Christmas card lists.)

Nvidia burned the bridge behind them.

AMD has 'fumbled the ball' more than a few times, but at least they haven't dug a hole so deep that Apple revoked their driver-signing privilege. They have had a driver support team embedded inside Apple. They don't do drivers outside the macOS development path. Etc. However, AMD's performance per watt has been questionable for most of the last decade. The path from Fury -> RDNA 2 was rife with hiccups and delays. (Yes, AMD was struggling to keep the lights on during the early part of that decade, but that wasn't winning them a "we should bet the future on these guys" reputation with Apple. Apple tolerated it in part because they had leverage, but it opened the door for major investment in alternatives.) The 580X was in the MP 2019 initial lineup because AMD really didn't have a much better modern offering to serve in that role at the time. While not intentionally limiting, AMD's OpenCL efforts were disjointed and quirky on macOS early on.


If AMD had been executing on driver quality and implementations in 2014-2017 as well as they are now, they might have had more influence on the macOS on Apple Silicon path, but they were not. Not even close. For over a decade, Intel has been the biggest GPU vendor on macOS, and they got tossed along with the CPU.


Without the much broader base of dGPUs embedded in other Mac products, there is a larger disconnect. AMD got dropped. It isn't an 'anger' beef, but they certainly failed the design bake-off to get a 'win' in the rest of the Mac product line.
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
No. Nvidia screwed up. Apple stopped signing their "halt and catch fire on new macOS version" GPU drivers for macOS on Intel.
Were Nvidia's web drivers that bad? There seem to be a fair number of people in this forum using Nvidia GPUs in Mac Pros, obviously on older versions of macOS. Having to download separate drivers was probably also a symptom of Apple's toleration rather than embrace of Nvidia GPUs by that point.

That has little to do with macOS on Apple Silicon.
Yes, I said that it's moot, as ASi doesn't support non-SoC GPUs at all.

Nvidia did a variation of "embrace, extend, extinguish" on OpenCL (an initial Apple creation). Nvidia didn't enthusiastically support Metal.
Yes, I also specifically mentioned CUDA.

Nvidia stuck Apple with the whole bill for some faulty iGPUs.
Are you referring to the 8600M (a dGPU)? I had one of those MBPs, and got a new logic board for free. My understanding is that Nvidia eventually coughed up for GPU replacements, as they had to do for Dell etc. as well.

Nvidia leveraged their "market leader" status to act like a lousy component subcontractor that one had to deal with no matter what. Apple walked away. They don't deeply need Nvidia financially (the overwhelming majority of Macs don't need dGPUs, and the rest of the product line certainly doesn't).
Well, unlike the PC market, where customers have a choice, Mac users can only buy what Apple supply / support. No PC OEM could get away with refusing to supply Nvidia GPUs, but Apple don't have that problem.

Nvidia doesn't need Apple. (EVGA quit the graphics card biz. Nvidia isn't on many vendors' "you are a great partner" Christmas card lists.)

Nvidia burned the bridge behind them.
Yes, I said as much in my post. Difficult to strong-arm the market leader. AMD clearly need the business more - they supply all the game consoles too.

AMD has 'fumbled the ball' more than a few times, but at least they haven't dug a hole so deep that Apple revoked their driver-signing privilege. They have had a driver support team embedded inside Apple. They don't do drivers outside the macOS development path. Etc. However, AMD's performance per watt has been questionable for most of the last decade. The path from Fury -> RDNA 2 was rife with hiccups and delays. (Yes, AMD was struggling to keep the lights on during the early part of that decade, but that wasn't winning them a "we should bet the future on these guys" reputation with Apple. Apple tolerated it in part because they had leverage, but it opened the door for major investment in alternatives.) While not intentionally limiting, AMD's OpenCL efforts were disjointed and quirky on macOS early on.
Regardless, Apple has yet to produce a GPU that is anything close to what can be bought off the shelf from AMD. Even a single RX6800 crushes the fastest ASi GPU, never mind Duos or the 7900 series.

If AMD had been executing on driver quality and implementations in 2014-2017 as well as they are now, they might have had more influence on the macOS on Apple Silicon path, but they were not. Not even close. For over a decade, Intel has been the biggest GPU vendor on macOS, and they got tossed along with the CPU.


Without the much broader base of dGPUs embedded in other Mac products, there is a larger disconnect. AMD got dropped. It isn't an 'anger' beef, but they certainly failed the design bake-off to get a 'win' in the rest of the Mac product line.
Ultimately, the transition to ASi is a business move. Apple invest a lot of money in iPhone SoC development, where they lead the mobile / tablet market by some margin. They clearly saw the cost savings of moving macOS to ASi, where they could largely leverage existing investment and wouldn't be paying anyone else's margins. It remains to be seen whether this will work out long term, or whether Apple were reckless with the Mac market as it's a small part of their business.

Given ASi is a mobile-first, SoC architecture, it may well be impractical to add dGPU support of any kind without fundamentally reworking it - which doesn't seem likely for a few niche desktops. Hopefully we'll find out one day.
 

PineappleCake

Suspended
Feb 18, 2023
96
252
No. Nvidia screwed up. Apple stopped signing their "halt and catch fire on new macOS version" GPU drivers for macOS on Intel. That has little to do with macOS on Apple Silicon. Nvidia did a variation of "embrace, extend, extinguish" on OpenCL (an initial Apple creation). Nvidia didn't enthusiastically support Metal. Nvidia stuck Apple with the whole bill for some faulty iGPUs.

Nvidia leveraged their "market leader" status to act like a lousy component subcontractor that one had to deal with no matter what. Apple walked away. They don't deeply need Nvidia financially (the overwhelming majority of Macs don't need dGPUs, and the rest of the product line certainly doesn't). Nvidia doesn't need Apple. (EVGA quit the graphics card biz. Nvidia isn't on many vendors' "you are a great partner" Christmas card lists.)
I am going to stop being Apple's customer because they ***** up the MacBook Pro 2016 era and the Mac Pro 2013/2019 (where is RDNA 3?).

It's business; stop with your petty feelings, Apple. Apple made countless mistakes. You know what, if Apple does not make a 4090-like GPU, I am leaving. They have the money to do so; if they don't, they just don't care.

Apple slaps their logo on and charges premiums when their products are not faster. There are plenty of companies who use Nvidia to this day, and Apple choosing not to is them believing their graphics are superior.
 

maikerukun

macrumors 6502a
Original poster
Oct 22, 2009
719
1,037
I really look at this as good news. It means we won't get some bastardized M2 Ultra version of this. There's a really good chance we can get an M3 Extreme version with slots and other things we all really want. I'd rather it take longer and have them do it right.
100% agree with this. In Taipei right now. Gonna tell you this, EVA AIR business class to Taipei is fantastic. If you're ever traveling that way.

But yes I 100% agree, perhaps an M3 Extreme with 3rd party GPU support or even quad Apple cards with support to add 4 - 8 more cards can be something amazing. Looking forward to it for sure.
 
  • Like
Reactions: ZombiePhysicist

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
100% agree with this. In Taipei right now. Gonna tell you this, EVA AIR business class to Taipei is fantastic. If you're ever traveling that way.

But yes I 100% agree, perhaps an M3 Extreme with 3rd party GPU support or even quad Apple cards with support to add 4 - 8 more cards can be something amazing. Looking forward to it for sure.

At the very least, I'm getting more psyched that we will have a WWDC with some true surprises for a change.

Also, this is probably the first WWDC where I truly do not care about macOS or iOS in any way at all. Both have become so unloved and disheveled by Apple that they've managed to make me not care about something so dear; it's truly sad. I hate to admit it, but I feel that way. I'm hoping the Mac Pro and the glasses will at least jazz me up, because they've so 'phoned it in' (pardon the pun) for so long, that they've managed to make me not care.
 
  • Love
Reactions: maikerukun

prefuse07

Suspended
Jan 27, 2020
895
1,073
San Francisco, CA
At the very least, I'm getting more psyched that we will have a WWDC with some true surprises for a change.

Also, this is probably the first WWDC where I truly do not care about macOS or iOS in any way at all. Both have become so unloved and disheveled by Apple that they've managed to make me not care about something so dear; it's truly sad. I hate to admit it, but I feel that way. I'm hoping the Mac Pro and the glasses will at least jazz me up, because they've so 'phoned it in' (pardon the pun) for so long, that they've managed to make me not care.

Not to take this thread off topic, but this is what I believe the spectacles will resemble (if not exactly):


Oh, and that image they're previewing for WWDC -- along with the semicircle being the inner theater that lives within the mothership, it's also a reference to pancake lenses...
 

vinegarshots

macrumors 6502a
Sep 24, 2018
982
1,349
Not to take this thread off topic, but this is what I believe the spectacles will resemble (if not exactly):


Oh, and that image they're previewing for WWDC -- along with the semicircle being the inner theater that lives within the mothership, it's also a reference to pancake lenses...
There is nothing to make us think that Apple has “spectacles” lined up. If that's what you're expecting, I think you're going to be highly disappointed.
 

prefuse07

Suspended
Jan 27, 2020
895
1,073
San Francisco, CA
There is nothing to make us think that Apple has “spectacles” lined up. If that's what you're expecting, I think you're going to be highly disappointed.

Except that Apple has been working with Stanford on the technology for quite some time now.

And since they've hired guys like this

I don't expect the actual glasses to come out this year though, per the rumors, including their own design team warning them to wait until they're ready. I do expect it to be some stupid VR goggles because Timmy is desperate, but if they are glasses, you bet your ass they will be just like those nreal ones.

We'll see soon enough, no reason for you to get upset...

go take some more vinegarshots.

Edit: HERE is more evidence of pancake lens technology.
 
  • Haha
Reactions: maikerukun

Chancha

macrumors 68020
Mar 19, 2014
2,313
2,141
Except that Apple has been working with Stanford on the technology for quite some time now.

And since they've hired guys like this

I don't expect the actual glasses to come out this year though, per the rumors, including their own design team warning them to wait until they're ready. I do expect it to be some stupid VR goggles because Timmy is desperate, but if they are glasses, you bet your ass they will be just like those nreal ones.

We'll see soon enough, no reason for you to get upset...

go take some more vinegarshots.

Edit: HERE is more evidence of pancake lens technology.
Most makers out there are using closed VR headsets with lenses in front, so that even though it technically is not a see-through, glasses-style device, it acts like one. Kind of like a mirrorless camera with its EVF. This approach is apparently much more mature, since it is more or less just two phones with rear cameras arranged in stereo, plus some extra sensors inside and out.

It seems Apple will ship this mixed VR/AR goggle first. The "spectacle" style is the one the designers have wanted to release (true AR glasses), whereas the one we may be seeing this year is the one they wanted Apple not to release. That's how I interpret the rumors / news.
 
  • Like
Reactions: maikerukun

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
I am going to stop being Apple's customer because they ***** up the MacBook Pro 2016 era and the Mac Pro 2013/2019 (where is RDNA 3?).

You are free to make that choice. Apple is also free to choose who they buy from and/or "spend money on".


Where is RDNA 3? ... Yeah, where are the 7700, 7600, 7500? As far as the family goes, AMD is dribbling this out very slowly. The glut of new/used past-generation cards (and their dramatically sinking prices) is a contributing reason why AMD is going so slow. Same thing for Apple: with rapidly sagging 6800/6900 prices, are MP folks moving up from a 5700 going to spend more, or bargain hunt in "leaner times" for a card with better price/performance? And a hefty fraction of the folks grumbling about RDNA 3 only really want to buy an off-the-shelf (non-Apple MPX) product to sidestep paying for drivers. Is Apple really going to be in a big hurry to rush into that situation? Probably not.

A major contributor to Apple adopting a new architecture family from AMD (or Intel or Nvidia) was putting a major fraction of that family into new Mac products (not one solitary product... multiple ones). Apple dropping AMD from the many millions of Macs sold is going to have an impact (the Mac Pro all by itself isn't anywhere near 'unit millions' at all).

It's business; stop with your petty feelings, Apple. Apple made countless mistakes. You know what, if Apple does not make a 4090-like GPU, I am leaving. They have the money to do so; if they don't, they just don't care.

At one point Apple sued Samsung for "stealing Apple IP", and yet Samsung screens and components appear in Apple products today. This isn't about minor petty feelings. Samsung is still a component subcontractor to Apple because that part of Samsung didn't try to screw Apple and pretty much followed the strategic and tactical directives. Nvidia didn't.

This is about cases where Nvidia essentially said "F you" to Apple and explicitly went against Apple's strategic and tactical directives to do whatever was solely best for Nvidia. That is lousy business practice for what is essentially a subcontractor for an individual component.

A subset of folks spin it as if the GPU is the only critical component that matters in a system. It is only from that myopically narrow view that what Nvidia did was 'petty'. However, those folks also don't 'own' macOS or the Apple ecosystem. The Nvidia fanboys solely pointing the finger at Apple (which Nvidia explicitly egged them on to do... another nail in the coffin, and again something that Samsung, as a component supplier, never did) only makes the situation worse.


Does Apple put "pain in the butt" constraints on their suppliers/partners too? Yes. But generally there is good and bad, and the good mostly balances out the 'bad'.


Apple slaps their logo on and charges premiums when their products are not faster. There are plenty of companies who use Nvidia to this day, and Apple choosing not to is them believing their graphics are superior.

There is little indication that Apple thinks their graphics are superior across the board in every situation. In embedded/iGPUs, it is on perf/watt criteria.

Where Nvidia's antics attack and subvert Metal (and some other areas that are cross-product strategic to Apple), that is an entanglement most of the "plenty of companies who do use Nvidia" don't have. And Nvidia doesn't try to throw some of them under the bus. A full-on, direct "embrace, extend, extinguish" attack on DirectX... didn't happen. There is a détente workaround between Nvidia's closed-source nature and base Linux distributions, where they don't overlap much in objectives.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053

Errr. From the article referenced:

" .. Apple Glasses are further ahead, and little is known about these as yet. We’re unlikely to see these before 2025, if then. ..."

The article isn't about "Apple Glasses".


And since they've hired guys like this

I don't expect the actual glasses to come out this year though, per the rumors including their own design team warning them to wait until they're ready. I do expect it to be some stupid VR goggles because Timmy is desperate, but if they are glasses, you bet your ass they will be just like those nreal.

Apple's goggles focus isn't primarily on VR. The first referenced article describes a mixed AR/VR solution, not really the glasses.


The more the device has to lean on cameras to provide reality, the less likely you are to make it practical inside the standard frame size of normal glasses. Also, the more fancy 'gee-whiz' electronics piled on top, the higher the power requirements, which leads to bigger batteries. Same issue. Lightweight, AR-only is going to be a matter of scaling back on processing overhead and battery requirements. Far back.

And so that "wait for AR" design-team request is more of a 'dog ate my homework' and/or 'sibling rivalry' excuse. Delay product X because product Y isn't coming for many years? Delaying the MBA because the Mac Pro isn't ready, or vice versa, doesn't make much sense at all.
 

SDAVE

macrumors 68040
Jun 16, 2007
3,578
601
Nowhere
I think the 'beef' factor is overblown.

Apple don't have any beef with AMD (AFAIK) - it's just that ASi doesn't support PCIe GPUs. If Apple ever intended it to do so, they could have used the last few years to refine the software stack for it, by enabling support for TB eGPUs (which are essentially the same thing). The only Mac that could support a new AMD GPU at this point would be the 2019 Mac Pro, and they hardly want to prolong its lifespan / raise the bar for its successor.

Apple probably wouldn't deal with Nvidia because a) as the market leader, they won't give Apple the volume discounts AMD will, and b) they won't agree / can't be trusted to keep CUDA off macOS, which would be unwanted competition for Metal. And as ASi doesn't support PCIe GPUs, it's moot now anyway.

I know more about the AMD/NVIDIA GPU situation than most, since I actually got responses from Jensen (NVIDIA's CEO) and the AMD GPU team back in the day.

Apple has (or had) AMD engineers in house to build the AMD drivers, but most likely got rid of them now that it doesn't use AMD for any of its GPUs.

For NVIDIA, MacBook GPUs dying was one of the major reasons Apple severed ties with NVIDIA. Another issue was that NVIDIA was not willing to make custom GPUs for Apple, and Steve has been quoted as saying they don't need closed-source tech like CUDA.

But as we know, NVIDIA has made the superior GPUs for a very long time now. I think Mac Pros would benefit from NVIDIA and AMD, but not the other Mac models.
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
And a hefty fraction of the folks grumbling about RDNA 3 only really want to buy an off-the-shelf (non-Apple MPX) product to sidestep paying for drivers. Is Apple really going to be in a big hurry to rush into that situation? Probably not.
The difference in price is also due to the fact that MPX models are custom designs with small production runs, whereas off-the-shelf PC GPUs have large economies of scale / competition between vendors.

A major contributor to Apple adopting a new architecture family from AMD (or Intel or Nvidia) was putting a major fraction of that family into new Mac products (not one solitary product... multiple ones). Apple dropping AMD from the many millions of Macs sold is going to have an impact (the Mac Pro all by itself isn't anywhere near 'unit millions' at all).
Good point. With all other Mac models now using ASi, the entire macOS driver development cost for new AMD GPUs would need to be borne by the relatively small number of sales to MP users - a significant ‘Apple tax’.
 
Last edited:
  • Like
Reactions: maikerukun

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
NVIDIA was not willing to make custom GPUs
Were AMD? Even if so, this was presumably only necessary for laptops, and assuming that Nvidia GPUs weren’t suitable already. No reason to ‘ban‘ Nvidia GPUs from Mac Pros as well. Perhaps it’s only cost-effective to write drivers if they are also being used in volume products like laptops? Though Nvidia seemed prepared to provide web drivers for Mac users.
 
Last edited:

prefuse07

Suspended
Jan 27, 2020
895
1,073
San Francisco, CA
Errr. From the article referenced:

" .. Apple Glasses are further ahead, and little is known about these as yet. We’re unlikely to see these before 2025, if then. ..."

The article isn't about "Apple Glasses".




Apple's goggles focus isn't primarily on VR. The first referenced article describes a mixed AR/VR solution, not really the glasses.


The more the device has to lean on cameras to provide reality, the less likely you are to make it practical inside the standard frame size of normal glasses. Also, the more fancy 'gee-whiz' electronics piled on top, the higher the power requirements, which leads to bigger batteries. Same issue. Lightweight, AR-only is going to be a matter of scaling back on processing overhead and battery requirements. Far back.

And so that "wait for AR" design-team request is more of a 'dog ate my homework' and/or 'sibling rivalry' excuse. Delay product X because product Y isn't coming for many years? Delaying the MBA because the Mac Pro isn't ready, or vice versa, doesn't make much sense at all.

Cool.

In 2018, Apple purchased a startup that builds AR glasses


Here is a demo delivered at the SIGGRAPH conference by Stanford's AR/VR department illustrating pancake lens tech, which will be featured in the glasses. Rumor has it Apple has partnered with Stanford on this.

Feel free to start a thread on this subject and continue it if you are so interested. I honestly couldn't care less, but just wanted to share a few things, since Zombie brought it all up above.

I really didn't want to throw this thread off topic though, and I think we should get back to talking about the Mac Pro...
 

SDAVE

macrumors 68040
Jun 16, 2007
3,578
601
Nowhere
Were AMD? Even if so, this was presumably only necessary for laptops, and assuming that Nvidia GPUs weren’t suitable already. No reason to ‘ban‘ Nvidia GPUs from Mac Pros as well. Perhaps it’s only cost-effective to write drivers if they are also being used in volume products like laptops? Though Nvidia seemed prepared to provide web drivers for Mac users.

AMD was. They were willing to go above and beyond for Apple as a customer; NVIDIA wasn't.

The web drivers were a hack job, and so were the macOS CUDA drivers. I had random issues all the time when I used them (green glitches, etc.). Apple has moved on to Metal since then, and I'm not sure how monumental the job would be for NVIDIA to write drivers for Apple.
 

SDAVE

macrumors 68040
Jun 16, 2007
3,578
601
Nowhere
Because AMD needed whatever customers it could get. Nvidia didn't need Apple.
AMD is just a better company in that regard; they make custom SoCs for consoles like the PS5/Xbox. NVIDIA just does its own thing and is still the leader in the GPU market.

Like I mentioned, Apple had an AMD team internally that would write and optimize AMD drivers for macOS (this was confirmed to me by someone who works at Apple). That would never happen with NVIDIA, as NVIDIA is uncooperative. Even their GPU for the Nintendo Switch was a disaster.

Not to mention that NVIDIA has too many proprietary technologies like DLSS, CUDA, etc. AMD has similar technologies, but they are open source (NVIDIA still beats them out, though).
 

PineappleCake

Suspended
Feb 18, 2023
96
252
Even their GPU for the Nintendo Switch was a disaster.
Please tell how it was a disaster, and if you say it's because it was weak, that falls on Nintendo. As I say later on, Nvidia was very cooperative in helping make an API. I don't get why you equate weak with "disaster". (Going by your "method", the M1 Ultra GPU is a disaster because it's weaker than an RTX 4090.)

The Tegra GPU was not powerful, and it's just old by today's standards. It was Nintendo's choice to use the Tegra chip, with its Maxwell architecture, which is 2014 tech. Nintendo used it to save money.

The fact that the Switch is still usable at all in 2023 with that GPU tech is thanks to optimization of the Switch API, which is considered to be the most efficient graphics API.
Nvidia and Nintendo worked together to make this API; it's called "nvn". Look, it's Nvidia working together with a company. "nvn" is what Nintendo exclusive games use, and the API is used by third-party devs as well.

Expect the next Nintendo system to be based on 2020 Ampere GPU tech, which has DLSS and a much faster GPU; that's what the rumours say.


It's like Apple still selling the 2013 Mac Pro in 2019: it was Apple's choice to sell old tech. Mind you, that's even worse than Nintendo using 2014 tech in a 2016 product, because the Mac Pro 2013 was freaking $3K.



Like I mentioned, Apple had an AMD team internally that would write and optimize AMD drivers for macOS (this was confirmed to me by someone who works at Apple). That would never happen with NVIDIA, as NVIDIA is uncooperative. Even their GPU for the Nintendo Switch was a disaster.

Not to mention that NVIDIA has too many proprietary technologies like DLSS, CUDA, etc. AMD has similar technologies, but they are open source (NVIDIA still beats them out, though).
What about Apple? Apple's whole hardware and OS stack is proprietary; at least Nvidia supports Linux and Windows. It's good to have options. DLSS and CUDA are better than AMD's counterparts.
AMD is just a better company in that regard; they make custom SoCs for consoles like the PS5/Xbox. NVIDIA just does its own thing and is still the leader in the GPU market.
They do because the PS and Xbox want x86, whereas Nintendo wanted an ARM SoC because of battery life and because the Switch is also a handheld; AMD does not make ARM SoCs.
 
Last edited:

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Like I mentioned, Apple had an AMD team internally that would write and optimize AMD drivers for macOS (this was confirmed to me by someone who works at Apple). That would never happen with NVIDIA, as NVIDIA is uncooperative.
I've heard a lot of reasons why Apple and Nvidia stopped working together. This is definitely one of them. AMD put their Radeon driver engineers on site at Apple. A lot of people think (mistakenly) Apple writes the Radeon drivers. They don't. But the AMD engineers are on site.

I've also heard a bit about custom GPUs. The rumor I heard was that AMD had a deal with Apple where Apple could fab their own Radeon GPUs. So they could do things like fab the same GPU on a smaller process. Nvidia would not give them the same deal without extra conditions Apple didn't find acceptable.
 

PineappleCake

Suspended
Feb 18, 2023
96
252
I've heard a lot of reasons why Apple and Nvidia stopped working together. This is definitely one of them. AMD put their Radeon driver engineers on site at Apple. A lot of people think (mistakenly) Apple writes the Radeon drivers. They don't. But the AMD engineers are on site.
This would be so easy to solve if Apple opened up. Nvidia writes drivers for Linux and Windows from their own labs, but they can't for Apple because of Apple's "requirements" and macOS being fully closed.
 

prefuse07

Suspended
Jan 27, 2020
895
1,073
San Francisco, CA
(Going by your "method", the M1 Ultra GPU is a disaster because it's weaker than an RTX 4090.)
The M1 Ultra GPU IS a disaster though; it gets easily smoked by a freakin' 3-year-old RDNA 2 card. I'm seriously tired of the Mac Studio fan club, and equally tired of proving this.

In fact, you honestly shouldn't even put the 4090 and M1 in the same sentence, because that's like comparing a Geo Metro with a Lamborghini Diablo.
 