
goMac

macrumors 604
Apr 15, 2004
7,663
1,694
This is so easy to solve if Apple opens up. Nvidia writes drivers for Linux and Windows from their own labs, but they can't for Apple because of Apple's "requirements" and macOS being fully closed.
That's not how it works.

The problem is that as Apple updates macOS and Metal, they want the engineers doing that work in house. Open sourcing or allowing drivers from outside doesn't fix "we have a new graphics API that's not out yet and we want to have meetings about it internally before the first preview at WWDC."

Apple doesn't want to release Metal 3 and have AMD/Nvidia write the drivers afterwards. Apple wants the drivers written as Metal 3 is being designed. That means engineers on site.

Could everyone work in remote sealed rooms with pre-release hardware and software? Sure. But Apple's "we prefer everyone to work in the office" applies here as well.

Apple Silicon GPUs also implicitly solve this problem. You don't have to deal with bringing in outside engineers to help write your new version of Metal when the GPU engineers are inside Apple.
 

PineappleCake

Suspended
Feb 18, 2023
96
252
The M1 Ultra GPU IS a disaster though, and actually it gets easily smoked by a freakin' 3-year-old RDNA2 card. I'm seriously tired of the Mac Studio fanclub, and equally tired of proving this.

In fact, you honestly shouldn't even put the 4090 and M1 in the same sentence, because that's like comparing a Geo Metro with a Lamborghini Diablo.
Yeah, true, the M1 Ultra is a disaster. I should have worded it better. Here goes:
I should have said the 4070 Ti is not a disaster just because it's weaker than the 4090. They both serve their markets. Nintendo chose that SoC (GPU) to save costs and hit the $300 price point.

I am not denying it's outdated now. It really is; they should update it soon.
---
I am not a Mac Studio fan. I am a MacBook Pro fan and always wanted a Mac Pro. But Apple makes that hard for me. :(
 

avro707

macrumors 68020
Dec 13, 2010
2,263
1,654
The M1 Ultra GPU IS a disaster though, and actually it gets easily smoked by a freakin' 3-year-old RDNA2 card. I'm seriously tired of the Mac Studio fanclub, and equally tired of proving this.

In fact, you honestly shouldn't even put the 4090 and M1 in the same sentence, because that's like comparing a Geo Metro with a Lamborghini Diablo.

The best thing that could happen is the Studio is cancelled and the next Mac Pro remains massively expandable, but with a lower entry-level price point to try and appease those who are so upset about the 7,1's high prices.

And for heaven's sake, open up to Nvidia again.
 

mattspace

macrumors 68040
Jun 5, 2013
3,344
2,975
Australia
The best thing that could happen is the Studio is cancelled and the next Mac Pro remains massively expandable, but with a lower entry-level price point to try and appease those who are so upset about the 7,1's high prices.

And for heaven's sake, open up to Nvidia again.

But the Studio is just the big iMac, with the screen separated. That's all it is. Its coexistence with the Mac Pro is just as stable as the big iMac's was. Some things it will be faster at, just like the big iMac was.

Maybe Apple listening to the chorus of "stop making us buy a new screen when we want to buy a new computer" will be indicative of a trend.
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
In fact, you honestly shouldn't even put the 4090 and M1 in the same sentence, because that's like comparing a Geo Metro with a Lamborghini Diablo.
It's even worse than that. The Studio is designed like a Lamborghini Diablo, is marketed and sold as if it's a Lamborghini Diablo, but has a Geo Metro engine.

Meanwhile car people would rather have a Geo Metro with a 6.0L V12 than a toothless Lambo. :D

Computer people are much the same. Just give us performance, Apple!
 

Adult80HD

macrumors 6502a
Nov 19, 2019
701
837
The M1 Ultra GPU IS a disaster though, and actually it gets easily smoked by a freakin' 3-year-old RDNA2 card. I'm seriously tired of the Mac Studio fanclub, and equally tired of proving this.

In fact, you honestly shouldn't even put the 4090 and M1 in the same sentence, because that's like comparing a Geo Metro with a Lamborghini Diablo.
Ehh, I beg to disagree. It depends on the tasks you are looking to achieve. For 3D rendering yes, but for many other tasks no, not at all. I have a suite of different machines at work for different tasks. We are heavy users of Lightroom and Photoshop, as well as Keyshot for 3D rendering. For Keyshot it's no comparison, but that's because Luxion still doesn't support Metal, so it's not really a true comparison since they don't use the GPU on the Mac.

As a great example of the power of the Studio Ultra, I have done some benchmarking of the recent release of AI Denoise in Lightroom, which uses Nvidia's tensor cores and Apple's Neural Engine: the M1 Ultra perfectly matches the performance of an Nvidia RTX 4080. It actually outperforms the RTX using Topaz DeNoise AI, which also uses the tensor cores and Neural Engine. BTW, the Ultra also crushes the W6800X Duo in my 2019 Mac Pro on these tasks.
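
(For anyone curious how one might measure this sort of thing, here's a rough, hypothetical sketch of a timing harness using Core ML in Swift; it is not the exact setup I used, and the model URL and input provider are placeholders. Comparing .cpuOnly against .all shows roughly what the Neural Engine contributes.)

Code:
import CoreML
import Foundation

// Hypothetical harness: time one Core ML inference under a given
// compute-unit setting. Compare .cpuOnly vs .all (CPU/GPU/ANE).
func timeInference(modelURL: URL,
                   input: MLFeatureProvider,
                   units: MLComputeUnits) throws -> TimeInterval {
    let config = MLModelConfiguration()
    config.computeUnits = units
    let model = try MLModel(contentsOf: modelURL, configuration: config)
    _ = try model.prediction(from: input)   // warm-up run (compilation, caches)
    let start = Date()
    _ = try model.prediction(from: input)
    return Date().timeIntervalSince(start)
}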

Way too much focus is on 3D performance and synthetic benchmarks; GPUs and neural engines/tensor cores have much greater uses than that, and in real-world use the performance delta isn't what some make it out to be.
 
  • Like
Reactions: aytan and AdamBuker

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
Ehh, I beg to disagree. It depends on the tasks you are looking to achieve. For 3D rendering yes, but for many other tasks no, not at all. I have a suite of different machines at work for different tasks. We are heavy users of Lightroom and Photoshop, as well as Keyshot for 3D rendering. For Keyshot it's no comparison, but that's because Luxion still doesn't support Metal, so it's not really a true comparison since they don't use the GPU on the Mac.

As a great example of the power of the Studio Ultra, I have done some benchmarking of the recent release of AI Denoise in Lightroom, which uses Nvidia's tensor cores and Apple's Neural Engine: the M1 Ultra perfectly matches the performance of an Nvidia RTX 4080. It actually outperforms the RTX using Topaz DeNoise AI, which also uses the tensor cores and Neural Engine. BTW, the Ultra also crushes the W6800X Duo in my 2019 Mac Pro on these tasks.

Way too much focus is on 3D performance and synthetic benchmarks; GPUs and neural engines/tensor cores have much greater uses than that, and in real-world use the performance delta isn't what some make it out to be.

We agree on synthetic benchmarks. We disagree on 3D. If you work in 3D, it is that big a deal. Heck, people were running games with way better frame rates on a 5,1 with a single 6800 XT. If AR/VR becomes a 'thing' with the goggles/glasses, then it will be a further big deal. But yeah, if you don't do 3D stuff, the delta/gap is smaller and often falls in favor of the Studio.
 

Adult80HD

macrumors 6502a
Nov 19, 2019
701
837
We agree on synthetic benchmarks. We disagree on 3D. If you work in 3D, it is that big a deal. Heck, people were running games with way better frame rates on a 5,1 with a single 6800 XT. If AR/VR becomes a 'thing' with the goggles/glasses, then it will be a further big deal. But yeah, if you don't do 3D stuff, the delta/gap is smaller and often falls in favor of the Studio.
I don't think we disagree on 3D at all; that is where it (the Studio, and Apple Silicon in general) falls short. My point is that's not a huge market overall, and Apple clearly didn't position the Studio as the product for that space. That's supposed to be the Mac Pro... which is why there are folks here twiddling their thumbs wondering what Apple is up to. In the meantime, if you have to have that performance, you basically have to do what I've done: get an annoying Windows PC just to run those apps.
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
One would assume Apple wants to demo a 3D app that is highly optimized for Apple silicon & Metal when they debut the ASi Mac Pro, hence all the work on a "Full Metal" Blender...?

IMHO, it would make sense to debut the ASi Mac Pro with the most advanced Apple silicon possible, so M3 Ultra / M3 Extreme SoCs built on a 3nm process with A17-derived cores & hardware ray-tracing...
 

Adult80HD

macrumors 6502a
Nov 19, 2019
701
837
One would assume Apple wants to demo a 3D app that is highly optimized for Apple silicon & Metal when they debut the ASi Mac Pro, hence all the work on a "Full Metal" Blender...?

IMHO, it would make sense to debut the ASi Mac Pro with the most advanced Apple silicon possible, so M3 Ultra / M3 Extreme SoCs built on a 3nm process with A17-derived cores & hardware ray-tracing...
I agree, and IMO that's the reason for the delay. The M2-based ones the rumors are based on are probably just prototype/design verification pieces and the "real" one will be M3-based to squelch the chatter that they can't make a "Pro" with Apple silicon. The other new part numbers are the updated M2 Max and M2 Ultra Studios. I guess we'll find out soon enough.
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
I agree, and IMO that's the reason for the delay. The M2-based ones the rumors are based on are probably just prototype/design verification pieces and the "real" one will be M3-based to squelch the chatter that they can't make a "Pro" with Apple silicon. The other new part numbers are the updated M2 Max and M2 Ultra Studios. I guess we'll find out soon enough.

Regarding the product strings...

An M3-based ASi Mac Pro would most likely have a Mac15,x product string...?
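
(For reference, the "product string" is the machine's model identifier. A quick hypothetical sketch in Swift of how one can read it locally, via sysctl's hw.model key:)

Code:
import Foundation

// Read the Mac's model identifier (the "product string"), e.g. "Mac13,2".
func modelIdentifier() -> String? {
    var size = 0
    guard sysctlbyname("hw.model", nil, &size, nil, 0) == 0 else { return nil }
    var buffer = [CChar](repeating: 0, count: size)
    guard sysctlbyname("hw.model", &buffer, &size, nil, 0) == 0 else { return nil }
    return String(cString: buffer)
}

print(modelIdentifier() ?? "unknown")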
 

Adult80HD

macrumors 6502a
Nov 19, 2019
701
837
Regarding the product strings...

An M3-based ASi Mac Pro would most likely have a Mac15,x product string...?
Assuming the 14 is for the M2 generation, then yes. But IMO those would be an M2 iMac and two new Mac Studios. Again, just guessing, but I feel like Apple knows there's no way they can release an M2-based Mac Pro unless there's something else *really* special about it.
 
  • Like
Reactions: Boil

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
But then there is this...

Highly doubtful Apple would put any Mn Max into a Mac Pro, because of slot support...?

But it does bring up the fact that a product string does not always mean there will be a shipping product...?

So maybe the Mac14,8 is the M2 Ultra Mac Pro, but it was deemed not performant enough to represent the top-of-the-line Apple silicon Mac product...?

I am hoping WWDC 2023 comes around and we see Mac15,1 & Mac15,2, aka the M3 Ultra Mac Pro & the M3 Extreme Mac Pro...!

Even better would be two more Mac Pro products, the Mac15,3 & the Mac15,4; these would be ASi Mac Pros with asymmetrical SoC configurations, pairing "regular" M3 Max SoCs with "GPU-specific" SoCs; lower CPU core counts, but higher GPU core counts...

And the existence of "GPU-specific" SoCs could be the building blocks for ASi (GP)GPUs...?

Now if Apple had some way to tie the main system SoC to one or more ASi (GP)GPUs, maintaining the whole UMA thing & appearing to the OS as a single entity, you could increase both RAM capacity & GPU power by adding in a few ASi (GP)GPUs...!
 

SDAVE

macrumors 68040
Jun 16, 2007
3,578
601
Nowhere
Please tell how it was a disaster, and if you say because it was weak, that falls on Nintendo. As I say later on, Nvidia was very cooperative in helping make an API. I don't get why you equate weak with "disaster". (Going by your "method", the M1 Ultra GPU is a disaster because it's weaker than an RTX 4090.)

The console was hacked due to NVIDIA's horrible SoC. Look it up. This caused massive piracy issues for Nintendo.
They were also known to overheat, coming close to a recall program.

What about Apple? Apple's whole hardware and OS stack is proprietary; at least Nvidia supports Linux and Windows. It's good to have options. DLSS and CUDA are better than AMD's counterparts.
Apple uses both open source and closed source software; as much as people hate on Apple, it has done a lot for the industry in terms of standardization. It supported OpenCL and OpenGL. It had to move to Metal because it gives more direct access to the GPU. Apple makes the whole computer, while NVIDIA just sells parts, so it makes more sense for Apple to build proprietary systems than it does for NVIDIA. NVIDIA also doesn't care about 3rd parties; look at what they did to EVGA recently. It's a selfish corporation (but which corpo isn't selfish anyway?). So overall Apple had issues with NVIDIA and went to AMD for a while, who treated them better.
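
To illustrate the "direct access" point, here is a minimal sketch of a Metal compute dispatch in Swift (a hypothetical example, not Apple's sample code; the add_arrays kernel name is a placeholder for a shader compiled into the app bundle). The app explicitly owns the device, queue, pipeline, and command encoding, rather than going through OpenGL's implicit driver state:

Code:
import Metal

// Minimal sketch of a Metal compute dispatch.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),             // .metal shaders in the app bundle
      let kernel = library.makeFunction(name: "add_arrays")  // hypothetical kernel name
else { fatalError("Metal unavailable") }

let pipeline = try! device.makeComputePipelineState(function: kernel)

// Buffers in shared (unified) memory, visible to both CPU and GPU.
let count = 1024
let bytes = count * MemoryLayout<Float>.stride
let a = device.makeBuffer(length: bytes, options: .storageModeShared)!
let b = device.makeBuffer(length: bytes, options: .storageModeShared)!
let out = device.makeBuffer(length: bytes, options: .storageModeShared)!

// Encode and submit the work explicitly.
let cmd = queue.makeCommandBuffer()!
let enc = cmd.makeComputeCommandEncoder()!
enc.setComputePipelineState(pipeline)
enc.setBuffer(a, offset: 0, index: 0)
enc.setBuffer(b, offset: 0, index: 1)
enc.setBuffer(out, offset: 0, index: 2)
enc.dispatchThreads(MTLSize(width: count, height: 1, depth: 1),
                    threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()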

They do because PS/Xbox want x86, whereas Nintendo wanted an ARM SoC for battery life (the Switch is also a handheld), and AMD does not make ARM SoCs.

It has nothing to do with that. AMD can do ARM SoCs if necessary. My point was that AMD works with partners to make them custom hardware, while NVIDIA is less friendly.

The Steam Deck, which uses AMD, is a good example. The Switch is the only NVIDIA-based console, and Nintendo will dump NVIDIA in the next run.

I've also heard a bit about custom GPUs. The rumor I heard was that AMD had a deal with Apple where Apple could fab their own Radeon GPUs. So they could do things like fab the same GPU on a smaller process. Nvidia would not give them the same deal without extra conditions Apple didn't find acceptable.

That makes sense. NVIDIA had to go to Samsung for fabbing and use a larger process node because AMD and Apple are top-tier clients of TSMC, the best chip manufacturer in the world. Maybe they shared resources.
 

prefuse07

Suspended
Jan 27, 2020
895
1,073
San Francisco, CA
[attached image: Screenshot 2023-04-25 141025.png]


SOURCE

🤣🤣🤣
 
  • Like
Reactions: maikerukun

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
AMD was. They were willing to go above and beyond with Apple as a customer, NVIDIA wasn't.

I've also heard a bit about custom GPUs. The rumor I heard was that AMD had a deal with Apple where Apple could fab their own Radeon GPUs. So they could do things like fab the same GPU on a smaller process.

Can either of you point to any GPU that AMD (even partially) custom-designed for Apple?

Apple fabbing AMD GPUs (via TSMC or equivalent) on a smaller process than AMD itself sounds very unlikely. For a start, I don’t think it’s as simple as just making the same design on a smaller process node - it needs to be reworked first.
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,173
Stargate Command
Apple fabbing AMD GPUs (via TSMC or equivalent) on a smaller process than AMD itself sounds very unlikely. For a start, I don’t think it’s as simple as just making the same design on a smaller process node - it needs to be reworked first.

Nah bruv, you just scale it down in MS Paint...
 
  • Like
Reactions: mode11

PineappleCake

Suspended
Feb 18, 2023
96
252
The console was hacked due to NVIDIA's horrible SoC. Look it up. This caused massive piracy issues for Nintendo.
They were also known to overheat, coming close to a recall program.


Apple uses both open source and closed source software; as much as people hate on Apple, it has done a lot for the industry in terms of standardization. It supported OpenCL and OpenGL. It had to move to Metal because it gives more direct access to the GPU. Apple makes the whole computer, while NVIDIA just sells parts, so it makes more sense for Apple to build proprietary systems than it does for NVIDIA. NVIDIA also doesn't care about 3rd parties; look at what they did to EVGA recently. It's a selfish corporation (but which corpo isn't selfish anyway?). So overall Apple had issues with NVIDIA and went to AMD for a while, who treated them better.



It has nothing to do with that. AMD can do ARM SoCs if necessary. My point was that AMD works with partners to make them custom hardware, while NVIDIA is less friendly.

The Steam Deck, which uses AMD, is a good example. The Switch is the only NVIDIA-based console, and Nintendo will dump NVIDIA in the next run.



That makes sense. NVIDIA had to go to Samsung for fabbing and use a larger process node because AMD and Apple are top-tier clients of TSMC, the best chip manufacturer in the world. Maybe they shared resources.
I am not going to argue, but I do believe the next Nintendo console will use Nvidia tech. That's all.

In 2023, Nvidia has better GPU tech than AMD. Lovelace (RTX 40) is better than RDNA 3, and DLSS is just better too.
 
  • Like
Reactions: maikerukun

jmho

macrumors 6502a
Jun 11, 2021
502
996
I don't think we disagree on 3D at all; [...]. My point is that's not a huge market overall
I think on one hand you're correct that, specifically for the users Apple targets, powerful GPUs are somewhat of a niche requirement - but Nvidia is the 7th most valuable company in the world, dwarfing Intel and AMD put together. In the wider world the market for GPUs is absolutely huge, and it's probably only going to get bigger as AI takes off and companies try to make VR happen.

It almost feels like a chicken-and-egg problem: most Apple users don't typically require powerful GPUs because Apple doesn't sell powerful GPUs (at least not at reasonable prices), so most users who need powerful GPUs aren't Apple users.
 

mode11

macrumors 65816
Jul 14, 2015
1,452
1,172
London
With a few exceptions, Apple computers have always had lukewarm GPUs. I think Apple just resents their power consumption and the consequent need for thicker enclosures.
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Apple fabbing AMD GPUs (via TSMC or equivalent) on a smaller process than AMD itself sounds very unlikely. For a start, I don’t think it’s as simple as just making the same design on a smaller process node - it needs to be reworked first.
Correct. And if you think about the level of access you'd need to do something like that, you can understand why Nvidia and Apple might have had trouble in discussions.
 