
toke lahti

macrumors 68040
Original poster
Apr 23, 2007
3,293
509
Helsinki, Finland
As mentioned in a certain video that brings a hint of gaming to AS Macs: no support for dGPUs == no more eGPUs?
Really?
How sure can we be of this?

"Hardly no iOS users are using eGPU with their devices."
 

Joe The Dragon

macrumors 65816
Jul 26, 2006
1,031
524
So no high-end pro system with multi-head output? Or will all Mac Pros come with built-in GPU cores, locked to a maximum video output at the cost of CPU cores?
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
I would like to point out that "ARM Macs not having dedicated GPUs" is still just a rumor, and is based on shaky logic.

Let's apply some critical thinking here:
1. Integrated GPUs exist for current Apple processors because there's no reason to fit a dedicated GPU on an iPhone or iPad. For small devices like these, it makes sense to have the whole system on a single chip.
2. Making a HEDT-class processor with a HEDT-class GPU would be prohibitively expensive and would lower the performance ceiling of both the CPU and GPU. Not really an option for the "Pro"-level machines. And it would be much easier, technically and financially, to have dedicated dies for a separate Apple CPU and GPU.
3. Dedicated GPUs are used in higher-end "Pro" Apple products such as the 27-inch iMac, the iMac Pro, the 16-inch MacBook Pro, and the Mac Pro. Unless Apple's willing to compromise the performance of all these products, a dedicated GPU is necessary.
 

cmChimera

macrumors 601
Feb 12, 2010
4,308
3,844
It seems unlikely to me that Apple would do away with dGPUs at this juncture. It's possible that they start making their own dGPUs, though. But I think the first 16" MacBook Pro with Apple silicon will continue to have an AMD GPU.
 

JohnnyGo

macrumors 6502a
Sep 9, 2009
957
620
To simplify production and allow for upgrades, I can see Apple doing away with dedicated GPUs on the entire Mac line.

I would never, however, envision Macs without eGPU support, given Thunderbolt 4 and the time Apple has spent making sure eGPUs are first-class citizens. Mac Pros are a different beast, as the MPX modules will allow for dedicated GPUs.

For their higher-end models, my guess is that Apple might move the GPU out of the current SoC into a two-chip layout, like Intel and AMD have done, but those are still considered iGPUs.
 

konqerror

macrumors 68020
Dec 31, 2013
2,298
3,701
2. Making a HEDT-class processor with a HEDT-class GPU would be prohibitively expensive and would lower the performance ceiling of both the CPU and GPU. Not really an option for the "Pro"-level machines. And it would be much easier, technically and financially, to have dedicated dies for a separate Apple CPU and GPU.

You have to realize how inefficient Intel is on the low-volume HEDT chips. People are finding the XCC die (28-core) in 12- and 16-core Xeon Ws. They're shipping chips with half the die unused.
 
  • Like
Reactions: tuvok86

awesomedeluxe

macrumors 6502
Jun 29, 2009
262
105
This YouTuber's "evidence" that AS Macs do not support dGPU / eGPU can of course be questioned?
Good watch! I think this one by the same guy might be what you meant to link, but they are both good videos.

But all that was said was no support for AMD graphics - not no discrete graphics. In fact, he explicitly says "we should expect Apple to make some form of dedicated graphics chips." So dGPUs are far from dead - they're entering a very exciting era.
 

MikhailT

macrumors 601
Nov 12, 2007
4,583
1,327
Apple's WWDC videos kept emphasizing the unified memory model of their SoCs, which reuses the same memory between their custom CPU and GPU. They explain why it is a huge benefit for their SoC and how their GPU is different from traditional dGPUs. Why brag about it and then offer dGPUs?

I don't think we'll ever see dGPUs for consumer SKUs again.

As for Pro SKUs (iMac Pro / Mac Pro), we won't know much until year two, but I think the iMac Pro will go away and the Mac Pro is the only one where they might support dGPUs. My worry is that Apple is going to offer Afterburner-style cards for common workloads, not dGPUs.
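For anyone curious what that unified memory model looks like in code, here's a minimal Metal sketch (the sample data is made up; the calls are the documented Metal APIs):

Code:
import Metal

// A minimal sketch of the zero-copy model Apple describes: with
// .storageModeShared the CPU and GPU work over the same allocation,
// so there's no explicit upload/download step.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

var input: [Float] = [1, 2, 3, 4] // made-up sample data
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// The CPU can read the very same memory a GPU kernel would operate on:
let view = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
print(view[0]) // 1.0

On a discrete GPU, the equivalent resource would typically live in .storageModePrivate or .storageModeManaged, with explicit blit copies across PCIe.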
 

UltimateSyn

macrumors 601
Mar 3, 2008
4,967
9,205
Massachusetts
I would like to point out that "ARM Macs not having dedicated GPUs" is still just a rumor, and is based on shaky logic.
Simply not true.

"And to know if a GPU needs to be treated as integrated or discrete, use the isLowPower API. Note that for Apple GPUs isLowPower returns False, which means that you should treat these GPUs in a similar way as discrete GPUs. This is because the performance characteristics of Apple GPUs are in line with discrete ones, not the integrated ones. Despite the property name though, Apple GPUs are also way, way more power-effficient than both integrated and discrete GPUs."

"Intel-based Macs contain a multi-core CPU and many have a discrete GPU ... Machines with a discrete GPU have separate memory for the CPU and GPU. Now, the new Apple Silicon Macs combine all these components into a single system on a chip, or SoC. Building everything into one chip gives the system a unified memory architecture. This means that the CPU and GPU are working over the same memory."
 

konqerror

macrumors 68020
Dec 31, 2013
2,298
3,701
Apple's WWDC videos kept emphasizing the unified memory model of their SoCs, which reuses the same memory between their custom CPU and GPU. They explain why it is a huge benefit for their SoC and how their GPU is different from traditional dGPUs. Why brag about it and then offer dGPUs?

Marketing. Everything they said is true about the crappiest Intel integrated graphics today. Unified memory: check. No PCI-E bus overhead: check. Zero-copy: check.
 
  • Like
Reactions: adib

throAU

macrumors G3
Feb 13, 2012
9,198
7,346
Perth, Western Australia
I suspect eGPU will not be explicitly killed off, but it will rely on the GPU vendor to write an Apple Silicon/ARM-compatible driver.

Which neither Nvidia nor AMD is likely to do unless there's a wholesale industry shift to ARM-based designs (which MAY happen, but not immediately).
 

johngwheeler

macrumors 6502a
Dec 30, 2010
639
211
I come from a land down-under...
I would like to point out that "ARM Macs not having dedicated GPUs" is still just a rumor, and is based on shaky logic.

Let's apply some critical thinking here:
1. Integrated GPUs exist for current Apple processors because there's no reason to fit a dedicated GPU on an iPhone or iPad. For small devices like these, it makes sense to have the whole system on a single chip.
2. Making a HEDT-class processor with a HEDT-class GPU would be prohibitively expensive and would lower the performance ceiling of both the CPU and GPU. Not really an option for the "Pro"-level machines. And it would be much easier, technically and financially, to have dedicated dies for a separate Apple CPU and GPU.
3. Dedicated GPUs are used in higher-end "Pro" Apple products such as the 27-inch iMac, the iMac Pro, the 16-inch MacBook Pro, and the Mac Pro. Unless Apple's willing to compromise the performance of all these products, a dedicated GPU is necessary.

Have you seen the recent series of YouTube videos from Max Tech?




There is plenty in Apple's own developer presentations at WWDC to indicate that they are looking at integrated GPUs on the SoC. There's even a slide with a table that excludes Nvidia & AMD GPUs from the "GPU" column.

I'm not saying that they will *never* build a dGPU, and this may be necessary for the Mac Pro (something modular, similar to the Afterburner card).

But the first Apple Silicon Macs are very unlikely to have a dGPU as we have today. It's possible that in the MBP16 / iMac the GPU is "on-package" but not included in the SoC, i.e. built into the same chip package but on a separate silicon die.

Given that modern consoles (Xbox Series X & PS5) both have integrated GPUs (with the Xbox only 10% behind an Nvidia 2080 Ti at 12 TFLOPS), what is the technical limitation of producing a fast on-SoC GPU? Happy to be educated!
 

konqerror

macrumors 68020
Dec 31, 2013
2,298
3,701
I suspect eGPU will not be explicitly killed off, but it will rely on the GPU vendor to write an Apple Silicon/ARM-compatible driver.

Which neither Nvidia nor AMD is likely to do unless there's a wholesale industry shift to ARM-based designs (which MAY happen, but not immediately).

Nvidia already has Linux ARM drivers because they sell ARM chips (e.g. the Nintendo Switch), and because there are supercomputers on ARM. They have POWER drivers too.

AMD's open-source Linux drivers also work on ARM, since AMD sells ARM CPUs themselves as well.
 
  • Like
Reactions: Michael Adams

throAU

macrumors G3
Feb 13, 2012
9,198
7,346
Perth, Western Australia
Nvidia already has Linux ARM drivers because they sell ARM chips (e.g. the Nintendo Switch), and because there are supercomputers on ARM. They have POWER drivers too.

AMD's open-source Linux drivers also work on ARM, since AMD sells ARM CPUs themselves as well.

They have ARM drivers for the SoC in the Switch, and possibly the enterprise stuff, but you can bet that they aren’t tuned for consumer GPUs and Vulkan, etc. - never mind Metal.

Linux != Mac.

I’m not saying it won’t happen, of course. My point, though, is that Thunderbolt (on ARM) is a thing, and Thunderbolt enclosures will be a thing - the limiting factor will be whether or not a macOS driver is available to drive the display.

That remains to be seen, but it would not surprise me if it goes either way, at least in the short term. Long term I think there will be macOS ARM drivers for the cards, just maybe not on day 1.
 

konqerror

macrumors 68020
Dec 31, 2013
2,298
3,701
They have ARM drivers for the SoC in the Switch, and possibly the enterprise stuff, but you can bet that they aren’t tuned for consumer GPUs and Vulkan, etc. - never mind Metal.

Nvidia has a full "consumer" implementation of Vulkan thanks to the Shield. Their SoCs use the same architecture as the desktop cards.

Nvidia has been designing and shipping their own ARM SoCs longer than Apple has. It's not anything foreign or special to them.
 

AdamSeen

macrumors 6502
Jun 5, 2013
350
423
I like Max Tech, but I think he’s wrong.

Killing off dGPUs isn’t going to happen anytime soon, no matter how efficient an integrated Apple GPU is. It will never have the thermal design power or die area to compete with separate GPUs, or offer a wider range of features such as larger memory capacities with HBM.

But performance will increase a lot for those who currently rely on Intel integrated GPUs.

I’d guess we will see dGPUs offered in the lineups we have now, i.e. the MBP 16 will have a separate GPU option.

As to whether there will be a separate Apple GPU, I’m less sure, but I would think it unlikely. It’s a lot of silicon fabbing to dedicate to something that’s only used in a minority of Macs. Without the shared CPU/GPU memory advantage, I’m not sure they’d be much better than AMD’s offerings.
 
  • Like
Reactions: Jorbanead

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Have you seen the recent series of YouTube videos from Max Tech?
Yes I have, but I do think it's wrong about the entire lineup.

There is plenty in Apple's own developer presentations at WWDC to indicate that they are looking at integrated GPUs on the SoC. There's even a slide with a table that excludes Nvidia & AMD GPUs from the "GPU" column.

I'm not saying that they will *never* build a dGPU, and this may be necessary for the Mac Pro (something modular similar to the AfterBurner card)
That's pretty much exactly what I'm getting at; we're in agreement here.

But the first Apple Silicon Macs are very unlikely to have a dGPU as we have today. It's possible that in the MBP16 / iMac the GPU is "on-package" but not included in the SoC, i.e. built into the same chip package but on a separate silicon die.
Looking back on my post, I think I wasn't communicating well and it sounded rude; I'm sorry.
But that scenario is the most likely, I think. A separate silicon die, at the very least, is my guess for their "pro" models.
Given that modern consoles (Xbox Series X & PS5) both have integrated GPUs (with the Xbox only 10% behind an Nvidia 2080 Ti at 12 TFLOPS), what is the technical limitation of producing a fast on-SoC GPU? Happy to be educated!
As impressive as the new consoles are (and I don't mean that sarcastically, they are very impressive), I doubt Apple's "Pro" machines are going to have on-SoC GPUs. The ones in the lineup that currently have integrated graphics are likely to remain so.

We have to recognize that some workloads require more power than even an excellent iGPU can provide. In addition, there are plenty of measurable benefits to having a separate GPU. The mere fact that dedicated GPUs still exist when integrated GPUs have been with us for nearly two decades supports this.

As for technical limitations, the number one thing that comes to my mind is heat. Having the CPU and GPU on separate dies allows heat to be spread over a larger area, and means that a workload on one won't slow down the other. Video encoding and neural processing are obvious examples.

The second technical limitation that I think we have to recognize is the size of the SoC itself.
The A12Z has a minuscule die size of 10.1 mm by 12.6 mm (about 127 mm²), and the fact that it performs so well and has as many features as it does in such a small package is mind-boggling for sure. That's about a third of the linear dimensions of, say, the i7-9700K's 37.5 mm-square package, and leaves a lot of growing room, since the A14 (and the assumed Mac derivative) are going to be on the smaller 5 nm node.

Now let's look at the latest high-end AMD desktop-class GPU, the 5700 XT. Its die sits at about 251 mm².
That's roughly double the area of the entire A12Z. Even if we give Apple some heavy leeway and assume they can make an equivalent GPU in half the size, folding it into the SoC still means the chip is gonna be far larger than it would otherwise be.

The Xbox One X's SoC even has a die size of over 360 mm², gargantuan even by the standards of Intel's heavyweight Xeon chips (whose package measures 76.16 mm × 56.6 mm for the 8380HL).

So what does all this mean?

When one manufactures microchips, there are always defects, and the larger the chip, the more likely a defect will show up on any given die. So if you make smaller chips, you have to throw fewer of them out.
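As a back-of-the-envelope illustration, this is the classic first-order Poisson yield model; the defect density below is a made-up figure, not a real foundry number:

Code:
import Foundation

// First-order Poisson yield model: yield = e^(-defectDensity * dieArea).
func yield(dieAreaMM2: Double, defectsPerMM2: Double) -> Double {
    exp(-defectsPerMM2 * dieAreaMM2)
}

let d0 = 0.001 // hypothetical defects per mm^2
print(yield(dieAreaMM2: 127, defectsPerMM2: d0)) // ~0.88, an A12Z-sized die
print(yield(dieAreaMM2: 360, defectsPerMM2: d0)) // ~0.70, a console-sized APU

Same (made-up) defect density, but the bigger die scraps roughly two and a half times as many chips.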

Following this logic, there's greater benefit to having a dedicated GPU chip and a dedicated CPU chip in their "Pro" machines.

With that said, I am in complete agreement that the first Apple Silicon Macs will not have dedicated GPUs, but by the time they get to Mac Pro and MacBook Pro level machines I'd say it's more likely than not that they will have dedicated GPUs.
 

awesomedeluxe

macrumors 6502
Jun 29, 2009
262
105
But the first Apple Silicon Macs are very unlikely to have a dGPU as we have today. It's possible that in the MBP16 / iMac the GPU is "on-package" but not included in the SoC, i.e. built into the same chip package but on a separate silicon die.

Given that modern consoles (Xbox Series X & PS5) both have integrated GPUs (with the Xbox only 10% behind an Nvidia 2080 Ti at 12 TFLOPS), what is the technical limitation of producing a fast on-SoC GPU? Happy to be educated!
A chiplet design would be acceptable. I think it might even be likely. The alternative is really sad.

If everything is a single-die APU... well, that really, really sucks for the MBP 16 and the iMac Pro. So much wasted potential. They'd probably end up with only something like 16 GPU cores, with Apple declaring performance victory over the Radeon 5500M.

The Mac Pro is even worse off, though, since in this timeline it doesn't even exist.
 

konqerror

macrumors 68020
Dec 31, 2013
2,298
3,701
Given that modern consoles (Xbox Series X & PS5) both have integrated GPUs (with the Xbox only 10% behind an Nvidia 2080 Ti at 12 TFLOPS), what is the technical limitation of producing a fast on-SoC GPU? Happy to be educated!

It generally isn't the GPU, it's the RAM bandwidth. The high-end consoles you refer to run their entire system off very power-hungry and expensive graphics RAM. (The current generation also has very mediocre netbook-class CPUs.) The One X has a system RAM bandwidth of 326 GB/s; compare this to a full desktop-class system, which has about 40 GB/s.
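To put those two numbers in perspective, a purely illustrative back-of-the-envelope (it ignores caching, compression, texture reads, and the CPU competing for the same bus):

Code:
// How many raw 4K framebuffer-sized transfers per second does a
// given memory bandwidth allow? Illustrative orders of magnitude only.
let bytesPerFrame = 3840.0 * 2160.0 * 4.0 // one 32-bit 4K surface, ~33 MB
for (name, gbPerSec) in [("desktop DDR4, ~40 GB/s", 40.0),
                         ("One X GDDR5, 326 GB/s", 326.0)] {
    let passes = gbPerSec * 1e9 / bytesPerFrame
    print("\(name): ~\(Int(passes)) full-screen passes/s")
}

A modern game touches each frame many times over (G-buffer, shadows, post-processing), and on an iGPU the CPU is drawing from that same 40 GB/s, so the budget disappears quickly.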

Many systems (Intel Iris Pro, Xbox One S) have tried to get around this with large caches, but the approach hasn't been that widespread, for whatever reason.
 

SecuritySteve

macrumors 6502a
Jul 6, 2017
949
1,082
California
Having an SoC makes sense for the following mac models:

Mac Mini
MacBook Air
MacBook (if it is continued)

It does not make sense for:

MacBook Pro
iMac
iMac Pro
Mac Pro

Note that the iMac is a non-pro consumer machine that greatly benefits from a discrete GPU. Many consumers care more about the GPU power of a machine than the CPU power, in both the Mac and non-Mac worlds. If Apple's ARM implementation kills off discrete GPUs, they are not cutting off a minority of their users, they are cutting off a majority.

Staying on topic: if Apple does not write drivers for AMD / NVIDIA GPUs on Apple ARM systems, there will be none. This is not supposition; it is the current state of affairs for GPU compatibility on Intel Macs. It is the reason NVIDIA cards do not work as eGPUs on macOS: NVIDIA would gladly supply web drivers if Apple would permit it, but Apple does not. If Apple likewise refuses to sign AMD drivers for ARM Macs, then those will not work in eGPUs either, regardless of Thunderbolt 3 / 4 support.
 

toke lahti

macrumors 68040
Original poster
Apr 23, 2007
3,293
509
Helsinki, Finland
If AS has 10x the efficiency of an Intel iGPU and 5x that of an AMD/Nvidia dGPU,
wouldn’t it be trivial to make an SoC for the iMac with a 100 W TDP, where 20 W goes to the CPU and 80 W to the GPU, making it equivalent to 400 watts of AMD/Nvidia GPU power?
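Spelling that arithmetic out (both multipliers are the question's assumptions, not measured figures):

Code:
let socTDP = 100.0               // hypothetical iMac SoC power budget, watts
let cpuShare = 20.0              // watts assumed for the CPU
let gpuShare = socTDP - cpuShare // 80 W left for the GPU
let assumedAdvantage = 5.0       // assumed efficiency vs AMD/Nvidia dGPUs
print(gpuShare * assumedAdvantage) // 400.0 "dGPU-equivalent" watts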
 

MarkC426

macrumors 68040
May 14, 2008
3,693
2,096
UK
It seems unlikely to me that Apple would do away with dGPUs at this juncture. It's possible that they start making their own dGPUs, though. But I think the first 16" MacBook Pro with Apple silicon will continue to have an AMD GPU.
Don’t they already make their own GPUs? All the MPX-module GPUs for the Mac Pro seem to be bespoke.
 