As mentioned in a certain video that brings a tincture of gaming hype to AS Macs: no support for dGPUs == no more eGPUs?
Really?
How sure can we be of this?
"Hardly any iOS users are using eGPUs with their devices."
I think the big mystery here is how Apple will (or won't) handle expandable graphics on the Mac Pro. It's highly possible that whatever they do there can be ported into a Thunderbolt 3/4/USB4 enclosure. For all we know, eGPUs for Intel Macs might still end up being compatible with Apple Silicon Macs (albeit with the latter utilizing them entirely differently). I'm not sure how likely it is, but as others have said, Apple put a lot of work into eGPU support, and having that effectively dead-ended a year and a half after unveiling seems odd (especially since Apple would've been planning the Apple Silicon transition while releasing Intel Mac eGPU support).
I would like to point out that "ARM Macs not having dedicated GPUs" is still just a rumor, and is based on shaky logic.
Let's apply some critical thinking here:
1. Integrated GPUs exist for current Apple processors because there's no reason to fit a dedicated GPU in an iPhone or iPad. For small devices like those, it makes sense to have the whole system on a single chip.
2. Making a HEDT-class processor with a HEDT-class GPU on the same die would be prohibitively expensive and would lower the performance ceiling of both the CPU and the GPU. That's not really an option for the "Pro"-level machines. It would be much easier, technically and financially, to have dedicated dies for a separate Apple CPU and GPU.
3. Dedicated GPUs are used in higher-end "Pro" Apple products such as the 27-inch iMac, the iMac Pro, the 16-inch MacBook Pro, and the Mac Pro. Unless Apple's willing to compromise the performance of all these products, a dedicated GPU is necessary.
No, Apple pretty much outright said that all SoCs will have an integrated GPU. While that doesn't necessarily preclude dGPU presence on the higher end, the way they said it heavily implies that we're likely not getting (at least traditional) dGPUs. We might see Afterburner-esque cards that add specialty co-processors to assist with graphical tasks the Apple Silicon SoC's integrated GPU calls on them to augment (ray tracing or real-time render assistance, just to throw out off-the-top-of-my-head examples). But traditional dGPUs are not coming along for the ride here. Luckily, it seems they really won't need to, so long as developers actually optimize their apps for Metal.
I like Max Tech, but I think he’s wrong.
Killing off dGPUs isn't going to happen anytime soon. No matter how efficient an integrated Apple GPU is, it will never have the thermal design power or chip space to compete with separate GPUs, or offer a wider range of features such as larger memory capacities with HBM memory.
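To put rough numbers on the memory point, here's a back-of-the-envelope comparison of peak bandwidth for one HBM2 stack versus a 128-bit LPDDR4X bus (the kind of memory an iPad-Pro-class SoC uses). The figures are published per-standard peak rates; pairing them with any particular Apple or AMD product is my assumption, purely for illustration:

```python
# Peak memory bandwidth = (bus width in bytes) x (transfer rate).
# HBM2 and LPDDR4X-4266 figures are standard peak rates; the pairing
# with any particular product is illustrative only.

def peak_bandwidth_gbs(bus_width_bits, gigatransfers_per_sec):
    """Peak bandwidth in GB/s."""
    return bus_width_bits / 8 * gigatransfers_per_sec

hbm2_stack = peak_bandwidth_gbs(1024, 2.0)   # one HBM2 stack: 256 GB/s
lpddr4x = peak_bandwidth_gbs(128, 4.266)     # 128-bit LPDDR4X-4266: ~68 GB/s

print(f"one HBM2 stack:        {hbm2_stack:.0f} GB/s")
print(f"128-bit LPDDR4X:       {lpddr4x:.0f} GB/s")
print(f"four HBM2 stacks:      {4 * hbm2_stack:.0f} GB/s")
```

A Radeon VII carries four HBM2 stacks, about 1 TB/s total; closing that kind of gap with SoC-package memory is exactly the chip-space and TDP problem described above.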
But performance will increase a lot for machines that currently rely on Intel integrated graphics.
I’d guess we will see dGPUs offered in the lineups we have now, i.e. the 16-inch MBP will have a separate GPU option.
As to whether there will be a separate Apple GPU, I’m less sure, but I would think it unlikely. It’s a lot of silicon fabbing to dedicate to something that’s only used in a minority of Macs. Without the shared CPU/GPU memory advantage, I’m not sure they’d be much better than AMD’s offerings.
Comparing the AMD dGPU in the Intel 16" MacBook Pro and the iGPU in the Apple Silicon SoC that's likely to appear in the Apple Silicon 16" MacBook Pro replacement is an apples-to-oranges comparison. Those GPUs do not process or render graphics in the same way at all, and there are enough WWDC 2020 videos/articles detailing that if you don't believe me. These are much more efficient iGPUs than Intel's ever were, and they're going to be able to do the kind of work you'd see on an AMD dGPU-based Mac. For those not optimizing for these new GPUs, yes, it will appear as though AMD's offerings are better; the trick is to optimize for Metal. But if developers can do that (and the important ones will), then it'll still be an upgrade as far as the end result.
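One way to see why the two designs aren't directly comparable is a toy model of shaded-pixel writes: an immediate-mode renderer can end up shading every overlapping fragment (worst case, drawing back to front), while a tile-based deferred renderer (TBDR, the Apple GPU design) resolves visibility per tile first and shades each covered pixel once. The resolution is real; the overdraw figure is an assumption, and this is an illustration, not a benchmark:

```python
# Toy model of overdraw cost: immediate-mode rendering (IMR) versus
# tile-based deferred rendering (TBDR). Purely illustrative.

def imr_shaded_writes(pixels, overdraw):
    # Worst case for an immediate-mode GPU: geometry arrives back to
    # front, so every overlapping fragment passes the depth test and
    # gets shaded and written out. (Early-Z and front-to-back sorting
    # mitigate this in practice, but can't always be relied on.)
    return pixels * overdraw

def tbdr_shaded_writes(pixels, overdraw):
    # A TBDR GPU bins geometry into tiles, resolves visibility in
    # fast on-chip tile memory, then shades only the front-most
    # fragment of each pixel before writing the tile out once.
    return pixels

pixels = 3072 * 1920   # 16-inch MacBook Pro native resolution
overdraw = 4           # assumed average overlapping fragments per pixel

print(f"IMR worst-case shaded writes: {imr_shaded_writes(pixels, overdraw):,}")
print(f"TBDR shaded writes:           {tbdr_shaded_writes(pixels, overdraw):,}")
```

At 4x overdraw the TBDR path does a quarter of the shading and memory traffic, which is the efficiency the WWDC material keeps pointing at, and why raw spec-sheet comparisons against AMD parts mislead.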
Don’t they already make their own GPUs? All the MPX-module GPUs for the Mac Pro seem to be bespoke.
Apple made the cards; AMD still made the GPUs. Apple did modify the reference designs heavily, but it's still an AMD GPU under the hood.
I posted my thoughts about this a few weeks back. Just going to copy it here:
I felt reasonably certain that the recency of Apple’s initial rollout of eGPU support was an indication that it would not be phased out just two years later, and became even more confident with Apple’s reiterated support for Thunderbolt alongside USB4 earlier this week.
There have also been a number of recent articles about Apple’s ongoing development of VR/AR headsets.
https://www.theverge.com/2020/6/19/...-external-hub-jony-ive-bloomberg-go-read-this
Maybe there will be an additional widget to provide the necessary GPU horsepower, maybe not, but Apple obviously sees VR/AR as a big item on the horizon of computing.
Fair to say that not very many people are using VR headsets today, and most of those who are, are doing it for gaming. How much Mac-native VR is going on? Mac VR gaming? Not a lot...pretty niche.
Now go look at the Apple eGPU support page.
https://support.apple.com/en-us/HT208544
First sentence of the article mentions VR. Interesting for such a niche use case to feature so prominently. And what are all the wonderful things an eGPU is good for with an Apple computer?
(1) Making applications run faster. A very mainstream reason to use an eGPU.
(2) Adding additional monitors and displays. Another super common reason to use an eGPU.
(3) Using a virtual reality headset.
...
I think the eGPU page is a dead giveaway that Apple has something planned for VR, and an acknowledgement that they need a way to provide the necessary graphical capability to MacBook and iMac purchasers who don’t have that internal GPU capability.
Thank you for coming to my TED talk.
I agree that Apple wouldn't have put the work into eGPU support (concurrent with Apple Silicon Mac transition planning) only for it to be applicable to Intel Macs alone. However, I think there's a ton about this that simply isn't clear yet (and won't be clear until Apple releases their first Apple Silicon Macs and the slew of documentation to follow them).
In a pro workstation, one big CPU/GPU chip is bad all around:
First, people who really need a lot of CPU power will be stuck paying for enough GPU power to drive four screens at 8K.
Second, people who really need to drive a lot of screens or GPU power won't get it, due to heat, die-space, or RAM limits, and may be forced to use USB-based video docks.
I'm not sure what data you have to make such a claim. Plus, it's likely that they will expand the Afterburner family to include cards that handle all sorts of extended/expanded video processing and post-processing effects, in the same way that the T2 has offloaded much of that from Apple GPUs. Hell, on the new 27" iMac, Apple offers a variant of Intel processor that doesn't come with an Intel iGP at all (because everything the Intel iGP actually did stellar work on, separate from the AMD GPU, is performed even better by the T2). It's very possible that we'll see additional graphics-assistant co-processor offerings from Apple to offset whatever they don't put into a workstation chip. That being said, there's a ton we don't know about whatever iPad-Pro-topping power they'll put into this new family of SoCs designed specifically for the Mac. Apple wouldn't be doing this transition if they were only capable of having it result in better lower-end Macs (Mac mini, MacBook Air, 13" MacBook Pro, 21.5" iMac).
I think this is right, and that folks are vastly overestimating how easy it will be for Apple to reach the performance found in the top available GPUs in the iMac, MacBook Pro, and Mac Pro. Do we really think Apple is going to develop an in-house Radeon VII equivalent to sell 10,000 units per year in a future Mac Pro refresh?
It also aligns with Apple’s recent support for eGPUs, which was only rolled out 2 years ago. Apple are going to sell the super skinny computers they’ve always wanted, and the eGPU will be just one more classic Apple dongle-based solution.
We’ll find out in short order, I suppose.
Again, I agree with sentiments suggesting that Apple didn't roll out eGPU support only to limit it to the three years' worth of systems that preceded the rollout and the year and a half's worth that followed it before the final round of Intel-based Macs (especially on the low end). That said, I think you're underestimating what they'll be able to do with their GPUs and with software optimized for their frameworks.
Apple has been the one writing the drivers for AMD chips on the Mac for many years now, and goes as far as to write the firmware as well.
That is the reason why Apple doesn’t use Nvidia parts today. Nvidia wouldn’t allow Apple to go as deep into these things.
Again, do you have sources for this? Would love to see them.
Hardware -> Apple can win here;
Marketing Budget -> Apple can win here;
Games (Developers) -> Get the first two right and this one should take care of itself.
That argument is flawed, and evidence of that is present in the Apple TV's lack of gaming titles (considering it is certainly equipped with console-level graphics), as well as the Mac's current struggles to get ports of popular PC/Xbox/PS4 titles. Hell, Catalina's Culling(tm) resulted in Mac gaming being effectively cut in half going into this transition, so if it wasn't already unappealing for developers to release AAA titles for the Mac, it's about to be made worse by moving away from the x86 architecture and the kinds of GPUs Intel Macs have had (despite the design of Apple Silicon GPUs being way more efficient). Game devs just aren't embracing the Mac or the Apple TV.
Well,
the mini (2006-7) had an iGPU,
the mini (2009) had the 9400M,
the mini (2010) had the 320M,
the mini (2011) had an optional 6630M.
The 9400M and the 320M were both iGPUs that also doubled as the logic board's chipset. They shared VRAM with the system just like the Intel iGPUs did. The only difference is that, relative to Intel iGPUs and the lower-end dGPUs present in the 15" and 17" MacBook Pros of the time, they didn't suck as much.
As for the 6630M, yes, it technically was a dGPU. It was an incredibly weak one that was subject to frequent failure (much like most of the ATI Radeon HD 5xxx and AMD Radeon HD 6xxx/M series in Macs of that era).