
goMac

macrumors 604
Apr 15, 2004
7,663
1,694
According to one of the slides, Apple is also dumping AMD for graphics.

There is no slide affirmatively saying that Apple is dumping AMD graphics.

There is also a slide saying that developers should still check for AMD graphics on ARM Macs. And there is more documentation beyond that talking about discrete GPUs on ARM Macs.

People have been running around with that slide drawing wild conclusions when I don’t think there is enough evidence either way yet. If Apple is dropping AMD, then why should developers be checking for non-Apple graphics on ARM? Who’s making these discrete GPUs that developers should be checking for? Why does the documentation still show eGPUs as supported? I don’t know the answers either, but it seems way too early to make assumptions.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
According to one of the slides, Apple is also dumping AMD for graphics.

Apple didn't say that in that slide. That is more likely folks on the outside reading things into the slide.

On the first iteration of Apple SoCs ( A12Z -> "Mac-14-whatever" ) there is probably only an iGPU option. If that is mostly all that is going to exist until Feb-June 2021, there pragmatically won't be any opening for an AMD/Intel GPU to get into the picture. ( There was hype about a high-end iMac on ARM coming "real soon" that has gone pretty quiet now. ) If the first mover is a bottom-"half" laptop solution ( MacBook / MBA / two-port MBP ), then it is Intel that is the primary "dumped" GPU vendor. Intel, not AMD, is the #1 Mac GPU vendor.


Most likely that wasn't the WWDC "forever" slide. It was the WWDC "right now" slide. At WWDC 2021 they can put up another one that is more filled in.
I think AMD is now on notice, but if they can deliver with their next generation GPUs they’ll probably continue to be a part of the Mac.

I would guess Apple’s first desktop SoCs will max out at around 10 teraflops of GPU performance. If AMD can beat that, Apple isn’t going to have a huge reason to do their own discrete GPUs.

If AMD doesn't come down the performance-per-watt curve a bit, it isn't just Apple that is a problem. AMD's rival Nvidia will be a relatively bigger problem.
AMD can get over the 10 TFLOPS hurdle. The 5700M is at around 8 TFLOPS now.


Some optimizations and a bit more power and they could get past that with pretty close to the tech they have now.
The bigger issue is what TDP budget it is going to get. AMD is definitely on notice of getting squeezed out of the MBP 16". If Apple can 'charge up the hill' to 10 TFLOPS, then the 21"-24" iMac isn't on firm ground either.

[ Decent chance it isn't just TFLOPS though. If there are 4 TB4 ports and a need to support driving 5 (or more) 4K screens at a time, then I'm not so sure Apple is going to throw that kind of transistor budget at their iGPUs. ]


If Apple is enthusiastically supporting TB4 though, then the TDP budget inside the enclosure that Apple ships isn't the total story. However, if it got down to the point that only the 27" iMacs ( regular and Pro) and Mac Pro were the only possible place for AMD to get a 'win' then it wouldn't be just a unilateral Apple putting AMD on notice. That would probably pragmatically end up with notices traveling in both directions.


Apple's WWDC 2020 slides didn't mention TB4, but when TB4 was more formally announced last week, Apple was 'there'.
 
Last edited:

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
Which presentation is that one in? I didn’t see that, but I did watch a good chunk of them just because I was curious.

Around 14 minutes in "Bring your Metal app to Apple Silicon Macs" they talk about not relying on Apple GPU features being present on a GPU, and specifically checking to see if a GPU is an Apple Silicon GPU (self.appleGPUFeatures = metalDevice.supportsFamily(.apple5)).

There are a lot of things that could be implied here. It could imply that AMD is sticking around. It could imply that Apple is doing their own discrete GPUs, so you shouldn't rely on specific behaviors. But it is Apple directly saying don't assume that because you are on ARM you are rendering on an Apple SoC GPU.

They also show a low power vs high power GPU check which again, implies the existence of multiple GPUs. Apple SoC is always low power, so what's high power?
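The check described above can be sketched roughly as follows. This is only my illustration of the advice in the session, not Apple sample code; `supportsFamily(.apple5)` is the call shown on the slide, while the fallback structure and the extra property reads are my own additions:

```swift
import Metal

// Enumerate every GPU visible to Metal on this Mac. On Apple Silicon
// today that is just the SoC GPU, but the API makes no such promise.
for device in MTLCopyAllDevices() {
    // The slide's check: is this an Apple-designed GPU (A12-class or
    // later)? A non-Apple discrete GPU or eGPU would fail this even
    // on an ARM Mac, which is exactly goMac's point.
    let isAppleGPU = device.supportsFamily(.apple5)

    // Per the session, an Apple SoC GPU reports isLowPower == false,
    // because its performance characteristics are in line with
    // discrete GPUs rather than integrated ones.
    print(device.name, isAppleGPU, device.isLowPower, device.isRemovable)

    if isAppleGPU {
        // Take the TBDR / unified-memory fast path.
    } else {
        // Fall back to the traditional rendering path.
    }
}
```

The design point of contention in the thread is precisely that this loop is written over *all* devices rather than assuming a single Apple GPU.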
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,174
Stargate Command
Around 14 minutes in "Bring your Metal app to Apple Silicon Macs" they talk about not relying on Apple GPU features being present on a GPU, and specifically checking to see if a GPU is an Apple Silicon GPU (self.appleGPUFeatures = metalDevice.supportsFamily(.apple5)).

There are a lot of things that could be implied here. It could imply that AMD is sticking around. It could imply that Apple is doing their own discrete GPUs, so you shouldn't rely on specific behaviors. But it is Apple directly saying don't assume that because you are on ARM you are rendering on an Apple SoC GPU.

They also show a low power vs high power GPU check which again, implies the existence of multiple GPUs. Apple SoC is always low power, so what's high power?

I would say Apple SoC has been low power up until now, but who knows what the future holds for high-end (Mac Pro line-up) Apple Silicon?
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,612
8,636
they talk about not relying on Apple GPU features being present on a GPU
Found the video after I asked as it was one of the few I’d downloaded for additional viewing. :) When I viewed that section, it was more about how to properly detect Metal features on the Apple Silicon Macs. If you’d used GPU names previously, that’s no longer future proof.

As Apple GPU features will ALWAYS be available on Silicon Macs, that’s not really ‘checking for AMD’. EDIT: Wanted to clarify that it seems to mean “you won’t ever care if there’s an AMD GPU or not”.
Apple SoC is always low power, so what's high power?
Apple SoC returns “false” to metaldevice.isLowPower because, according to the video, “this is because the performance characteristics of Apple GPUs are in line with discrete ones, not the integrated ones”
 
Last edited:

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
As Apple GPU features will ALWAYS be available on Silicon Macs, that’s not really ‘checking for AMD’. EDIT: Wanted to clarify that it seems to mean “you won’t ever care if there’s an AMD GPU or not”.

That's not at all what they advised. They advised that on ARM you need to check the GPU feature set to see if it is an Apple GPU, and fall back to a traditional rendering path otherwise. It's left undefined when that might happen, but it's pretty clear that they're advising developers not to assume your rendering device is always an Apple SoC GPU.

They literally said the exact opposite of "Apple GPU features will ALWAYS be available". There might always be an Apple GPU present, sure. But not every GPU on the system is guaranteed to be an Apple SOC GPU.

In fact, the wording around that is even funny. The Apple GPU family is implied to only refer to Apple SoC GPUs, while there is still a "Mac" family available on ARM that covers both traditional and Apple SoC GPUs. It almost implies that Apple won't ship a discrete GPU product, even though discrete GPUs have a giant blank space left for them in the guidance and the API. Which implies GPUs from someone other than Apple.

Again, just lots of blank spaces that Apple declined to fill in beyond saying to remember to step over them. It may be Apple didn't feel like it was important to emphasize at all because people are still developing for AMD GPUs on Intel, and that code will just work on AMD GPUs on ARM. Why focus on it when the point is to get people excited for single address space and tiled rendering?

Apple SoC returns “false” to metaldevice.isLowPower because, according to the video, “this is because the performance characteristics of Apple GPUs are in line with discrete ones, not the integrated ones”

Ah, I got confused about whether discrete or isLowPower is what they changed. I guess the Apple GPU returns no for discrete but also no for isLowPower.

Which still makes you wonder why Metal makes discrete GPUs a check on ARM platforms if discrete will never be an option. They certainly don't seem to be actively dropping support for discrete GPUs on ARM. Even if they're not going to start with discrete GPUs, they're definitely leaving themselves the option.
 
Last edited:

Unregistered 4U

macrumors G4
Jul 22, 2002
10,612
8,636
Which still makes you wonder why Metal makes discrete GPUs a check on ARM platforms if discrete will never be an option.
I think that while the video is “Bring your Metal app to Apple Silicon Macs”, it’s providing information on how to do so in a way that will not break your app when building for Intel (since universal apps will be a thing). Regarding “which GPU”, as far as Metal is concerned, it wants to know if you have an Apple GPU (appleGPUFeatures), and my assumption is it’ll be “true” for Apple Silicon (and iOS, iPadOS) and false for anything else. I’m assuming there are more Metal features than just the three shown in the video; guess I’ll go check those out just to see what they are.

Once we see the performance of Apple’s first gen Apple Silicon graphics, it should be clear if they ever intend to utilize discrete GPU’s on ARM.

heh, cue lawsuit from Silicon Graphics LOL
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
I think that while the video is “Bring your Metal app to Apple Silicon Macs”, it’s providing information on how to do so in a way that will not break your app when building for Intel (since universal apps will be a thing). Regarding “which GPU”, as far as Metal is concerned, it wants to know if you have an Apple GPU (appleGPUFeatures), and my assumption is it’ll be “true” for Apple Silicon (and iOS, iPadOS) and false for anything else. I’m assuming there are more Metal features than just the three shown in the video; guess I’ll go check those out just to see what they are.

Again, the point kind of is that Apple told people to check the GPU family, not the host CPU type. They didn't say "If you're on ARM assume you have an Apple SOC." They said "always check to see if it's an Apple SOC."

We're kind of just going in circles, but I'm just saying that was their chance to say "If you're on ARM you can make assumption X" and that's not what they did.
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,612
8,636
We're kind of just going in circles, but I'm just saying that was their chance to say "If you're on ARM you can make assumption X" and that's not what they did.
Pretty much, as long as there are blanks in what’s been communicated, it’s possible to speculate on AMD, NVIDIA, Intel, PowerVR, or any other GPU (or no discrete GPU). I’m pretty sure there are still folks speculating about an AMD CPU switch as Apple hasn’t explicitly said they’re NOT going to use AMD. :)

From 4:38...
“Let me highlight some of the major changes from the Intel-based Mac to Apple Silicon Mac when it comes to the GPU. Apple Silicon Mac contains an Apple designed GPU, whereas Intel-based Macs contain GPU’s from Intel, AMD, and NVIDIA.”
 

goMac

macrumors 604
Apr 15, 2004
7,663
1,694
From 4:38...
“Let me highlight some of the major changes from the Intel-based Mac to Apple Silicon Mac when it comes to the GPU. Apple Silicon Mac contains an Apple designed GPU, whereas Intel-based Macs contain GPU’s from Intel, AMD, and NVIDIA.”

So again, there are two large gaps with this quote:
- Apple doesn't say an Apple Silicon Mac will _only_ work with an Apple GPU
- They also tell developers to check to see if a GPU is an Apple GPU before they use it

Again, maybe they do end up with every GPU being an Apple GPU. But they're leaving themselves a lot of room to do whatever they want. When they show demo code on how to check to see if a GPU is an Apple GPU when you're on Apple Silicon, that's a big hint that you might run into a non-Apple GPU on Apple silicon. Apple hasn't closed that door at all.
 

iindigo

macrumors 6502a
Jul 22, 2002
772
43
San Francisco, CA
At the very minimum, I would expect any GPUs that work as of the last Intel macOS release to continue to work on ARM Macs for many years to come, even if they're hooked up as eGPUs. I don't think there's a whole lot of architecture-dependent code in GPU drivers, so as long as supporting them under ARM is only a little more involved than ticking a box, why shut them out?

Of course that doesn't say anything for new GPUs, but if current gen and next gen Navi works on ARM Macs that'll probably be enough to tide a lot of people over until Apple has GPUs powerful enough to replace them, especially for ARM Macs that support TB4 (which will fix the bottlenecking issues that limit current eGPU usefulness).
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,612
8,636
But they're leaving themselves a lot of room to do whatever they want.
Oh, they’ll likely never produce any documentation that says 100% one way or the other. Even if they redo the entire line and they’re all Apple GPU’s that STILL doesn’t say that an AMD GPU is NOT potentially in the future. The only time we can know about AMD GPU’s for 100% certain is, say in 20 years, when they’re switching to the next thing, if in all that time, they only produced Apple Silicon systems with Apple GPU’s.
 

Unregistered 4U

macrumors G4
Jul 22, 2002
10,612
8,636
but who knows what the future holds for high-end (Mac Pro line-up) Apple Silicon?
From Apple’s video, “Despite the property name though, Apple GPU’s are also way, way more power efficient than both integrated and discrete GPU’s.“ So, as far as what developers are being told right now, the Apple Silicon GPU’s are expected to be more power efficient than even current integrated GPU’s.

For anyone wanting another view into what Apple currently considers a “Silicon Mac” versus an Intel-Mac, check out “Explore the new system architecture of Apple Silicon Macs” starting at 40 seconds.
https://developer.apple.com/wwdc20/10686
 

ThunderSkunk

macrumors 601
Dec 31, 2007
4,075
4,562
Milwaukee Area
I would say that five years after last sale is about as long as you can expect any sort of support to persist. If you figure that Apple will take about a year or so to replace the Macintel Pro with an ARMed Mac Pro -- say, mid 2021 -- then you can expect Macintel products to be supported until about mid 2026.

As with all Macs, the hardware outlives the software. The 17" MBP I'm writing this on is 11 years old, and shipped with Snow Leopard. I recently reinstalled SL to reminisce, and discovered how much of the OS no longer functions. Of course Apple only supports their own software for a handful of years. Anything more would be crazy. Luckily, long after OS X self-destructs and parts of it cease to function, your Intel Mac still has a decade of use left in it, since Windows runs on a Mac perfectly fine. Thanks, Microsoft, for making our Macs worth the money, I guess.
 

zephonic

macrumors 65816
Feb 7, 2011
1,314
709
greater L.A. area
As with all Macs, the hardware outlives the software. The 17" MBP I'm writing this on is 11 years old, and shipped with Snow Leopard. I recently reinstalled SL to reminisce, and discovered how much of the OS no longer functions. Of course Apple only supports their own software for a handful of years. Anything more would be crazy. Luckily, long after OS X self-destructs and parts of it cease to function, your Intel Mac still has a decade of use left in it, since Windows runs on a Mac perfectly fine. Thanks, Microsoft, for making our Macs worth the money, I guess.

This. Times. Thousand.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Thunderbolt 4 still only has 32Gbps available bandwidth from four PCIe lanes (v3).

Not really. Thunderbolt sets the minimum at 32Gbps. TB v3 would certify an implementation with just a x2 PCIe v3 feed as being in "complete and full compliance". That wasn't what the v3 controllers could actually do; that was the 'hurdle' height they had to jump over to get officially certified.

The mobile Ice Lake chips, with Thunderbolt integrated into the CPU package, already potentially do over 32Gbps in aggregate over separate chains in a full 4-port implementation.

[ slide: AnandTech "Blueprint Series" Ice Lake Thunderbolt presentation, May 2019 ]

[ Not necessarily getting x16 PCIe and DisplayPort out of the backhaul fabric, but not necessarily limited to just x4 ( 32Gb/s ) either. ]
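For reference, the familiar "32Gbps" figure for an x4 PCIe v3 feed falls straight out of the link arithmetic. A back-of-the-envelope sketch (spec numbers, nothing vendor-specific):

```swift
// PCIe 3.0: 8 GT/s raw per lane, 128b/130b line encoding.
let rawGTps = 8.0
let encodingEfficiency = 128.0 / 130.0
let lanes = 4.0

let usableGbps = rawGTps * encodingEfficiency * lanes
print(usableGbps)   // ~31.5 Gb/s, i.e. the nominal "32 Gbps" x4 feed

// An x2 feed, as TB v3 certification minimally allowed, halves that:
print(rawGTps * encodingEfficiency * 2.0)   // ~15.8 Gb/s
```

Which is why "four PCIe lanes (v3)" and "32Gbps" get used interchangeably, even though the certification floor and what controllers actually deliver are different things.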


The upcoming Tiger Lake probably doesn't back off from the above minimum allocations at all.


None of that "minimums uplift" really has much to do with Apple's usage. There was an early 4-port MBP 13" where Apple gimped one of the TB controller feeds, but they have generally done the full TB v4-class minimum bandwidth allocation all along ( same with the vt-d protections over recent years, the DP 4K streams, etc. ). The bulk of the 'new' TB v4 minimums is what Apple was already doing. So it's not going to be a big leap for them at all to get to TB v4 if they stick with Intel TB controllers. ( If they try to jump ship to someone else in the next 12-15 months, then I'd be skeptical that is actually in compliance. 3rd parties are likely aiming at 'shorter', easier-to-clear USB 4 minimums, and perhaps TB v3 minimums. ) [ There was a reason that Intel set the TB v3 bar lower; more than a few system vendors wanted lower bars. Same reason USB 4 has so many 'loopholes' on the harder-to-do higher-speed options. ]

TB v4 is about evening out the exceptions. The two ports on the left should be expected to do what the two ports on the right can do. Either provide the full x4 PCIe provision allocation or be out of the certification. ( They can slap a USB 4 label on it that is looser/weaker. )
 
Last edited:

haralds

macrumors 68030
Jan 3, 2014
2,993
1,259
Silicon Valley, CA
This thread should really be entitled [...] switch to Apple Silicon based processors [...]

Apple does not use off-the-shelf ARM cores. They license the instruction set and build custom SoCs that far extend the functionality, with multiple cores geared for different optimizations (power/performance), integrated GPUs, neural engines, caches, peripheral management and WHATEVER ELSE THEY NEED. We are talking about deep integration with end use in mind. There is nothing general purpose about what they do.

Rene Ritchie captures this really well here.

This is also the reason why macOS Big Sur is 11.0 after OS X's 20-year run. It is the takeoff point for new sets of functionality and performance not possible on any other platform.

It is also the reason why only Apple virtualization will work on these machines. Major parts of the hardware interface will be private. The hardware architecture is not like a standard PC, and this is likely the beginning of the end for the whole Windows ecosystem. Lenovo et al. will have to make deep changes to compete. Windows is not a huge contributor to Microsoft's profit, and I am not sure they will invest to lead in this area. Apps drive the desktop profits, and MS will be happy to support whatever does well in that area. I would look towards Linux to start supporting more radical architecture changes in Mac competition.

Apple's hypervisor will emulate a standard hardware interface to hosted OSs and perhaps support future Linux evolutions.

On a Mac Pro style machine, there will likely still be support for certain standard buses like PCIe. Hardware will likely require custom drivers if it does not fall into a general class of devices or supported peripheral chips.

For this reason I suspect that the current Mac Pro is a dead end. The Mac Pro of the future will likely ship in a year or two and be a completely different animal than what we know now. It will share a lot of technologies with mobile, but in a more powerful expression adding modular hardware add ons including non Apple GPUs for special purpose uses. It will look like the current machine, but have very different guts.

Apple has been planning this for years. Dropping x86 Universals last year to make space for x64/ARM ones was the start of the count down.

Once we are down this road we will never look back, unless we are serious Windows users only part-timing on macOS. But as I mentioned above that universe will also be forced to change.

I should note that professionally I manage a team doing cross platform development for Windows, macOS, Android, and iOS. Most are using Macs. This will not be convenient, unless Microsoft gets on board and supports the new Hypervisor and fully supports Visual Studio for cross development from ARM to x64. Even with that we will need Wintel hardware for testing in R&D. Our QA group runs testing on native devices.

I have also been babying my beloved 5,1, just recently adding an RX-580 with flashed EFI. I am running Catalina full time and brought up Big Sur. I have VMware and Parallels with many VMs running comfortably on the dual Xeons with 48GB of RAM.

But when they come out with a good Apple Silicon MacBook Pro later this year, I will be on board. A fresh wind is blowing! The Mac Pro will likely be retired like my old 2,1 still sitting in the loft.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Apple has been planning this for years. Dropping x86 Universals last year to make space for x64/ARM ones was the start of the count down.

Once we are down this road we will never look back, unless we are serious Windows users only part-timing on macOS. But as I mentioned above that universe will also be forced to change.
I agree with this. I think the entire industry is gonna watch closely what happens with Apple Silicon Macs, and if they're as successful as Apple intends, then I think the industry itself is gonna start to shift. It's hard to see right now because we're 15 years into total x86 dominance, but at this point I think the writing's on the wall.

It's not just Apple, I'm sure other CPU and SoC makers are gonna smell blood in the water too. I'd bet Qualcomm, Samsung, Google, Amazon, NVidia, etc. etc. are going to start going for desktop-class processors should Apple pull off the switch. We're just waiting for the race to start.
 