
AgentMcGeek

macrumors 6502
Jan 18, 2016
374
305
London, UK
Yes and no. It is a link inside the SoC, so technically yes. Is it useful for anything other than that specific SoC-internal controller? Pragmatically, no.

Thunderbolt doesn't "ride on" PCIe. It encapsulates the data and transports the PCIe protocol. It is a substantive transport in and of itself. (It is not "external PCIe". That is just an inaccurate connotation that gets regurgitated on these forums as if it were true; it isn't.)
Sad thing is, with the impressive GPU power of the M series, I’m not sure the eGPU market will survive. It was already niche when you could only have Intel Iris on 13" models, but now…
 

CWallace

macrumors G5
Aug 17, 2007
12,528
11,545
Seattle, WA
Will this (M1X) get us back to being able to use 3 or 4 external 5K displays, do you think?

I believe we will.

We know from the leaked schematics that the MBP will have three USB/TB ports, so we will see at least three TB controllers (Apple's current TB controller supports one port), plus an HDMI port, which can support 5K (@30Hz for HDMI 2.0 and @85/100Hz for HDMI 2.1).
 
  • Love
Reactions: macsplusmacs

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Sad thing is, with the impressive GPU power of the M series, I’m not sure the eGPU market will survive. It was already niche when you could only have Intel Iris on 13" models, but now…

The Iris Pro use case was more likely the niche. I know there were articles like this one:


DaVinci Resolve is a primary benchmark there. In the narrower sense, yes: demand from the Windows PC workstation/gamer laptops with "bigger GPU than Apple will touch" configurations will actually drop. And the "come home and dock to game" market? Yes, that will sag.

But the Blackmagic eGPU wasn't aimed at the "come home and game" crowd at all, and it wasn't priced that way either. From Apple's "marketing" page about the device:

"...added graphics power for pro app workflows, gaming, VR creation, and more. ... "
https://support.apple.com/en-us/HT208897

Driving VR could subside along with the gaming, but the "and more" (AI/ML training, etc.) has grown. Apple's FCP is still a bit lame on GPU scaling.


However, I think this is more indicative of a higher fraction of the use cases (dual, or more, GPUs).


A substantive part of the market, though, was folks who had one decent GPU (not impoverished Iris) and wanted to get to two. That isn't driven by "gaming" at all. It is based on getting computational workloads done.

Is Apple going to beat triple 3090 scores with their iGPU? Probably not.


Or in the macOS space. (The baseline here is an iMac Pro with a Vega 64 ... Intel Iris nowhere in sight.)


The other aspect is that even if Apple's GPU gets within the same zip code as a 3080 Ti - 3090 ... what is the balance going to be in 2-3 years against a 4090 or 5090? They're competing over years. (And yes, the fixed-GPU Blackmagic eGPU runs into the same early obsolescence as a compute device, but Sonnet Tech's 500-600W boxes with just slots won't. The latter were the dominant share of sales for "eGPU" solutions.)

The other issue is that external Thunderbolt PCIe slot boxes aren't going anywhere on Windows. Apple isn't the primary driver now, and they will be even less so as long as they fumble and bumble on driver support on the macOS M-series side. But it is quite doubtful the market is going to implode on the macOS Intel side. More than a few folks will slide an RX 6800 or RX 6900 into their box once the crypto and chip shortages ease that huge detachment from MSRP prices.

IMHO, eGPU across all the macOS instances will likely tread water. Some folks leaving for a single, better Apple GPU; other folks extending life on macOS Intel with the leading available AMD RDNA2 and RDNA3 cards. (Nvidia ones also, for those who are dual-booting into Windows to get a substantive fraction of work done. There are folks with Nvidia GPU(s) in their Mac Pro 2019 for bigger GPGPU horsepower outside of macOS.)
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
I believe we will.

We know from the leaked schematics that the MBP will have three USB/TB ports, so we will see at least three TB controllers (Apple's current TB controller supports one port), plus an HDMI port, which can support 5K (@30Hz for HDMI 2.0 and @85/100Hz for HDMI 2.1).

The number of Thunderbolt and HDMI ports doesn't mean all of those ports have access to independent DisplayPort output streams.

The M1 Mini is stuck with a lower limit on the two TB ports it has.

MPX cards have 4 TB ports and one HDMI port, but one of those TB pairs will lose a video stream if you hook up the HDMI port. The Mac Pro 2013 has a similar issue: use the HDMI port and lose a stream out of a TB port.

What matters is whether DisplayPort switching is being used to feed the ports and how many feeds go into the switch. That is one reason why Apple's systems don't qualify for TB4: they are short on DP streams.

The GPU of the bigger M1 die is bigger, though. So there is a decent chance Apple will add another display controller (or two for the max-sized one).
 

Joelist

macrumors 6502
Jan 28, 2014
463
373
Illinois
Kinda makes me wonder if stuff like Stockfish is just hitting the GPU cores or if it is going to the ML block (which is where you would expect stuff like that to go). In other words, do programs like Stockfish actually take advantage of the M1 being an SoC?
 
  • Like
Reactions: Appletoni

GubbyMan

macrumors 6502
Apr 3, 2011
448
2,095
Stockfish is just a regular chess engine that works on the CPU. It's not a neural network, nor does it have any SIMD processing as far as I know. Of course, we now have AlphaZero and MuZero that are superior to Stockfish, but those are not available to anyone outside DeepMind yet.
 
  • Like
Reactions: Appletoni

l0stl0rd

macrumors 6502
Jul 25, 2009
483
420
I would nearly bet the next chip will be a new architecture.

The M1 feels like the A14X.

Arm publicly released v9 in April; not sure how relevant that actually is for Apple.

But who knows…
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
I would nearly bet the next chip will be a new architecture.

The M1 feels like the A14X.

Arm publicly released v9 in April; not sure how relevant that actually is for Apple.

But who knows…

Most of Arm v9 is optional v8 stuff that Apple has already covered. The first version of Arm v9 is largely a bunch of things that just are not optional anymore. Another chunk is dumping 32-bit (for some target cores, but not all). Apple put that in the rearview mirror a couple of iterations ago; the M1 doesn't have it.

The A15 is more likely to have iterated on this front. Likely a second, follow-on M2 (the same A14X-sized die as the M1). It's kind of hard for the M1-large die to be "old" and due for retirement when Apple hasn't even shipped one of these yet. The M2 (for the bottom "half" of the Mac lineup) will probably stay coupled to the iPad Pro over the long term for volume cost (economies of scale) reasons.

Bigger die ... it is not necessarily true that they have moved on from the foundational core microarchitecture implementations. Timing on the MBP 16" could have been thrown off by several other components necessary to make it.

There is some 'new' vector stuff in Arm v9, but it is pretty doubtful that Apple went for the "super wide" option there, so I'm not seeing the big upside. Apple already has a "chunky" AMX matrix component they add to their P cores. And they want to add lots of P and GPU cores. Bigger P and GPU cores aren't really going to help a lot if they're going for a higher count. They are already going to suck up much more die area cranking up the count, adding memory controllers to keep those additional cores fed, and working out a bigger "interconnect" to glue it all together.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,523
19,680
There is some 'new' vector stuff in Arm v9, but it is pretty doubtful that Apple went for the "super wide" option there, so I'm not seeing the big upside. Apple already has a "chunky" AMX matrix component they add to their P cores. And they want to add lots of P and GPU cores. Bigger P and GPU cores aren't really going to help a lot if they're going for a higher count. They are already going to suck up much more die area cranking up the count, adding memory controllers to keep those additional cores fed, and working out a bigger "interconnect" to glue it all together.

SVE offers a future-proof SIMD programming model, much better vectorization opportunities, gather/scatter, masked operations, and so on. It would be a big update for any general-purpose code that utilizes the SIMD capabilities of the CPU (and they are everywhere today, from hash tables to UTF-8 processing or JSON parsing). Apple already offers a wide SIMD design with 4x128-bit ALUs, and I can see them adding another one or two in the near future. They don't even need to use wider SIMD; 128-bit is enough.
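
To make the "masked operations" point concrete, here is a minimal sketch using the standard ACLE SVE intrinsics in C. It's illustrative only: no shipping Apple core exposes SVE, and the function name is made up. The same binary runs unchanged on any SVE vector width, with the predicate register absorbing the loop tail:

Code:
#include <arm_sve.h>
#include <stdint.h>

/* Sum an array of floats, vector-length agnostic: the loop never needs
 * to know whether the hardware vectors are 128 or 2048 bits wide, and
 * the predicate handles the tail with no scalar cleanup loop. */
float sum_f32(const float *x, int64_t n) {
    svfloat32_t acc = svdup_n_f32(0.0f);
    for (int64_t i = 0; i < n; i += svcntw()) {   /* svcntw() = floats per vector */
        svbool_t pg = svwhilelt_b32_s64(i, n);    /* mask off out-of-range lanes */
        svfloat32_t v = svld1_f32(pg, x + i);     /* predicated (masked) load */
        acc = svadd_f32_m(pg, acc, v);            /* predicated add */
    }
    return svaddv_f32(svptrue_b32(), acc);        /* horizontal reduction */
}

(Build with something like clang -O2 -march=armv8-a+sve.) The contrast with NEON is that a fixed 128-bit loop needs a separate scalar epilogue and has to be rewritten to exploit wider hardware.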
 

Appletoni

Suspended
Mar 26, 2021
443
177
Kinda makes me wonder if stuff like Stockfish is just hitting the GPU cores or if it is going to the ML block (which is where you would expect stuff like it does to go). In other words, so programs like Stockfish actually take advantage of M1 being an SOC?
You mean LC0.
 

Appletoni

Suspended
Mar 26, 2021
443
177
SVE offers a future-proof SIMD programming model, much better vectorization opportunities, gather/scatter, masked operations, and so on. It would be a big update for any general-purpose code that utilizes the SIMD capabilities of the CPU (and they are everywhere today, from hash tables to UTF-8 processing or JSON parsing). Apple already offers a wide SIMD design with 4x128-bit ALUs, and I can see them adding another one or two in the near future. They don't even need to use wider SIMD; 128-bit is enough.
For some people, 4x 128-bit is enough.
For other people, 4x 256-bit is enough.
For most people, 4x 512-bit will be enough.
 

EntropyQ3

macrumors 6502a
Mar 20, 2009
718
824
For some people, 4x 128-bit is enough.
For other people, 4x 256-bit is enough.
For most people, 4x 512-bit will be enough.
While it may not be apparent why, vector length is an architectural subject where ”longer” doesn’t necessarily translate into ”better”. Even for vector code, it’s a game of trade-offs. And for general-purpose SoCs where you also have a GPU, an NPU, and in Apple's case specialised AMX instructions that can all offload the CPU for some types of problems, AND you have to keep a very keen eye on power draw, well, suffice to say that the balancing act is beyond even rather sophisticated forum warriors.
 

neilw

macrumors 6502
Aug 4, 2003
459
930
New Jersey
Another rumour for an August launch. Not sure how they will justify an M1X with a one-year-old core architecture.
If the new chip yields a computer that is significantly faster and has longer battery life than the old one, what more justification do they need? I don't think we have any reason to believe that the M2 family is going to represent some giant leap over the M1 family that suddenly makes all the M1 machines seem obsolete.

Similarly, I had hoped the new 24" iMacs were going to be equipped with an M1X... but when you come right down to it, the M1 iMac is way faster and quieter than the Intel iMac it replaced, so what else really matters?
 

leman

macrumors Core
Oct 14, 2008
19,523
19,680
I don't think we have any reason to believe that the M2 family is going to represent some giant leap over the M1 family that suddenly makes all the M1 machines seem obsolete.

The next generation of Apple Silicon could bring SVE/SVE2, faster memory, as well as hardware raytracing support. Not to mention improved display and peripheral support. I think the improvements over the M1 are going to be substantial indeed.

Then again, it would be silly for Apple to release a pro machine based on the same architecture as the M1 if the next generation is indeed much better. The signs point to prosumer hardware being based on the next-gen architecture, not the M1.
 

CWallace

macrumors G5
Aug 17, 2007
12,528
11,545
Seattle, WA
If the new chip yields a computer that is significantly faster and has longer battery life than the old one, what more justification do they need? I don't think we have any reason to believe that the M2 family is going to represent some giant leap over the M1 family that suddenly makes all the M1 machines seem obsolete.

We also need to remember Apple is not going to be spending billions on marketing campaigns with dozens of vendors crowing about how much better the next generation of M SoCs is over the previous like Intel does every time they release a new generation of Core CPUs.

So the significant majority of Mac customers likely won't even know what generation of SoC is in the various Macs on the table in front of them. What they will know (or be told) is what the battery life is like and what types of workflows each is optimized for. So they won't know a 2021 MBA has an M2 and a 2021 14" MBP has an M1X, but they will know the MBP can handle more complex workflows, has (probably) slightly lower battery life than the MBA, and costs some dollars more.

And those that do know the generation (i.e., us) will also know how they truly compare to each other: in what areas an M1X is much more powerful than an M2, what extra expansion capability it offers, and such.
 
  • Like
Reactions: neilw

Appletoni

Suspended
Mar 26, 2021
443
177
The next generation of Apple Silicon could bring SVE/SVE2, faster memory, as well as hardware raytracing support. Not to mention improved display and peripheral support. I think the improvements over the M1 are going to be substantial indeed.

Then again, it would be silly for Apple to release a pro machine based on the same architecture as the M1 if the next generation is indeed much better. The signs point to prosumer hardware being based on the next-gen architecture, not the M1.
Raytracing cores sounds good:)
And Tensor cores?
 

leman

macrumors Core
Oct 14, 2008
19,523
19,680
Raytracing cores sounds good:)
And Tensor cores?

"Tensor cores" and "raytracing cores" are Nvidia's marketing names. Apple can't ship them by definition since they are not shipping Nvidia products.

Apple has been offering an equivalent of "tensor cores" for years now — the Neural Engine and the AMX coprocessor. Both of these technologies exist in the M1. As for raytracing, Apple is licensing technology from Imagination, and it is as yet completely unclear in what shape it will be presented to customers. One thing is sure: Apple has been investing a lot into raytracing recently, so they probably have some interesting technology to show off.
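
As a concrete illustration of the AMX point: the supported way to reach that hardware is through Apple's Accelerate framework rather than direct instructions. A minimal sketch in C, with the caveat that routing BLAS calls to AMX is widely reported behavior on Apple silicon, not something the public API promises:

Code:
#include <Accelerate/Accelerate.h>
#include <stdio.h>

/* 2x2 single-precision matrix multiply, C = A*B, via Accelerate's CBLAS
 * interface. On Apple silicon this path is widely reported to be serviced
 * by the AMX coprocessor. Expected output: 19 22 / 43 50. */
int main(void) {
    float A[4] = {1, 2, 3, 4};
    float B[4] = {5, 6, 7, 8};
    float C[4] = {0};
    cblas_sgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                2, 2, 2,        /* M, N, K */
                1.0f, A, 2,     /* alpha, A, lda */
                B, 2,           /* B, ldb */
                0.0f, C, 2);    /* beta, C, ldc */
    printf("%.0f %.0f\n%.0f %.0f\n", C[0], C[1], C[2], C[3]);
    return 0;
}

(Build with clang demo.c -framework Accelerate.)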
 

leman

macrumors Core
Oct 14, 2008
19,523
19,680
I am definitely hoping for that, but I don't think it's a tragedy if it's not.

To be honest, I do. If the pro-level Macs come out without any significant new hardware features, that's a bad sign for anyone expecting Apple to innovate.
 

CWallace

macrumors G5
Aug 17, 2007
12,528
11,545
Seattle, WA
Considering how early we are in the "M Era" and how much better the first generation of M was than most of its 9th- and 10th-generation Intel peers, I am not worried about M1X vs. M2 for the Late 2021 / Early 2022 Macs.

If it's 2025 and M is still effectively that year's iPhone SoC with more cores and RAM, then I will start to worry about "innovation". ;)
 

leman

macrumors Core
Oct 14, 2008
19,523
19,680
Considering how early we are in the "M Era" and how much better the first generation of M was than most of its 9th- and 10th-generation Intel peers, I am not worried about M1X vs. M2 for the Late 2021 / Early 2022 Macs.

What I am worried about is Apple Silicon momentum. Take raytracing, for example. If the upcoming Macs support hardware raytracing with performance that is competitive with Nvidia's high-end solutions (which is not impossible!), that would generate a lot of interest from visual professionals and gamers. If this feature never comes or comes late, then Apple is again just playing catch-up. A similar consideration goes for SVE support. It requires support for new machine instructions, so it is critical to introduce it *now*, while Apple Silicon is still a fairly new target for professional and scientific software.
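
To illustrate why "late" hurts: once an ISA feature ships after the software ecosystem has settled, every vectorized library has to carry a dispatch check and a fallback path for years. A minimal sketch in C using the standard ACLE feature-test macro (no shipping Apple core advertises SVE today, so on current Macs only the fallback branch is ever compiled):

Code:
#include <stdio.h>

#if defined(__ARM_FEATURE_SVE)
#include <arm_sve.h>
#endif

int main(void) {
#if defined(__ARM_FEATURE_SVE)
    /* svcntb() reports the hardware vector length in bytes at run time. */
    printf("SVE build: vector length is %llu bits\n",
           (unsigned long long)svcntb() * 8);
#else
    printf("No SVE in the target ISA: shipping the NEON/scalar fallback.\n");
#endif
    return 0;
}

Every year the instructions slip is another year binaries like this default to the fallback path.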
 

macsplusmacs

macrumors 68030
Nov 23, 2014
2,763
13,275
What I am worried about is Apple Silicon momentum. Take raytracing, for example. If the upcoming Macs support hardware raytracing with performance that is competitive with Nvidia's high-end solutions (which is not impossible!), that would generate a lot of interest from visual professionals and gamers. If this feature never comes or comes late, then Apple is again just playing catch-up. A similar consideration goes for SVE support. It requires support for new machine instructions, so it is critical to introduce it *now*, while Apple Silicon is still a fairly new target for professional and scientific software.

And since "real-time ray tracing" is becoming the rage in some AAA games, that would be another reason to support ray-tracing cores. Not that Apple will compete as a game machine, but it would be another checkmark for those professionals to simply say, "Oh, the M1XXX does that as well? Maybe I need to check these systems out."
 