
AltecX

macrumors 6502a
Oct 28, 2016
550
1,391
Philly
Apple had TB for 5-7 years before most PCs, starting from 2011. It was an Apple (+ Intel) co-developed technology.
We shall see this time....
(That was Apple with Steve/Jony. Now it's Tim.) :oops:
PCs were getting Thunderbolt by the end of 2011, and it started to show up in greater numbers by mid-2012. For the most part consumers just didn't care until v3 and v4, so there wasn't much driving force to add more than one port, if that. Even in most Mac environments I've supported, the TB port usually went unused other than for video out until it started to share the same port as USB.
 

DrWojtek

macrumors regular
Jul 27, 2023
187
401
Really looking forward to the M4 announcement. I think the GPU is bound for a nice upgrade. The M3 set the stage with the new abilities (dynamic caching, etc.); now the arch and the cores themselves are due for an upgrade.

If M4 turns out to be just about AI then I could not be more disappointed. On the other hand I won’t feel bad buying the ’old’ M3 Mini while the M4 is already out, so that’s good.
 

diamond.g

macrumors G4
Mar 20, 2007
11,435
2,659
OBX
Really looking forward to the M4 announcement. I think the GPU is bound for a nice upgrade. The M3 set the stage with the new abilities (dynamic caching, etc.); now the arch and the cores themselves are due for an upgrade.

If M4 turns out to be just about AI then I could not be more disappointed. On the other hand I won’t feel bad buying the ’old’ M3 Mini while the M4 is already out, so that’s good.
I thought the GPU cores in M3 were new?
 

DrWojtek

macrumors regular
Jul 27, 2023
187
401
They are new. Although I do hope, just like @DrWojtek seems to, that the M4 will have even newer GPU cores. I'd like to see concurrent FP32+FP32 and FP16+FP16 execution, as well as more formats for matrix multiplication.
It would be really nice to see the base M4 model be fairly capable of some light gaming. I mean really capable, not just able to start Baldur's Gate on the lowest settings, or however the M3 fares.

If Apple are serious about gaming on the Mac, the Mini could be a perfect entry point for people to try it. Especially if it supports hand controls.
 

precision01

macrumors regular
Oct 16, 2014
111
87
Apple can't afford not to be the ARM leader in CPU and GPU performance. That's why they will probably update M-series chips every 12 months for the next 2 or 3 years.
M1 Pro/Max was released in 2021.
M2 Pro/Max was ready to be launched in 2022.
M3 Pro/Max was released in 2023.
M4 Pro/Max will likely be released in 2024.
 
  • Like
Reactions: souko

precision01

macrumors regular
Oct 16, 2014
111
87
It would be really nice to see the base M4 model be fairly capable of some light gaming. I mean really capable, not just able to start Baldur's Gate on the lowest settings, or however the M3 fares.

If Apple are serious about gaming on the Mac, the Mini could be a perfect entry point for people to try it. Especially if it supports hand controls.
Check the M3 Max chip layout. There are a lot of possibilities for a game-focused M4 chip: fewer CPU cores... more GPU cores... get rid of the video encoders... fewer display engines... cut some other stuff on the chip... put in only 24GB of RAM...
 

diamond.g

macrumors G4
Mar 20, 2007
11,435
2,659
OBX
Check the M3 Max chip layout. There are a lot of possibilities for a game-focused M4 chip: fewer CPU cores... more GPU cores... get rid of the video encoders... fewer display engines... cut some other stuff on the chip... put in only 24GB of RAM...
Keep the encoders for OBS live-streaming...
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Apple can't afford not to be the ARM leader in CPU and GPU performance. That's why they will probably update M-series chips every 12 months for the next 2 or 3 years.
M1 Pro/Max was released in 2021.
M2 Pro/Max was ready to be launched in 2022.
M3 Pro/Max was released in 2023.
M4 Pro/Max will likely be released in 2024.

Didn't happen that way.


MBP M1 2021 (10/21) [there was almost a year's delay after the plain 'M1' here]
MBP M2 2023 (01/23, not 2022. You can claim they wanted to launch it in 2022, but they did not.)
MBP M3 2023 (10/23) [co-launched with the plain M3, which didn't happen in the previous two iterations]


The M3 knocking off the M2 "prematurely" is just as likely to push the M4 back as to move it forward. M1-M3 have a major security hole to be plugged. N3P has trade-offs for bigger dies (which has a bigger negative impact for the Max). Relatively rapidly killing off the biggest die (which is the most expensive to produce) isn't going to be affordable over the long term.

There are no "hand-me-down" products for the Max die (as opposed to the plain Mn trickling down into the iPad Air, or the iPhone's Ann into the lowest-end basic iPad).
 

name99

macrumors 68020
Jun 21, 2004
2,407
2,309
Extreme niche product. Highly unlikely to ever happen, as it likely wouldn't be worth making.
You cannot argue with gamers. They are utterly convinced they are the center of the world, regardless of all evidence to the contrary.

You can REALLY see this in their responses to Blackwell; like they consider it some sort of personal insult that nVidia "unexpectedly" "let them down" by delivering something that's so uninteresting for gaming. Truly an insane level of delusion as to where nVidia's priorities lie (and have done so for the past few years).
 

DrWojtek

macrumors regular
Jul 27, 2023
187
401
You cannot argue with gamers. They are utterly convinced they are the center of the world, regardless of all evidence to the contrary.

You can REALLY see this in their responses to Blackwell; like they consider it some sort of personal insult that nVidia "unexpectedly" "let them down" by delivering something that's so uninteresting for gaming. Truly an insane level of delusion as to where nVidia's priorities lie (and have done so for the past few years).
I am not by any means a gamer, and I do not see Apple releasing a gaming-specific SoC.

What I am saying is: I occasionally play a game. If the only thing I'm getting is more performance in that area, I might not be able to justify upgrading the work computer, since it handles my stuff just fine. But if I knew I would gain some FPS in my occasional game sessions, and gain performance for my work tasks as a bonus, that just might push me to upgrade.

And a whole bunch of people who do game would consider a Mac if they could game on it. But few gamers can afford a fully specced PC AND a Mac, so they choose the PC.
 

MRMSFC

macrumors 6502
Jul 6, 2023
371
381
And a whole bunch of people who do game would consider a Mac if they could game on it.
I’m sure those three people will continue to be disappointed.

There's extremely little overlap between people who play PC games and Mac buyers.
 
  • Haha
Reactions: diamond.g

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
Check the M3 Max chip layout. There are a lot of possibilities for a game-focused M4 chip: fewer CPU cores... more GPU cores... get rid of the video encoders... fewer display engines... cut some other stuff on the chip... put in only 24GB of RAM...
Keep the encoders for OBS live-streaming...
Extreme niche product. Highly unlikely to ever happen, as it likely wouldn't be worth making.

Combine the following:
  1. Mn CPU cores, video encode/decode engines, & display engines
  2. Mn Pro RAM capacity & UMA bandwidth
  3. Mn Max GPU cores
I give you the Mn Gaming SoC...!
 

T'hain Esh Kelch

macrumors 603
Aug 5, 2001
6,472
7,405
Denmark
Combine the following:
  1. Mn CPU cores, video encode/decode engines, & display engines
  2. Mn Pro RAM capacity & UMA bandwidth
  3. Mn Max GPU cores
I give you the Mn Gaming SoC...!
Sooo… Why not just use a Max? Price-wise it won't be much cheaper, since you are adding a whole new SKU for a niche group of people.
 
  • Like
Reactions: MRMSFC

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Really looking forward to the M4 announcement. I think the GPU is bound for a nice upgrade. The M3 set the stage with the new abilities (dynamic caching, etc.); now the arch and the cores themselves are due for an upgrade.

M3 is a new architecture. Even Apple refers to it as a major "new GPU architecture". Not sure how they would do dynamic caching with absolutely zero collaboration between the cores and the cache (as if the cache were completely detached from the architecture anyway).

What Apple is a bit stale on is the memory subsystem (still basic LPDDR5), not the GPU arch. I would expect some tweaks and adjustments for higher raw RAM bandwidth access (and an updated on-chip backhaul network). But it would be surprising to see a "let us toss that out and do yet another new architecture" move on the M4.

Different clocking, some changes behind the GPU instruction decode, and a cache rebalanced to the new bandwidths seem more likely. They are very unlikely to get better general process node density (even if they wait to sync up with N3P). If there is no node density increase, then it is going to be kind of tough to add "more stuff". The design-aid tools may be better at dealing with N3-like contexts, so they could squeak out some incremental wins in some subsets of the logic design, but substantially large increases in the transistor budget likely are not coming next iteration.

Pretty good chance this will be an "improve what you have" generation ( like M2 over M1 ).

Apple released the M1 Ultra before WWDC that year and had a substantive number of hiccups with developers being able to optimize for that GPU hardware. They are basically in the same boat now. Throwing out new hardware while the tools and the developer knowledge are lagging isn't going to do much to help trade "hype headline blows" with Nvidia (or AMD).

If M4 turns out to be just about AI then I could not be more disappointed.

Probably not. If they uplift the raw RAM backhaul, then all the different computational subcomponents should get more data. That should 'lift all boats', not just a single anointed one. A bigger NPU would more easily share the chip network bandwidth with the GPU (and other units).

And it is now more publicly obvious that Apple has some substantive security holes to fill. It would be a good idea to fix those in the M4.

They might want to get leveled up on I/O to USB4 v2 (TBv5 on the non-plain SoCs; I still suspect the plain M4 won't meet the TBv4 requirements, or will perhaps hand-wave around them with the lid closed).

If they are going to continue to charge $400/TB... maybe offer read/write speeds competitive with the PCIe v4/v5 SSDs that will increasingly be standard in the 2025-26 timeframe.
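
For context, a rough calculation of what "competitive with PCIe v4/v5" means in raw numbers (these are theoretical x4 link ceilings; real SSD controllers land somewhat below them):

Code:
/* Rough PCIe x4 link bandwidth ceilings, to put "competitive with
   PCIe v4/v5 SSDs" into numbers. Real drives land below these. */
#include <stdio.h>

int main(void) {
    const double lanes = 4.0;
    const double gen4_gt = 16.0, gen5_gt = 32.0;   /* GT/s per lane */
    const double eff = 128.0 / 130.0;              /* 128b/130b line coding */

    double gen4_gbs = gen4_gt * lanes * eff / 8.0; /* GB/s */
    double gen5_gbs = gen5_gt * lanes * eff / 8.0;

    printf("PCIe 4.0 x4 ~ %.1f GB/s, PCIe 5.0 x4 ~ %.1f GB/s\n",
           gen4_gbs, gen5_gbs);                    /* ~7.9 and ~15.8 */
    return 0;
}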

Apple is still slacking on AV1 encode.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Combine the following:
  1. Mn CPU cores, video encode/decode engines, & display engines
  2. Mn Pro RAM capacity & UMA bandwidth
  3. Mn Max GPU cores
I give you the Mn Gaming SoC...!

If you double the number of GPU cores and chop the raw RAM bandwidth in half... how is that going to keep all those additional GPU cores 'fed' with data? It very likely won't work well on substantive GPU loads that don't primarily live in cache 99% of the time. You can do lots of hand-waving that the M4 will have extra Oompa Loompa caching magic to offset the bandwidth deficit... but Apple is already eyeball-deep in the 'bag of tricks' just by baselining on LPDDRx.

The bandwidth is deeply coupled to the RAM packages. You want to use fewer packages to drive the price down (i.e. lower the capacity), but that also cuts into the available bandwidth. There is a similar issue with trying to shrink the chip down to just the plain Mn CPU core counts. A smaller die means less edge space, which means fewer memory channels (and a lower total aggregate bandwidth).

If you saddle this Mn Max GPU core count with the same number of RAM packages that an Mn Max has, then the end-user system cost is effectively going to end up relatively close to the same. (Apple's $/GB RAM pricing is going to tip the balance toward being a very substantial driver of the price.)

The GPU core configuration for the Max will be tuned for the Max's bandwidth. Everything is integrated and coupled when it comes to tuning.
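
A quick back-of-the-envelope shows why that coupling matters; the bus widths, LPDDR5-6400 rate, and GPU core counts below are the commonly reported M3-family figures, used purely as illustrative assumptions:

Code:
/* Back-of-the-envelope: aggregate LPDDR5 bandwidth and GB/s per GPU core.
   Bus widths / core counts are commonly reported M3-family figures, used
   here only as illustrative assumptions. */
#include <stdio.h>

int main(void) {
    const double rate_mts = 6400.0;  /* LPDDR5-6400, MT/s per pin */
    const struct { const char *name; int bus_bits; int gpu_cores; } cfg[] = {
        { "M3",     128, 10 },
        { "M3 Pro", 192, 18 },
        { "M3 Max", 512, 40 },
    };

    for (int i = 0; i < 3; i++) {
        double gbs = rate_mts * cfg[i].bus_bits / 8.0 / 1000.0;   /* GB/s */
        printf("%-7s %6.1f GB/s total, %5.1f GB/s per GPU core\n",
               cfg[i].name, gbs, gbs / cfg[i].gpu_cores);
    }
    /* All three land near 8-10 GB/s per GPU core. Doubling the core count
       while halving the bus width would cut that ratio by roughly 4x. */
    return 0;
}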
 

diamond.g

macrumors G4
Mar 20, 2007
11,435
2,659
OBX
If you double the number of GPU cores and chop the raw RAM bandwidth in half... how is that going to keep all those additional GPU cores 'fed' with data? It very likely won't work well on substantive GPU loads that don't primarily live in cache 99% of the time. You can do lots of hand-waving that the M4 will have extra Oompa Loompa caching magic to offset the bandwidth deficit... but Apple is already eyeball-deep in the 'bag of tricks' just by baselining on LPDDRx.

The bandwidth is deeply coupled to the RAM packages. You want to use fewer packages to drive the price down (i.e. lower the capacity), but that also cuts into the available bandwidth. There is a similar issue with trying to shrink the chip down to just the plain Mn CPU core counts. A smaller die means less edge space, which means fewer memory channels (and a lower total aggregate bandwidth).

If you saddle this Mn Max GPU core count with the same number of RAM packages that an Mn Max has, then the end-user system cost is effectively going to end up relatively close to the same. (Apple's $/GB RAM pricing is going to tip the balance toward being a very substantial driver of the price.)

The GPU core configuration for the Max will be tuned for the Max's bandwidth. Everything is integrated and coupled when it comes to tuning.
They could up the size of the GPU SRAM to cover the LPDDR bandwidth deficit (basically what AMD does with Infinity Cache). Price-wise it may not be a good tradeoff, though.
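
To put the big-cache idea in rough numbers: the DRAM only has to carry what misses the cache, so the win scales directly with hit rate. A toy model (every figure here is an illustrative assumption, not a spec):

Code:
/* Toy model: how a large on-die cache offsets a DRAM bandwidth deficit.
   DRAM only has to serve the misses. All numbers are illustrative. */
#include <stdio.h>

int main(void) {
    const double demand_gbs = 400.0;  /* bandwidth the GPU cores want */
    const double dram_gbs   = 150.0;  /* assumed LPDDR budget of a cut-down die */

    for (int h = 0; h <= 90; h += 30) {
        double hit = h / 100.0;
        double dram_needed = demand_gbs * (1.0 - hit);
        printf("hit rate %2d%%: DRAM must supply %5.1f GB/s (%s)\n",
               h, dram_needed, dram_needed <= dram_gbs ? "fits" : "starved");
    }
    /* The cache only closes the gap if typical working sets actually hit it,
       which is the crux of the "is the SRAM worth the area" tradeoff. */
    return 0;
}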
 
  • Like
Reactions: UltimaKilo

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
They could up the size of the GPU SRAM to cover the LPDDR bandwidth deficit (basically what AMD does with Infinity Cache). Price-wise it may not be a good tradeoff, though.

SRAM is not scaling anymore (at least not for several more years). N3E goes BACKWARD from N3, back to N5 cell sizes.
Pouring on SRAM like ketchup isn't a viable path right now.
 
Last edited:
  • Sad
Reactions: diamond.g

Chuckeee

macrumors 68040
Aug 18, 2023
3,060
8,722
Southern California
I’m fairly certain it is about as fixed as it is going to get on the M3 with a flag to turn off the DMP.
I doubt it will get a simple M3 fix; the M3 design is locked down and it won't be modified for a security hole at this late stage in production. The bigger question is whether it will get fixed in the M4, and how much of the M4 layout hasn't been locked down yet.

Another question is whether this flaw is part of the Ax design and whether it represents an iPhone vulnerability. I would imagine the iPhone would face much greater exposure to exploitation in the wild.
 
  • Like
Reactions: tenthousandthings

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
I doubt it will get a simple M3 fix; the M3 design is locked down and it won't be modified for a security hole at this late stage in production. The bigger question is whether it will get fixed in the M4, and how much of the M4 layout hasn't been locked down yet.

Another question is whether this flaw is part of the Ax design and whether it represents an iPhone vulnerability. I would imagine the iPhone would face much greater exposure to exploitation in the wild.
There is a programmable flag on the M3 that turns off the DMP. I doubt Apple is going to do much more about this vulnerability. Turning off the DMP is sufficient.
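
For anyone curious what "the flag" looks like from software: the GoFetch researchers reported that on the M3, setting the ARM DIT (data-independent timing) bit also disables the DMP. A minimal sketch of toggling it around a sensitive section (arm64 Apple silicon assumed; this is illustrative, not a vetted hardening recipe, and secret_dependent_work is just a stand-in name):

Code:
/* Sketch: wrap a sensitive section with the ARM DIT bit, which on M3 has
   been reported to also switch the data memory-dependent prefetcher off.
   arm64 Apple silicon assumed; illustrative only. */
#include <stdint.h>

static inline uint64_t read_dit(void) {
    uint64_t v;
    __asm__ volatile("mrs %0, DIT" : "=r"(v));   /* read PSTATE.DIT */
    return v;
}

static inline void set_dit(int on) {
    if (on)  __asm__ volatile("msr DIT, #1");    /* request data-independent timing */
    else     __asm__ volatile("msr DIT, #0");
}

static void secret_dependent_work(void) {
    /* stand-in for a constant-time kernel (e.g. a crypto inner loop) */
}

int main(void) {
    uint64_t saved = read_dit();   /* remember the previous state */
    set_dit(1);
    secret_dependent_work();
    if (!saved)
        set_dit(0);                /* restore only if it was previously off */
    return 0;
}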
 
  • Like
Reactions: DrWojtek

leman

macrumors Core
Oct 14, 2008
19,517
19,664
I am not by any means a gamer, and I do not see Apple releasing a gaming-specific SoC.

I'd argue that current Apple GPUs are more gaming-specific than not. They are certainly not geared for scientific computation or machine learning. The major features Apple has been introducing recently are either general-purpose performance or graphics-focused.

Pouring on SRAM like ketchup isn't a viable path right now.

It is if one splits compute and cache functionality between different dies. And we know Apple is exploring that path. They’ve had patents describing this since 2021.
 