Sure, but that's just the cost of doing business. Bandwidth needs go up over time. The answer to that is relatively obvious, and it's not challenging for Apple to know when they need to deal with that. They've repeatedly shown (in the various M chips) that they're willing to spend resources to get this right.

Apple is skipping ECC primarily because they don't have enough bandwidth in their system design. So it is more that they have repeatedly shown they will do what is convenient for their immediate interests. Scaling at the top end ... that has not gone without hiccups.


If the internal network bandwidth is saturated, it may have come down to a choice between adding a P-core cluster or an E-core cluster.
That seems highly unlikely. Like, 100% not likely. Area is a much more compelling argument.

Apple puts the P (and E) core cluster on a bandwidth budget that is smaller than the aggregate memory bandwidth. A single P core can soak up the whole cluster allocation, but once all four are going full blast they all have to share. There is not unlimited bandwidth from every point to point on the internal network. Very likely there are QoS constraints. So yes, where QoS constraints get imposed, the bandwidth allocations likely are saturated at some point.

A bigger, even faster internal bus with 'excess' headroom bandwidth consumes more power. More power is lower Perf/Watt. Pretty likely Apple has an internal bus that is 'fast enough' but without tons of slop in excess bandwidth.
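As a rough back-of-the-envelope illustration of that kind of per-cluster budget (all numbers below are hypothetical placeholders, not Apple's actual figures):

[CODE=swift]
// Hypothetical numbers purely for illustration; Apple does not publish these.
let aggregateMemoryBandwidth = 200.0   // GB/s available from the memory controllers
let clusterBandwidthCap      = 100.0   // GB/s budget granted to one P-core cluster
let perCoreDemand            = 90.0    // GB/s a single P core could soak up flat out

// One core running alone fits comfortably inside the cluster's allocation...
let soloCore = min(perCoreDemand, clusterBandwidthCap)                    // 90 GB/s

// ...but four cores going full blast have to share the same cap,
// so each one sees far less than it could use on its own.
let activeCores = 4.0
let perCoreShare = min(perCoreDemand, clusterBandwidthCap / activeCores)  // 25 GB/s

print("Aggregate \(aggregateMemoryBandwidth) GB/s, cluster cap \(clusterBandwidthCap) GB/s")
print("Solo: \(soloCore) GB/s per core; all four active: \(perCoreShare) GB/s per core")
[/CODE]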

The context here is that folks are just throwing 'extra' P/E clusters at the network because they are not happy with how high the core count is... higher core count, more horsepower... And there is probably a target budget for bandwidth/power.




Your argument (not quoted) about power efficiency is more interesting, and it's why I said I don't believe the (still plausible) 6E cluster idea. But I also doubt the 2E cluster idea. They moved phones from 2P2E to 2P4E, which I think says it all.

That doesn't say it all in the slightest. The move isn't to 2E, it is to 6E. You are just doing misdirection. The baseline 4E from the phones that you are trying to imply is somehow missing is going NOWHERE. It is 4+2. The 4 is still there. It is just being augmented.

The phones don't have 14-16" screens soaking up power when doing mostly nothing. The laptops do. The 2E core only mode wouldn't be the most common, but the variety of contexts the laptop has to operate in is larger than that of a phone.











Your (also unquoted) argument that another 2 E cores (going from 6 to 8) is too expensive in die area seems unlikely to me. I mean, I can easily see the engineers deciding "we just don't need more E cores", but saying "we don't have room for 8, but we have room for 6" seems very unlikely. The E cores are really small. And unless you redesign the shared cache and the rest of the cluster support, you're not saving any space on that.

It is not about the die space allocation for the E cores themselves, it is about the total die space limitation. There is other stuff that is likely looking for more space. Apple has bigger-than-normal display controllers. They are trying to move more SMC/PMIC stuff onto the die. The die is an 'everything and the kitchen sink' collection of stuff. The limitations on the M1 Pro/Max got 2 E cores shaved off (and they got 'unshaved' when the M2 bloated out to a bigger die area budget). Is it impossible for that to happen again? Probably not.


Current pricing suggests they see a lot of elasticity in Mac Pro pricing. If they had it working reasonably well, I think they'd have built it.

Lots of elasticity? The Mac Pro top-end price came significantly down, not up. Prices coming down means you do NOT have major elasticity. People want more value for the dollar.

The other major problem that Apple has now is that to get to higher CPU core counts you also have to buy more RAM and more GPU cores. Likewise, max RAM (have to buy more other stuff), max GPU cores (have to buy more other stuff). All that deep coupling is far more likely to drive people out of the price points they would like to pay. Hence, lower elasticity (the farther you push people off their 'comfortable' price point, the more moaning and groaning you are likely to get; the MP 2023 got GOBS of moaning and groaning over price). The Cupertino Kool-Aid story that may be spun inside Apple HQ is that among the folks left still paying for a new Mac Pro the elasticity is up. But that is tons of Kool-Aid drinking as a sizable chunk of former customers walk away. The MP 2019 and 2023 are not selling at 2010- or 2013-era unit levels.

An M3 with more performance at the current prices will help add more value for the dollar. That will help with the elasticity problem. Some eye-wateringly expensive Extreme M2 or M3 version won't. Even more so against the faction waving off-the-shelf Windows 4090/7900 XTX cards at them (versus the misdirection of Apple only counting the hyper-priced W6900X they sold to a small few). Or off-the-shelf DDR5 DIMMs.

Folks who need balance across CPU, GPU, and RAM all to the same relative degree, Apple has decent traction with. With the folks mainly focused on just one of those dimensions, they have major problems.
 
If Apple can get 20-30% without clock gains, then I’d say it’s even more likely we won’t see any significant clock gains, likely just staying in the 3.4 to 3.6 range again. But, hey that’s just me. I may be completely surprised, but one thing I won’t be is disappointed :)
Yes, for laptops, that would make a ton of sense. Take the efficiency win, while still racing ahead on performance.

For desktops, you can make a good case for driving clocks up a bunch. And maaaaybe for an MBP when not on battery, to the extent the cooling can handle it (which is not that well for sustained load, but for bursty workloads it's plausible).
 
Apple is skipping ECC primarily because they don't have enough bandwidth in their system design. So it is more that they have repeatedly shown they will do what is convenient for their immediate interests. Scaling at the top end ... that has not gone without hiccups.
How do you know that, about ECC?

You're certainly right about "without hiccups" or we'd have a better Mac Pro right now. Not sure how well that argument extends.

The rest of your post got lost in quoting errors. Please fix it so I can read it.
 
For desktops, you can make a good case for driving clocks up a bunch. And maaaaybe for an MBP when not on battery, to the extent the cooling can handle it (which is not that well for sustained load, but for bursty workloads it's plausible).
Oh I think there’s a case that could be made, I just don’t feel that Apple would care to listen. :) As long as mobile platforms remain their bread and butter, their money is going towards producing efficient, performant processors to go in those mobile systems. Then they’ll hold their moistened finger aloft to detect the winds of the desktop market and toss a bone in that general direction. The bone being the same cores, roughly the same clock speed, just more of them.

Lather rinse repeat until they don’t detect a breeze LOL
 
Unit sales ... what most people buy. (i.e., money, and a desire to get a targeted return on their investment).
The vast bulk of Mac sales are laptops. So Apple's SoCs, which are only going into Macs, are focused on what they mostly sell.

Not just an Apple thing though. The vast bulk of Windows sales are laptops also. It is just a bigger pond. Apple has around 10% of the worldwide market and Windows 90%. 5% of 90% is 4.5% (approximately 5%). 5% of 10% is 0.5% (approximately 0%). A very small niche of a small niche is, as Steve Jobs once put it, "Nobody buys it".



Those two are not entirely decoupled. The Ultra is a "not so great" chiplet design mainly because it is hobbled by being optimized as a monolithic laptop die. It is only pressed into the 'side hustle' of pretending to be a chiplet as a tacked-on 'add-on'.

The Intel Mac Pro was extremely dependent upon Xeon SP (server) sales of that base silicon to be viable. There are no Linux/Windows Server unit sales to carry the bulk of the volume for the M-series based Mac Pro. Apple going with their own homegrown chip means they are completely detached from the Arm server market (Amazon and Ampere are doing reasonably well there; no real 'hole' for Apple to fill even if they were in the server business, which they are not, and were not even in the Intel era when it was easier to do).

The new Mac Pro isn't a joke. However, it is not looking to the past for inspiration either. Hypermodularity isn't a focus. Old application and driver architectures from 10 years ago are not either. Apple is extremely focused on optimizing the Apple GPU stack. Period. That has impact across just about all of their product lines (not just Macs; even in the Mac space, the vast bulk of all software is deployed where the Apple GPU is extremely dominant: laptops and the lower 'half' of the desktop space). There is a reality of who is actually paying for the work (those systems are where the revenue basically is; money 'talks').

The Mac Pro isn't going to cover the same workload space as it covered before. But it also isn't some 'joke' of a small space either. It really wasn't all that large to begin with anyway.
I feel like everything you are saying is based on what actually exists now, rather than on what could be.

As for the Mac Pro. The 2019 Intel Mac Pro came about because Apple started losing corporate and creative customers in the years following the 2016 Butterfly Generation releases (e.g. I used to follow a famous professional photographer who put out regular blogs on his techniques and equipment. I was appalled, and yet not at all surprised, when he ditched Apple for Windows, due to how far behind Apple was falling in hardware quality and capability, and creative software updates).

Apple, to their credit, created a research working group, in which they invited professional users in, gave them space and equipment to work with, observed them, and asked them questions about what they do and what they want and need.

This resulted in massively improved hardware such as the 2019 Intel Mac Pro and the 2019 16" MBP. The Mac Pro in particular was a seriously impressive machine, with a huge array of self-upgrade parts and options, including things such as up to 1.5TB of RAM.

The current AS Mac Pro, sheesh, it is a shadow of the 2019 machine. Massively hamstrung by the restrictive limitations on customising and upgrading parts and options.

The overriding problem I see is that Apple is so damn focused on profits above all else, that it hasn't the imagination to create products purely for the joy of creating great products for niche markets, even if the revenues aren't super massive.

For example, who cares if the iPhone Mini only sells to 1% of iPhone customers. 1% of uberbazillions of total iPhone production is still an impressive number of phones. The sales volume of the iPhone Mini would still be so large that many smaller phone companies could only dream of such numbers. Sure, it wouldn't have added much to Apple's bottom line, but I bet it didn't actually make a loss, and even if it did, is it such a bad thing for a giant like Apple to have a little side line loss leader that gains good will from the 1% of customers that vehemently wish for a smaller physical phone size?

Same goes for Apple's discontinued, but much loved, wifi router. And so on and so on.

Same goes for all the customers that wish for the 12" laptop footprint.

And what is with the infatuation with somehow assuming that customers who want pro features also want large form factors, and vice versa? Why was there no iPhone Mini Pro? Or, until recently, a 15/16" MBA (thankfully, they did just put out the 15" MBA, about bloody time)? But then, why is the MBP in the slightly larger 14/16", and the MBA a smaller 13/15"? Why did they have to kill the 17" MBP so many years ago? Why do you have to have the M2 Pro chip to get more than 24 GB RAM? And the M2 Max chip to get more than 32 GB RAM? Chip power and RAM do NOT necessarily go hand in hand. More RAM holds more data, that's all; it doesn't mean you need to crunch swaths of that data with complex algorithms, it may just mean you simply want to view it fast. Why can't you get more than 2TB of SSD on an MBA?

Apple seems to cull any product that doesn't clean up a ton of profits. It's a damn shame. Just let these little departments run. No harm done, and a handful of customers absolutely love these products.
 
The cynic in me thinks Apple bought all the 3nm TSMC capacity to try and "buy time", to stagnate progress from Nvidia and AMD, to allow Apple to close the gap a bit more with their iGPUs. Apple is at least 3 yrs behind on the GPU front compared to Nvidia and needs to take big steps to actually close that gap. Right now you are paying $4-5K to get mid-tier, last-gen Nvidia performance. AI/ML is even worse; Apple is probably closer to 5 yrs behind from what I have seen, with an Ultra comparing more to a GTX 1080 in ML training and inference than the usual 3070 Ti comparison in 3D.
 
Every time they add new cores, developers will have to update or rewrite software to utilize those cores. We find this on the PC side, where much of the software isn't written to utilize more than one or two cores. I think with every upgrade cycle Apple software developers are going to have to go back to the drawing board to incorporate the new CPU and GPU cores, or just ignore that they are there, because the addition of a few new cores really doesn't advance the speed of rendering or processing in any important way.
Cool, the GPU has 26 cores.
In well-written programs, threads are created based on how much the CPU can handle. The big jump was going to multiple threads at all, and somewhat in going from 2 to more. Once the program is adapted to that, there is not much work in any further jump. The question is how much it benefits from more threads, and that depends on the task (how much of the work is strictly serial) and on the overhead of handling multiple threads.

Anything running on GPUs is easily parallelizable by default; I'd be surprised if there is any issue there. Games are a little different, but the problem is on the CPU side and comes from the requirement of having fast feedback to user input.
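To make that concrete, here is a minimal sketch (in Swift, with an illustrative summing workload that is not from the post) of the pattern being described: size the work to whatever core count the machine reports instead of hard-coding it, so the same code keeps scaling as Apple adds cores.

[CODE=swift]
import Foundation

// Illustrative workload: sum a large array, split across however many cores exist.
let items = Array(0..<1_000_000)

// Ask the OS how many cores are available instead of assuming a chip generation.
let coreCount = ProcessInfo.processInfo.activeProcessorCount
let chunkSize = (items.count + coreCount - 1) / coreCount

var partialSums = [Int](repeating: 0, count: coreCount)
partialSums.withUnsafeMutableBufferPointer { buffer in
    // libdispatch fans the chunks out across the available P and E cores.
    DispatchQueue.concurrentPerform(iterations: coreCount) { chunk in
        let start = chunk * chunkSize
        let end = min(start + chunkSize, items.count)
        guard start < end else { return }          // trailing chunks may be empty
        var sum = 0
        for i in start..<end { sum += items[i] }   // the parallel portion scales with cores
        buffer[chunk] = sum
    }
}

let total = partialSums.reduce(0, +)               // the small, strictly serial portion
print("Summed \(items.count) items on \(coreCount) cores: \(total)")
[/CODE]

The per-chunk loop is the part that automatically benefits from more cores; the final reduce is the serial remainder the post mentions, which is what ultimately caps the gain from adding yet more cores (Amdahl's law).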
 
That's a f*cking shame that people are forced to use third-party gadgets for this. At least implement the bloody DisplayPort daisy chaining in macOS finally.
DisplayPort daisy chaining wouldn’t have helped; the GPU still needs a third display buffer to render the screen, which the M1 and M2 don’t have.
 
Which is a joke for the price we’re paying. $400 machines can do it.
I agree they really should have had that. In 2020, even the i3 Intel MBA could output to 2 (or more) displays, while months later the M1 Air couldn’t, which is more or less the only thing downgraded.

That said, I don’t see Apple being too eager to, since the same M1/M2/M3 chip also goes into iPads and iMacs, where 99% of the target audience aren’t expected to ever use more than one external screen. Adding an otherwise unused display buffer is just wasted silicon budget, in other words.
 
No one expects a person who owns an M2 would upgrade to an M3. That would be almost pointless.

The person buying an M3 is likely upgrading from a 3 to 6 year old Mac that would still have an Intel chip inside.

I'd be surprised if there's a significant number of people who upgrade a Mac from generation-to-generation.
I completely agree guys, I'm just merely stating that you'd expect performance increases every generation.
 
I agree they really should have had that. In 2020, even the i3 Intel MBA could output to 2 (or more) displays, while months later the M1 Air couldn’t, which is more or less the only thing downgraded.

That said, I don’t see Apple being too eager to, since the same M1/M2/M3 chip also goes into iPads and iMacs, where 99% of the target audience aren’t expected to ever use more than one external screen. Adding an otherwise unused display buffer is just wasted silicon budget, in other words.
I don't think the silicon costs would be particularly substantial. Intel CPUs have been able to do this for a long time with only a tiny fraction of the transistor counts, and the iGPUs in these Intel chips were even smaller. These were seriously tiny compared to the M1/M2 chips.

If Apple wanted to put these in there, they could have very easily done so without really making many sacrifices to do it. At first I thought it was just a first generation product limitation, but after the M2 came out with the same limitations, I have started to believe this is much more likely an upselling thing for Apple.
 
I have an M1 Mac Studio Max with the 32-core GPU. I will only upgrade this computer to an M3 Max if I see a significant improvement in both CPU and GPU. At this moment, it looks like the CPU will be a lot faster with the M3 chip, but I still have doubts about only 40 GPU cores at maximum. We'll see. I am looking forward to benchmarks.
 
If Apple wants to get out of the Mac sales niche in comparison to Windows PCs, they must deliver at or above the AMD/Nvidia 4080 performance level (M3 Ultra).
Very few people will buy a Mac for gaming with the current Apple tax rate. I can buy a MacBook Pro for work and a PC with top hardware for less than the price of the current M2 Ultra Studio.

For a lot of professionals, that level of performance would mean nothing unless it has CUDA capabilities. Well, anything Apple Silicon won't have that.

And finally, Apple Silicon itself is a problem. What works wonders in lower segments makes the higher end messy. I'd be happy to have a 24-core CPU, but I'd be fine with an 8-core GPU. Others would be fine with 8 CPU cores but would want the 60-core GPU. A lot of people start paying for unnecessary hardware at a very high premium. But Tim Apple worships the God of supply chain management, and that is an unforgiving God.

I love to use Macs, and I like how they integrate with my other devices, but with their current pricing and product situation what makes sense is to buy the cheapest possible Mac and use it as a thick client to a much more powerful PC/VM.
 
And finally, Apple Silicon itself is a problem. What works wonders in lower segments makes the higher end messy. I'd be happy to have a 24-core CPU, but I'd be fine with an 8-core GPU. Others would be fine with 8 CPU cores but would want the 60-core GPU. A lot of people start paying for unnecessary hardware at a very high premium. But Tim Apple worships the God of supply chain management, and that is an unforgiving God.
That's part of what I liked about the idea of having the unbinned M2 Pros for the laptops. You can still get virtually the same CPU performance without the extra GPU cores if you wanted it (which is how I configured mine).

If I were Apple, I'd take advantage of the fact that Intel is desperately trying to earn Apple's business back. I'd say "hey, you already know how to write GPU drivers for MacOS since we have millions of Intel ones that were shipped with your Iris Pro chipsets. Think you can take some of that work and port it to Apple Silicon for your Arc GPUs? (for the Mac Pro, where there are PCIe slots)"

Probably wouldn't be the most profitable venture for either party, since the majority of people would just stick with the ones integrated into the M2 Ultra series chipsets. That's part of the problem. There wouldn't really be anything stopping Apple from allowing it, but a company would have to feel like they could sell enough GPUs for a very limited and niche market for it to be worth it for them to create the drivers.
 
If I were Apple, I'd take advantage of the fact that Intel is desperately trying to earn Apple's business back. I'd say "hey, you already know how to write GPU drivers for MacOS since we have millions of Intel ones that were shipped with your Iris Pro chipsets. Think you can take some of that work and port it to Apple Silicon for your Arc GPUs? (for the Mac Pro, where there are PCIe slots)"

If they go the third-party GPU way, it should be Nvidia because of CUDA.
 