
seek3r

macrumors 68030
Aug 16, 2010
2,561
3,772
A person in our research group requested more storage and a 2nd GPU to increase our DL experiment workload. I installed an 8TB Sabrent NVMe drive on a PCIe adapter card, as well as an RTX A6000 GPU, in our Dell workstation (the other workstation we have is a custom build with a Threadripper). I then set up the Linux drivers and had the A6000 running PyTorch workloads within an hour, because Dell, unlike our favorite company, didn't throw a tantrum/hissyfit at NVIDIA, refuse to sign drivers, and screw over all their customers.

Finally, I also installed a 1060 we had floating around the lab to serve as a display-output GPU for the Dell workstation, if needed, so the A6000 can focus purely on compute tasks.
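
For anyone wiring up a similar compute/display split, here's a minimal sketch of pinning PyTorch to the compute card. The device index is an assumption - enumeration order varies, so check nvidia-smi first:

```python
import torch

# With both cards installed you should see two CUDA devices - the A6000
# and the 1060 - but the numbering depends on PCIe enumeration order.
for i in range(torch.cuda.device_count()):
    print(i, torch.cuda.get_device_name(i))

# Pin all DL work to the A6000 (assumed to be device 0 here). You can
# also launch with CUDA_VISIBLE_DEVICES=0 so the 1060 is never visible
# to PyTorch at all and stays free for display output.
device = torch.device("cuda:0")

x = torch.randn(4096, 4096, device=device)
y = x @ x                      # quick sanity check that compute works
torch.cuda.synchronize(device)
print("OK:", y.shape, "on", torch.cuda.get_device_name(device))
```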

A routine scenario: take a prebuilt workstation we bought and add parts to extend its functionality for years to come.

This scenario is apparently beyond what Apple can (read: wants to) provide. Which is more sad and pathetic: Apple's behavior, or the Apple apologists claiming this situation isn't "professional" because Apple said so? I'm not quite sure.
Again, as a lot of folks (including me) have pointed out, Apple's been aiming the Mac Pro at large-scale corporate buyers since 2019. IT at a big company doesn't do custom builds like that unless there's a very, very specific reason that can't be satisfied by a prebuilt solution. A uni lab, or even a small research group within a company, isn't the target market here.

You have 2 workstations; the 14-person team I'm on has at least 5 desktop workstations for those who need them, MacBook Pros or Dell XPS laptops for everyone, a million or so dollars of racked hardware, and millions in cloud infra spend. If a designer on an adjacent team asks for a Mac Pro, it's a decent bet it'll be purchased and provisioned no questions asked beyond standard approvals; but if they ask for a custom machine that IT has to build from parts, it would be a nightmare to get.
 
  • Like
Reactions: ddhhddhh2

avro707

macrumors 68020
Dec 13, 2010
2,263
1,654
Again, as a lot of folks (including me) have pointed out, Apple's been aiming the Mac Pro at large-scale corporate buyers since 2019. IT at a big company doesn't do custom builds like that unless there's a very, very specific reason that can't be satisfied by a prebuilt solution. A uni lab, or even a small research group within a company, isn't the target market here.

You have 2 workstations; the 14-person team I'm on has at least 5 desktop workstations for those who need them, MacBook Pros or Dell XPS laptops for everyone, a million or so dollars of racked hardware, and millions in cloud infra spend. If a designer on an adjacent team asks for a Mac Pro, it's a decent bet it'll be purchased and provisioned no questions asked beyond standard approvals; but if they ask for a custom machine that IT has to build from parts, it would be a nightmare to get.
My lot has had custom builds before; our organisation is enormous and our budgets are many times what you've said.

It's no hassle to buy the custom builds from the suppliers, and the suppliers support the hardware - while inside, it just gets a version of the standard operating environment.

We used to have Mac Pros and iMac Pros (there are still a few left), but they are probably all on the way out, to be replaced with PC workstations. Macs here are largely restricted to laptops.
 
  • Like
Reactions: singhs.apps

seek3r

macrumors 68030
Aug 16, 2010
2,561
3,772
My lot has had custom builds before; our organisation is enormous and our budgets are many times what you've said.

It's no hassle to buy the custom builds from the suppliers, and the suppliers support the hardware - while inside, it just gets a version of the standard operating environment.

We used to have Mac Pros and iMac Pros (there are still a few left), but they are probably all on the way out, to be replaced with PC workstations. Macs here are largely restricted to laptops.
The budget I mentioned is just for my 14-person group, not the larger company.

Also, I guess you don't have to deal with a regulatory environment; that complicates the kind of builds you're talking about - a lot.

As far as scale goes, I've worked in places with close to half a million employees; "enormous" is relative.
 

ddhhddhh2

macrumors regular
Jun 2, 2021
242
374
Taipei
Okay, so the M2 MP release still has a bunch of folks seriously ticked off. How to put this... I'm still crazy nostalgic for the PPC G4 "workstation". To be exact, when the B&W G3 dropped, it was like, bam! I genuinely thought Apple knocked it outta the park.

Now as time's moved on, we're in the Silicon era. Even though I miss rolling up my sleeves and getting my hands dirty with those "heavy-duty gadgets", I gotta face the facts. Silicon is gonna flip our old understanding on its head.

Sure, right now it seems a bit half-baked, but it probably meets over 80% of users' needs (just guessing here, it could actually be way more). So what about the rest of the users? If they still wanna roll with Apple, all they can do for now is sit tight. There's no other way around it. But I gotta say, I'm kinda hopeful about where Apple is headed.

But there's this one thing that's always had me scratching my head. I think it's about "why things are the way they are". Like, why does Apple have the market share it does in the workstation market? It's kinda like the whole chicken or the egg deal. If Apple had made a "real workstation" back in the day that catered to pro users, would they have a bigger slice of the pie? And if they did, then wouldn't this idea that "only a tiny fraction of users actually need this" just not exist?

But here's the thing. Right now, among Mac users, there's only a sad little percentage who are after "heavy-duty gear". So it's pretty clear where Apple's gonna focus their efforts.

This is all just food for thought, not a question of right or wrong. But it's been bugging me for ages. Technically and practically speaking, I can't believe for a second that Apple can't whip up what folks want. Well, Dell and HP are peddling this stuff left and right. Sure, we know a workstation isn't just about hardware - it's a whole solution, and the needs can vary. But seriously, I'd be hard-pressed to believe that Apple can't pull it off.

So in the end, we circle back to my initial question. Is it because they didn't make it that their share dropped, or did they stop because their share was just too low? Now ain't that a curious thought.
 

impulse462

macrumors 68020
Jun 3, 2009
2,097
2,878
So in the end, we circle back to my initial question. Is it because they didn't make it that their share dropped, or did they stop because their share was just too low? Now ain't that a curious thought.
The design team started to get extremely hyped in the late 2000s and early 2010s, which led to the trashcan design, which then languished because Tim Cook and other MBA types saw this as a small market. So for 6 long years, while there were massive innovations and gains in GPU architecture/power, a large chunk of workstation users had absolutely no use for such a useless machine, and a big segment of Mac desktop workstation users migrated to Windows/Linux. And since this group of users values stability over experimental designs (including half-baked mobile-chip ideas being paraded as "powerful***" like other desktop solutions), they have no reason to switch back now that the damage is already done.

Then Apple throws their hands in the air and claims "oh, the market is small," but it's of their own doing, and it's marketing spin to cover up the fact that 1) they let their pro workstation languish because of the designers and marketers, and 2) if they allowed CUDA alongside Metal, they'd quickly realize no one wants to develop with Metal unless they're forced to, because CUDA is superior, if only because it has the most HW/SW support.

Seriously, about Metal - other than 1) being an Apple fanboy or 2) writing programs specifically for Apple devices, is there any reason to use this framework over CUDA? (This is a rhetorical question.)
 
  • Like
Reactions: ddhhddhh2

avro707

macrumors 68020
Dec 13, 2010
2,263
1,654
Then Apple throws their hands in the air and claims "oh, the market is small," but it's of their own doing, and it's marketing spin to cover up the fact that 1) they let their pro workstation languish because of the designers and marketers, and 2) if they allowed CUDA alongside Metal, they'd quickly realize no one wants to develop with Metal unless they're forced to, because CUDA is superior, if only because it has the most HW/SW support.

If Apple wanted to, they could be a powerhouse in the workstation market - if they were more open and supported different GPUs and some other standards alongside their own architecture. Something that covered all of those needs, with the beautiful build of the 2019 Mac Pro, would be great.

But they've closed it all down and locked the thing up solid, so previous customers have gone elsewhere. They can easily buy something that is mega-powerful, easily upgraded, and attractively priced.

Machines like the Lenovo PX are taking over - it's a nicely engineered machine. If I were buying now, I'd get that instead of the MP2019.
 

257Loner

macrumors 6502
Dec 3, 2022
456
635
Don't worry, OP. Once no one buys the new Mac Pro - seeing as it's 75% pricier than the Mac Studio and no faster - Apple will discontinue a product that won't sell.
 

drugdoubles

macrumors 6502
Jul 3, 2023
430
356
Don't worry, OP. Once no one buys the new Mac Pro - seeing as it's 75% pricier than the Mac Studio and no faster - Apple will discontinue a product that won't sell.

Very few would pay so much more for the PCIe slots or just to look cool; the main purpose of this thing is to make the Studio feel cheap so you buy it.
 
  • Haha
Reactions: ddhhddhh2

drugdoubles

macrumors 6502
Jul 3, 2023
430
356
The design team started to get extremely hyped in the late 2000s and early 2010s, which led to the trashcan design, which then languished because Tim Cook and other MBA types saw this as a small market. So for 6 long years, while there were massive innovations and gains in GPU architecture/power, a large chunk of workstation users had absolutely no use for such a useless machine, and a big segment of Mac desktop workstation users migrated to Windows/Linux. And since this group of users values stability over experimental designs (including half-baked mobile-chip ideas being paraded as "powerful***" like other desktop solutions), they have no reason to switch back now that the damage is already done.

Then Apple throws their hands in the air and claims "oh, the market is small," but it's of their own doing, and it's marketing spin to cover up the fact that 1) they let their pro workstation languish because of the designers and marketers, and 2) if they allowed CUDA alongside Metal, they'd quickly realize no one wants to develop with Metal unless they're forced to, because CUDA is superior, if only because it has the most HW/SW support.

Seriously, about Metal - other than 1) being an Apple fanboy or 2) writing programs specifically for Apple devices, is there any reason to use this framework over CUDA? (This is a rhetorical question.)

CUDA dominates the AI market, and a "Greedia" card is the only choice.
 

257Loner

macrumors 6502
Dec 3, 2022
456
635
Except... an M2 Extreme would have had 48 CPU cores and 384GB RAM. An M3 Extreme might up that to - what - 512GB RAM? It will still be limited by how many LPDDR packages will fit on/around the SoC dies - and there's still a topological mystery as to how the UltraFusion link will expand to 4 dies and leave space around them for RAM connections. The number of PCIe lanes won't necessarily change - 16 per die (with the first die's 16 used for the Apple SSDs and on-board peripherals), so 48 (still fewer than the 2019 MP but, of course, PCIe 4). Maybe the M3 will get PCIe 5... Or maybe M3 will be little more than the M2 designs with the minimum of changes needed to build them on 3nm technology.

Meanwhile, the competition is already offering 64 cores, 2TB of RAM, and 128 lanes of PCIe, plus established support for those pesky NVIDIA and AMD GPUs that every True Pro seems to want - but which won't happen on the Mac until Apple U-turns on their Apple-Silicon-GPU-only policy and buries the hatchet with NVIDIA.

Basically, Apple just don't have a horse in the high-end workstation CPU race, and even gluing 4 MacBook Pro CPUs together won't solve that. Pre-Apple Silicon, they could buy one from Intel. Now they'd have to design a whole new die - an expensive undertaking - that U-turned on the Apple Silicon principles of unified RAM and integrated Graphics/Media/Neural processors, just for a small and shrinking niche who need high-end GPUs and massive RAM for macOS-only workflows.

Where are they going to focus their design effort - adding more PCIe lanes for the Mac Pro niche, which will sit unused on every other Mac, or, maybe, squeezing in another core or two for the MacBook Pro market, which shifts an order of magnitude more units? Apple Silicon is a great product for everything from the iPad through the Studio (and the iGoggles, of course) - but if you want a high-end workstation, it just isn't the tool for the job.

There was clearly a modest market for a Studio Ultra with Slots, which Apple could satisfy with minimal R&D costs by recycling the 2019 case (which was probably the most expensive part of the MP to design and tool up for). The so-called "scam" of 16 PCIe 4 lanes shared between 8 slots is actually a hefty upgrade from what external TB-to-PCIe cages offer. Yeah, I think maybe they should have called it the "Studio Tower" or something rather than try to pass it off as a successor to the Mac Pro - but my surprise is that they even bothered, not that they didn't whip out a magic new Threadripper-killer as "one more thing".
Reading your insightful explanation, I began to think that if Apple did make a 4-chip “Extreme”, they’d stick it in the Mac Studio, paint it black, and call it the new “Pro”. The 2023 Mac Pro won’t last. Apple won’t keep a product that doesn’t sell.
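
For what it's worth, the quoted slot-bandwidth point checks out on a napkin. A rough sketch with nominal, per-direction numbers (real-world throughput is lower on both sides):

```python
# Nominal per-direction bandwidth: the Mac Pro's shared PCIe 4.0 x16 pool
# vs. what a Thunderbolt 3 PCIe tunnel can carry. Back-of-envelope only.
pcie4_lane = 16e9 * (128 / 130) / 8      # 16 GT/s, 128b/130b -> ~1.97 GB/s
slot_pool  = 16 * pcie4_lane             # x16 pool behind the slots
tb3_tunnel = 32e9 / 8                    # TB3 carries at most ~32 Gb/s of
                                         # PCIe data out of the 40 Gb/s link

print(f"PCIe 4.0 x16 pool: {slot_pool / 1e9:5.1f} GB/s")
print(f"TB3 PCIe tunnel:   {tb3_tunnel / 1e9:5.1f} GB/s")
print(f"advantage:         {slot_pool / tb3_tunnel:.1f}x")   # ~8x
```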
 

ori69

macrumors member
Mar 10, 2022
47
25
At first I was like "well, that's not too bad", and then I remembered that's Apple Silicon against a 5-year-ish-old CPU.

That's why, for Logic, I prefer Intel processors.

[Attached image: Logic benchmark - Intel i9-13900K vs. Mac Pro M2 Ultra]
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,449
Reading your insightful explanation, I began to think that if Apple did make a 4-chip “Extreme”, they’d stick it in the Mac Studio, paint it black, and call it the new “Pro”. The 2023 Mac Pro won’t last. Apple won’t keep a product that doesn’t sell.
You have to assume that Apple do some market research and that they've identified a market for a Mac Studio with non-GPU PCIe slots & better bandwidth than a TB3 adapter - a need which they could satisfy without major R&D. I guess that niche will continue until the A/V users migrate to dedicated Thunderbolt/USB4 modules. If they've released the 2023 MP just to avoid having to say "there's not going to be an Apple Silicon Mac Pro", then I think that's a huge mistake, because it's encouraging people to make an apples-to-oranges comparison between the "Studio Ultra Plus PCIe" and the 2019 Mac Pro.

As for the Mx Extreme - we'll never know why the rumoured M1/M2 Extreme never appeared, but apart from speculating about technical issues, it's possible that the results simply weren't worth it. The problem is that you have to go x4 (no more, no less) on everything - performance cores, efficiency cores, GPU cores, neural engine, media engine, RAM, PCIe lanes... whereas a 3D/ML workstation needs 8x the RAM and PCIe and 0x the Apple GPU cores (if it's going to use dGPUs), while a Mac Studio Extreme would be "wasting" 48 PCIe lanes, and no desktop really needs more than 2-4 efficiency cores. Throwing in more CPU and GPU cores of the same type will also bring diminishing returns for any workload that isn't optimised for extreme multi-threading.
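
The diminishing-returns point is basically Amdahl's law. A quick sketch - the 90% parallel fraction is purely illustrative, not a measured workload:

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
# fraction of the work that parallelises and n is the core multiplier.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

p = 0.90                      # assumed parallel fraction (illustrative)
for n in (1, 2, 4, 8):        # Max -> Ultra -> "Extreme"-style doublings
    print(f"{n}x cores -> {speedup(p, n):.2f}x speedup")
# 2x -> 1.82x, 4x -> 3.08x, 8x -> 4.71x: each doubling of identical
# cores buys less unless the workload is embarrassingly parallel.
```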

If you look at the NVIDIA Grace/Hopper chips - which bear some resemblance to a scaled-up version of the Apple Silicon/UltraFusion concept - one important difference is that whereas Apple build everything from a single, does-it-all SoC (the Mx Max), NVIDIA have separate CPU and GPU chips and offer a choice of CPU-CPU or CPU-GPU combos (...and, I guess, if they ever did a 4x version they'd be able to pick'n'mix CPUs and GPUs for specific applications).
 
  • Like
Reactions: 257Loner

TheDauterive

macrumors regular
Aug 8, 2013
104
188
private
Well, such an accessory for the Mac Studio has just arrived, and Apple has named it the Mac Pro.
So...you can connect the Mac Pro to the Mac Studio and then the Mac Studio...the PCI...what?

I mean, I get that you're going for clever, but if it makes no sense whatsoever, how clever is it?
 

Corefile

macrumors 6502a
Sep 24, 2022
756
1,072
I've been curious about what workflows are run on a desktop Mac these days that require more than 192GB of RAM. I'd expect that to be offloaded to a Linux server (or servers) these days. Does anyone have this workflow, or know anyone who does?
Hardware vendors generally don't know what customers are buying such hardware for. That is often a trade secret, so all the hardware vendor knows is the hardware requirements. I had a 1TB system on my desktop 15 years ago, so there are lots of applications that require it but are not disclosed - and never will be.

Any server or critical hardware without ECC memory support just doesn't make the cut. This isn't a hardware-vendor-driven mandate so much as a hard customer requirement.
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,449
Any server or critical hardware without ECC memory support just doesn't make the cut. This isn't a hardware-vendor-driven mandate so much as a hard customer requirement.
I guess what Apple has to guesstimate is what part of the "must have ECC" market doesn't also have red lines like "must have lights-out", "must have redundant PSUs", "must have x86", "must have Windows", "must have CUDA/NVIDIA" etc., whether they can offer something distinctive in those markets, and whether it is worth their while.

Apple haven't made any "proper" server or high-density computing hardware since they dropped the Xserve many years ago. So that's a big chunk of the market that they voluntarily walked away from. Why? When the Xserve was launched it was unique in that (a) it was PPC-based (back when PPC was ahead of the game), (b) it ran Mac OS X, which was certified as Unix and cost a lot less - with no per-user license fees - than commercial Unixes, Windows Server, Novell NetWare etc., and (c) it supported Mac-only protocols like AppleTalk/AFP - and open standards like SMTP/POP/IMAP for mail - which were much more usable for Mac workgroups than proprietary Windows/NetWare ones. Come 2010, Apple had gone Intel, the Xserve had become, technically, little different from any other Xeon-based server hardware, and Linux had developed into a serious, free competitor to Windows and had largely eclipsed Unix (making Mac OS's Unix certification far less relevant) - and as for Mac workgroups, AFP/AppleTalk had been deprecated, Mac Mail had added Exchange support, etc. The Xserve no longer had any real appeal outside of a dwindling pool of existing users (OK, it looked really good, but that's not a big deal for something that lives in a rack in a machine room).

Similar with printers - Apple didn't quite invent the laser printer, but they produced the first remotely affordable one - especially since it could be shared by a Mac workgroup - and had a big hand in the popularisation of PostScript and DTP in general. Then, for a long time, personal PC printers used Centronics parallel while Macs used LocalTalk/RS423. However, come the day when PCs could network printers (and Macs had jolly well better be able to access those) and Macs could plug into almost any HP/Canon/Brother etc. via USB or Ethernet, Apple dropped printers like hot garbage.

Ditto WiFi routers, etc. Their early AirPort stuff truly "just worked", but now 99% of domestic users just use whatever their ISP gives them, while commercial users install kit from whichever vendor's reps have the nicest suits - and, again, Macs had jolly well better "just work" with Cisco or Linksys. Bye-bye AirPort.

Now, Apple Silicon is pretty darn successful in Apple's money-spinning consumer/pro-sumer products, while what most "pro workstation" users want and need is a "me too" Xeon/Threadripper tower.

Apple don't do "me too".
 
  • Like
Reactions: ddhhddhh2

257Loner

macrumors 6502
Dec 3, 2022
456
635
If you look at the NVIDIA Grace/Hopper chips - which bear some resemblance to a scaled-up version of the Apple Silicon/UltraFusion concept - one important difference is that whereas Apple build everything from a single, does-it-all SoC (the Mx Max), NVIDIA have separate CPU and GPU chips and offer a choice of CPU-CPU or CPU-GPU combos (...and, I guess, if they ever did a 4x version they'd be able to pick'n'mix CPUs and GPUs for specific applications).

Now, Apple Silicon is pretty darn successful in Apple's money-spinning consumer/pro-sumer products, while what most "pro workstation" users want and need is a "me too" Xeon/Threadripper tower.

Apple don't do "me too".
It makes me wonder whether Apple's approach of scaling up their A chips into M chips, and their M chips into Pros, Maxes, and Ultras, will be competitive for very long. Intel and Nvidia aren't sitting ducks. They'll continue to innovate and sell processors that are not only fast but also power-efficient. Then Apple will be stuck with processors that are power-efficient but not as fast as the competition, unless they can find a way to make the performance of their chips more competitive.
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,449
Intel and Nvidia aren't sitting ducks.
No, but they're playing by the same laws of physics. Nvidia's ARM chips aren't competing in the same market as the Mac (I guess they may collide with Apple in areas like AR and automotive), and Intel either have to stick with the fundamentally more complex x86 ISA or go for a new ISA and throw most of their existing business under a bus.

Apple managed the switch to Apple Silicon partly thanks to (a) a lot of experience and valuable IP from their iPhone business (Intel kinda missed the boat on mobile) and (b) a flexible user base and software ecosystem with far fewer legacy drag-anchors than Windows (Windows users expect to be able to run 25-year-old binaries; Apple has been through two major, successful migrations - Classic to Mac OS X, and PPC to Intel - in that period).
 
  • Like
Reactions: ddhhddhh2

257Loner

macrumors 6502
Dec 3, 2022
456
635
As for the Mx Extreme - we'll never know why the rumoured M1/M2 Extreme never appeared, but apart from speculating about technical issues, it's possible that the results simply weren't worth it. The problem is that you have to go x4 (no more, no less) on everything - performance cores, efficiency cores, GPU cores, neural engine, media engine, RAM, PCIe lanes... whereas a 3D/ML workstation needs 8x the RAM and PCIe and 0x the Apple GPU cores (if it's going to use dGPUs), while a Mac Studio Extreme would be "wasting" 48 PCIe lanes, and no desktop really needs more than 2-4 efficiency cores. Throwing in more CPU and GPU cores of the same type will also bring diminishing returns for any workload that isn't optimised for extreme multi-threading.

If you look at the NVIDIA Grace/Hopper chips - which bear some resemblance to a scaled-up version of the Apple Silicon/UltraFusion concept - one important difference is that whereas Apple build everything from a single, does-it-all SoC (the Mx Max), NVIDIA have separate CPU and GPU chips and offer a choice of CPU-CPU or CPU-GPU combos (...and, I guess, if they ever did a 4x version they'd be able to pick'n'mix CPUs and GPUs for specific applications).

At the moment, Apple engineers their top-of-the-line Ultra chip using UltraFusion (a brand name), which is TSMC's CoWoS-S (chip-on-wafer-on-substrate with silicon interposer) technology. If Apple hits a wall scaling performance through doubling alone, they could instead opt to engineer an entirely unique chip that has the correct proportion of CPU, GPU, and NPU cores, and the correct number of other silicon features.

Would that be expensive? Absolutely. But underdog Apple from the early '00s is gone. Apple is ultra-competitive nowadays, so they'll do it if there's a business case for it.
 
  • Like
Reactions: ddhhddhh2

ChrisA

macrumors G5
Jan 5, 2006
12,919
2,172
Redondo Beach, California
Purely my personal opinion: no existing Apple Silicon computer should be defined as a workstation. One of the main reasons is the lack of ECC memory.
When memory errors were common, it was important to detect them. But when was the last time you were working and an Apple Silicon Mac had a memory error? Don't just say "it might happen" - you might also find a winning lottery ticket on the sidewalk on the 3rd Tuesday of an odd month. Could happen.

I think that with the RAM on the chip and testing at the factory, RAM errors are now so uncommon that we don't need to worry much about them. Old-style RAM was placed at the end of a long bus made from PCB traces, needed buffers at each end, and had the possibility of loose contacts and such. The really old RAM I used as a kid was in a separate rack, connected to the CPU with cables that ran under the floor. ECC was important in those days.

But even in the old days, ECC was not set up to detect triple-bit errors. Why? Because engineers have always known that you have to make a call about when an error is so unlikely that you should not waste resources worrying about it.
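
That design call is visible in how SECDED (the usual single-error-correct, double-error-detect scheme) behaves. Here's a toy sketch with an extended Hamming(8,4) code - real DRAM ECC uses wider (72,64) words, but the failure mode is the same: a triple-bit error can masquerade as a correctable single-bit error:

```python
# Toy SECDED: Hamming(7,4) plus an overall parity bit. Corrects any
# single-bit error, detects any double-bit error - but three flipped
# bits look exactly like a single-bit error and get "corrected" wrongly.
def encode(nibble):
    c = [0] * 8                      # c[1..7] = Hamming code, c[0] = parity
    c[3], c[5], c[6], c[7] = [(nibble >> i) & 1 for i in range(4)]
    c[1] = c[3] ^ c[5] ^ c[7]
    c[2] = c[3] ^ c[6] ^ c[7]
    c[4] = c[5] ^ c[6] ^ c[7]
    c[0] = c[1] ^ c[2] ^ c[3] ^ c[4] ^ c[5] ^ c[6] ^ c[7]
    return c

def decode(c):
    syndrome = 0
    for pos in range(1, 8):          # syndrome = XOR of set bit positions
        if c[pos]:
            syndrome ^= pos
    parity_ok = sum(c) % 2 == 0
    if syndrome == 0 and parity_ok:
        status = "clean"
    elif not parity_ok:              # odd number of flips: assume 1, "fix" it
        if syndrome:
            c[syndrome] ^= 1
        status = "single-bit error corrected"
    else:                            # even flips, nonzero syndrome
        status = "double-bit error detected (uncorrectable)"
    return [c[3], c[5], c[6], c[7]], status

word = encode(0b1011)
for pos in (1, 3, 6):                # flip THREE bits
    word[pos] ^= 1
print(decode(word))                  # reports "corrected" - but data is wrong
```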
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,449
Would that be expensive? Absolutely. But underdog Apple from the early '00s is gone. Apple is ultra-competitive nowadays, so they'll do it if there's a business case for it.
That's the big "if" though - Apple's main interest is in the consumer/prosumer market where Apple Silicon has an edge - and their priority is probably going to be keeping that edge. They're looking for the next iPhone, something that will spread the development costs over a gazillion sales, and that's not going to be a "me too" tower workstation. At the moment, it looks like it's the goggles, which have very different demands.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
When memory used to have errors, it was important to detect them. But when was the last time you were working and an Apple Silicon Mac had a memory error? Don't just say "it might happen" because also you might find a winning lottery ticket on the sidewalk on the 3rd Tuesday of an odd month. Could happen.
Memory errors are rare, but not that rare. As a rough ballpark figure, you can expect a bit flip every 24 hours on a laptop/desktop computer. If you are using the computer for typical consumer/media tasks, it probably doesn't matter. You may see a single incorrect pixel in an image or experience a minor glitch in video/sound. Or maybe a file appears to be corrupted (with 1-2 incorrect bits), but the corruption goes away when the memory is reused and you read the file from disk again. Or maybe an app crashes, but you probably can't tell that the crash was caused by a memory error instead of a bug in the app. Or in an extreme case, the OS itself may crash, but again you probably can't tell the difference between memory errors and driver/kernel bugs.
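
A sketch of the arithmetic behind that ballpark. The FIT rate (failures per 10^9 device-hours) below is an assumed round number chosen to land near "one flip a day" - published field studies vary by orders of magnitude - but the linear scaling with capacity holds regardless:

```python
# Expected bit flips per day for a given DRAM capacity, from an ASSUMED
# per-Mbit FIT rate (FIT = failures per 10^9 device-hours). Illustrative.
FIT_PER_MBIT = 300

def flips_per_day(capacity_gb: float) -> float:
    mbits = capacity_gb * 8 * 1024                  # GB -> Mbit
    return mbits * FIT_PER_MBIT * 24 / 1e9          # expected events / 24 h

for gb in (16, 64, 192):
    print(f"{gb:3d} GB -> ~{flips_per_day(gb):.1f} expected flips/day")
# ~0.9/day at 16 GB, ~11/day at 192 GB: whatever the true rate is,
# 12x the RAM means 12x the errors, which is why ECC matters more
# as machines get bigger.
```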

ECC memory becomes important if you care about long-term stability and getting things exactly right. That means servers, many scientific/engineering applications, and so on.
 

Seiko4169

macrumors member
Jun 18, 2012
90
53
England
That's the big "if" though - Apple's main interest is in the consumer/prosumer market where Apple Silicon has an edge - and their priority is probably going to be keeping that edge. They're looking for the next iPhone, something that will spread the development costs over a gazillion sales, and that's not going to be a "me too" tower workstation. At the moment, it looks like it's the goggles, which have very different demands.
I'd settle for an Apple foldable so I can bridge the iPhone and iPad experience with a single device. I have ZERO belief Apple will take such a step, as they see 2 devices sold over 1 as the better 'outcome'. As a consumer, though - and much like the Mac Pro crowd, disappointed but accepting that the Mac Pro is no longer for them - I moved to Samsung simply because they did cater for my needs. Apple continues to make money, and impressively so, but unless you're drinking the iPhone, iPad, and MacBook Kool-Aid, you're not their primary focus!
 

theluggage

macrumors G3
Jul 29, 2011
8,015
8,449
ECC memory becomes important if you care about long-term stability and getting things exactly right. That means servers, many scientific/engineering applications, and so on.
Apple's problem is that - however you implement it - ECC uses more RAM chips to do the same job, and the current Apple Silicon design is physically limited by how many RAM chips you can fit on the package. LPDDR can support "inline" ECC by taking some of the regular RAM to store the check bits, but that would mean less usable RAM - and the people who need ECC are often the people who need more than 192GB... Of course, the more RAM you use, the more important ECC becomes.
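
To put a rough number on the inline-ECC cost - assuming a classic SECDED (72,64) layout, i.e. 8 check bits per 64 data bits, carved out of regular capacity (actual LPDDR inline-ECC layouts may differ):

```python
# Usable capacity if SECDED check bits are stored in regular RAM:
# 8 check bits per 64 data bits means 1/9 of raw capacity is lost.
raw_gb = 192
usable_gb = raw_gb * 64 / 72
print(f"{raw_gb} GB raw -> ~{usable_gb:.0f} GB usable "
      f"({raw_gb - usable_gb:.0f} GB spent on check bits)")
# ~171 GB usable: the overhead comes straight out of the headline
# capacity that big-memory users already find too small.
```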

It would be interesting to know what the error rate is for the type of on-package LPDDR RAM that Apple is using (which differs significantly from conventional, external DDR DIMMs) and whether it is significant with "only" 192GB or less.
 