
Appletoni

Suspended
Mar 26, 2021
443
177
Good point, I'd love to hear what the business case would be for all that RAM in the first place. 64 GB is a lot, and 128 GB is twice a lot. But a TB? I suppose you could load your entire big freakin' database and avoid SSD access. I suppose you could serve hundreds of simultaneous queries on a web server (limited by physical cores), again to avoid SSD access.

Seriously, anyone advocating for this: 1) can you spell out the business case, and 2) define how big the market is?

Many thanks
I know thousands of people who can use and need 128, 256, or 512 GB of RAM, or even 1 or 2 TB.
Me too.
That’s nothing special.
 

k1121j

Suspended
Mar 28, 2009
1,729
2,767
New Hampshire
Raptor Lake will double the E-cores, adding even more multithreaded performance for the top i9 Raptor Lake 13900K. Intel isn't standing pat like it has for the past 5 years, and Raptor Cove will add IPC for the big cores as well. AMD isn't standing pat either; Zen 4 ought to be a beast.

ARM vs x86!! x86 has more mileage left in the tank! And even if Apple Silicon comes close to or beats top x86, x86 is still more open than Apple Silicon, and gamers will choose x86 over Apple Silicon any day of the week. Macs, while awesome machines, only command 7-10% of the market. If Apple wants to kill x86, it will need to supply the Lenovos, Dells, and HPs of the world with Arm (Apple Silicon) chips... such a move would be a serious blow to x86. But until then, x86 will remain dominant (in terms of market share).
Not sure that makes any sense. You sound like the people who said the same things when the iPhone came out.
 
  • Like
Reactions: JMacHack

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,126
2,706
Out of curiosity, what kind of workflows need 1TB or even 2TB of RAM?
Can be anything; scientific workflows are the frontrunner here, but it can be databases or photo and video work. Sure, not the average wedding photographer, but someone doing hobby astrophotography can easily crack that with image stacking. It all depends on how much data you throw at it. PixInsight, for example, recommends anywhere from 64 GB to 1 TB as a minimum, depending on your data. You can easily fill up more.

I know of one example where a test was done with a ton of astro data, keeping around 60,000 cores busy with 500 TB of RAM. That was on a cluster, of course.
Even if it’s DDR5, it can still be unified memory. It’s just that the number of slots would have to be huge.
Sure, you can build it, it's just not very practical anymore and slower than what Apple has now. We'll see where the MP goes in the future. They'll find some solution for their target market, which I still think is photo/video/music work.
 
  • Like
Reactions: Argoduck

DHf1

macrumors newbie
Oct 25, 2021
2
2

He makes a very compelling case for Apple Silicon chips in the iMac Pro & Mac Pro.

I still feel bad for 2019 Mac Pro owners. That desktop should have debuted in 2017 instead of the iMac Pro, so that owners would have enjoyed over 5 years of use before being phased out.
Got to love all the speculation here! No one had a clue about the M1 Max!
 

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
Sure, you can build it, it's just not very practical anymore and slower than what Apple has now. We'll see where the MP goes in the future. They'll find some solution for their target market, which I still think is photo/video/music work.

I agree that it wouldn't be very practical (and I don't actually think they'll do this), but theoretically you could add enough slots to match the bandwidth the on-package memory provides - though that adds to the impracticality. Read/write latency wouldn't be hurt either.

It’ll be fascinating to see what they come up with.
 

mi7chy

macrumors G4
Oct 24, 2014
10,625
11,296
Got to love all the speculation here! No one had a clue about the M1 Max!

Nothing unexpected with the M1 Max. It was the M1 scaled out with more cores and a higher TDP, as expected. I'm eyeing the M1 Max, but gut instinct tells me to wait for next year's model with the 3nm node shrink. Same reason I skipped Intel's 11th gen, and will probably skip 12th gen too.
 

vadimyuryev

macrumors member
Oct 3, 2017
65
209
Don’t put words in my mouth. I do not have a negative stance against monetization.

Their video is based on rumors and assumptions, not facts or an Apple Silicon Mac Pro production release. What other motivation outside money-making drives the creation of such content?

Do you want an honest public forum discussion? Do it without incentives.
Vadim from Max Tech here...
Yes, our job is to make videos that appeal to our audience and cover what we think people are wondering about.
Does that make us biased? No.
This video was over 16 minutes long and took two days to make because I wanted to share all of my thoughts and speculations on how Apple could pull off the Mac Pro, based on what they did with the M1 Max MacBook Pros.

I could've easily made the video just over 8 minutes long to enable mid-roll ads and maximize profit versus the time invested.
I made it over 16 because I’m passionate about sharing my thoughts on the Mac Pro.
Am I always right? No.

But I was the only one on YouTube consistently telling people, time after time, that this new MacBook Pro would be the best laptop ever made when you consider the entire package and resale value. Not many believed me.
I also said that it would bring AAA gaming to the Mac. Are all games supported? No, not yet at least. But you can play higher-end AAA games now, even with Vulkan to Metal translation, as well as x86 to ARM64 Rosetta 2 translation.

The point is that I’m not sitting here making up lies to get views. The points that I make are reasonable, with explanations behind them. Will they all be correct? Of course not, but it gets people thinking. That’s the point.
Thoughts?
 

Adarna

Suspended
Original poster
Jan 1, 2015
685
429
The guys complained for months about their channel being demonetized. If monetization weren't their priority, they'd have been quiet about it.

I'm not saying monetization is wrong, nor do I hold that against them, but it's essential to know these things in order to make informed decisions. If a channel's sole purpose is profit, its opinions are most certainly biased and skewed toward that.
Like any and every reviewer since print
 

Apple Knowledge Navigator

macrumors 68040
Mar 28, 2010
3,693
12,921
Vadim from Max Tech here...
Yes, our job is to make videos that appeal to our audience and cover what we think people are wondering about.
Does that make us biased? No.
This video was over 16 minutes long and took two days to make because I wanted to share all of my thoughts and speculations on how Apple could pull off the Mac Pro, based on what they did with the M1 Max MacBook Pros.

I could've easily made the video just over 8 minutes long to enable mid-roll ads and maximize profit versus the time invested.
I made it over 16 because I’m passionate about sharing my thoughts on the Mac Pro.
Am I always right? No.

But I was the only one on YouTube consistently telling people, time after time, that this new MacBook Pro would be the best laptop ever made when you consider the entire package and resale value. Not many believed me.
I also said that it would bring AAA gaming to the Mac. Are all games supported? No, not yet at least. But you can play higher-end AAA games now, even with Vulkan to Metal translation, as well as x86 to ARM64 Rosetta 2 translation.

The point is that I’m not sitting here making up lies to get views. The points that I make are reasonable, with explanations behind them. Will they all be correct? Of course not, but it gets people thinking. That’s the point.
Thoughts?
Did anyone else read this in Vadim’s voice? Especially the “points”
 

throAU

macrumors G3
Feb 13, 2012
9,204
7,356
Perth, Western Australia
It doesn't defeat that unless the GPU needs access to that memory.
And that is massively likely for the workloads this machine would target.

And it's not just the GPU that would need access. The ML cores, the video transcode blocks, etc.

And if it isn't in that first 256 GB, guess what? You need to copy it into that region, and possibly evict data out of that region first. Even worse than if you simply had dedicated VRAM on a discrete card.

And you’ve created an arbitrary limit and incurred a design (and therefore software/library) quirk that doesn’t exist in the rest of the lineup.
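
A toy cost model of that quirk (made-up constants in Python, nothing to do with how macOS actually manages memory):

```python
GPU_VISIBLE_GB = 256  # hypothetical accelerator-visible window

def access_cost(buffer_start_gb, size_gb=16, copy_cost=1.0, evict_cost=1.0):
    """Relative cost for an accelerator to touch a buffer in this toy model."""
    if buffer_start_gb + size_gb <= GPU_VISIBLE_GB:
        return 0.0  # inside the unified window: zero-copy access
    # outside the window: make room first, then copy the data in
    return size_gb * (evict_cost + copy_cost)

print(access_cost(100))  # 0.0  -> unified access, no copy
print(access_cost(400))  # 32.0 -> evict + copy, the quirk described above
```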

Nah. I really suspect they will make the memory unified from a logical perspective and locally unified in each SOC.

Expandability? Add more, or swap out the individual SoC units.

But Apple has never liked post-sale upgrades anyway. Their high-end customer base tends to buy the spec they need and replace it when it's financially written off.

Unlike personal purchases, field upgrades are something you try to avoid as a business customer, unless you really screwed up the original purchase: justification to spend and project budget adjustments, the purchase-order process, waiting with lesser performance until the upgrade arrives, downtime while you upgrade the machines, etc. All of those things have significant costs, which means it's often better to just go large on the original purchase.

It's also more easily tracked by accounting for tax-deduction/asset-tracking purposes if you just buy the appropriate spec up front and write off the entire asset at the end of the unit's life.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
128 / 64 = 2 is a fact. It isn't rationally subject to "alternative facts" or "in my opinion".
It might be time to go out and rake some leaves, do some chores, or take some other break because if the basic principles of math have to change to make your argument... you are off in the weeds. Way off.
I was talking about the difference between utility and an almost meaningless technical measurement. 128 GB is twice as much as 64 GB in the latter sense, but it's very unlikely that 128 GB is enough for something that can't be done in 64 GB.

The difference between 36 units and 37 units of RAM gives a much better idea of the actual impact of the upgrade. If a task needs 12 units of RAM, you won't see any difference. If it requires 26 units, 33 units, or 42 units, you won't see the difference either. There is only a narrow window of tasks in the neighborhood of 36-37 units of RAM that benefit from 128 GB over 64 GB.

In a similar way, $10 million is a lot of money, and $20 million is technically twice as much. However, $20 million does not give you twice as many options as $10 million, because the utility of money is rarely linear. Most of the options that are out of your reach with $10 million are also out of reach with $20 million, because they are targeted at those who have $100 million, $1 billion, $10 billion, or $100 billion.
 

danwells

macrumors 6502a
Apr 4, 2015
783
617
I'm not an engineer (unlike several people on here), so I don't know how practical this is... How about a design with 256 GB of RAM on 4 M1 Max SoCs PLUS a whole bunch of DIMM slots? If nothing's in the DIMM slots, it behaves like a very large unified-memory Mac. If the slots are filled, the unified memory effectively serves as a huge, fast L4 cache. The problem is that it takes 512 GB in the slots just to break even (half is a copy of the cache, and the other half is RAM). Realistically, it doesn't become worth it until you have a terabyte in the slots (with 768 GB accessible).
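
The break-even arithmetic under that model, as a minimal sketch (speculative forum math, not an Apple design): with the 256 GB of unified memory acting as an L4 cache that duplicates part of the DIMM contents, usable RAM is roughly DIMM capacity minus 256 GB once the slots are populated.

```python
# Usable RAM in the "unified memory as L4 cache" model described above.
UNIFIED_GB = 256

def usable_ram_gb(dimm_gb):
    # empty slots: plain unified-memory Mac; populated: DIMMs minus the cache copy
    return UNIFIED_GB if dimm_gb == 0 else dimm_gb - UNIFIED_GB

for dimm in (0, 512, 1024):
    print(dimm, usable_ram_gb(dimm))
# 0 -> 256 (pure unified memory), 512 -> 256 (break even), 1024 -> 768
```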

The other possibility (that somebody mentioned) is a backplane design with cards that each contain 4 M1 Max SoCs with 256 GB of RAM. You can't go above 256 GB per card that way, but you DO get the opportunity to put four cards in a box and get a terabyte of RAM.

If Apple's really clever (and I don't know that this is electrically possible - bus speeds may make it impossible - can you run something that fast outside a case, even over a wide snap connector across a short distance?), how about a base box that contains the I/O and one or two CPU cards, but has a connector that lets it snap onto an expansion unit holding up to four more CPU cards? The M1 Max is so efficient that power's not a problem. The base unit needs a 200- or 300-watt PSU, while the expansion unit only needs 600 watts or so with a full complement of four quad-SoC cards. 1500 watts (which will still plug into a standard office outlet) buys 400 CPU cores, 1280 GPU cores and 2.5 TB of RAM with a base and two expansion units. If that's not enough, you need a real supercomputer, not something that's air-cooled and runs on a 20-amp breaker.
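
Back-of-the-envelope totals for that configuration (all speculative, assuming each card carries 4x M1 Max at 10 CPU cores, 32 GPU cores, and 64 GB of RAM per die):

```python
# Totals for the hypothetical base + two expansion units described above.
cards = 2 + 4 + 4                    # base unit (2 cards) + two expansion units (4 each)
cpu_cores = cards * 4 * 10           # 4 M1 Max dies per card, 10 CPU cores each
gpu_cores = cards * 4 * 32           # 32 GPU cores per die
ram_tb = cards * 4 * 64 / 1024       # 64 GB per die

print(cpu_cores, gpu_cores, ram_tb)  # 400 CPU cores, 1280 GPU cores, 2.5 TB
```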

If nothing about this is ridiculously expensive to engineer, the base unit with one CPU card might sell for $7,000-$10,000. Add an extra CPU card for $5,000. An expansion unit is $10,000 with one CPU card. Yes, a full-on configuration with a base, two expansion units and ten CPU cards is a little over $100,000 - but it's also a supercomputer. It should have something like 400 TFLOPS of GPU performance, which would have qualified for a spot on the Top500 supercomputer list as recently as the end of 2016.

The advantage of a design like this is that the vast majority of photographers, video editors and the like will be happy with a base unit with one CPU card. As requirements creep up, throw in a second card for 80 CPU cores, 256 GPU cores and half a terabyte of RAM. That's still a sub-$15,000 system that sits in a smallish desktop case and draws less power than its dual monitors. For the few who need huge power, toss an expansion unit or two on there.
 

ikir

macrumors 68020
Sep 26, 2007
2,176
2,366
A 4x Max Mac Pro would have some clear potential hardware advantages and disadvantages vs. a PC workstation.

The PC hardware advantages are in the upper-end configurations, and the RAM and GPU shortfalls could be addressed if Apple offered add-on RAM and GPU modules. In deciding on this, Apple will certainly look at what percentage of its current Mac Pro sales use the highest RAM and GPU configurations.

Hardware advantages, Mac Pro
  • Extraordinary efficiency
  • Quiet operation
  • Task-specific hardware acceleration, which makes those specific operations run unusually fast
  • High single-core speeds, especially during multi-core operation, when compared to high core-count Intel Xeon and AMD Threadripper chips (the latter need to have reduced clock speeds to avoid overheating, particularly when all cores are running; that's much less of an issue with AS). This would give much faster operation for multi-core apps that can only utilize a limited number of cores.
  • Unified memory gives the GPU access to unusually large amounts of RAM

Hardware advantages, PC workstation
  • Much higher maximum RAM (unless Apple offers add-on RAM modules). A 4X Max will have 256 GB; Ice Lake can have up to 2 TB. Not sure about Threadripper, but it looks like its max is 1 TB.
  • Much higher maximum GPU performance (unless Apple offers add-on GPU modules). A 4X Max should have performance about comparable to a single A6000 desktop chip. Current PC workstations can be configured with up to three of these.
  • AMD's highest-performing multicore workstation CPU, the 64-core Epyc 7763, may have nearly twice the general multicore performance of a 4X Max. According to testing by https://www.anandtech.com/show/16778/amd-epyc-milan-review-part-2/5, the average of the SPEC 2017 INT and FP aggregate scores for the Epyc 7763 is 7.4 times that of the M1 Max, suggesting it would have about 1.85x the processing power of a 4X Max (see the quick arithmetic below). OTOH, a 4X Max should about equal the fastest multicore Xeon Ice Lake processor.
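
Quick arithmetic behind that 1.85x figure, assuming multicore performance scales linearly with die count (a generous assumption for a 4-die package):

```python
# Epyc 7763 vs a hypothetical 4-die "4X Max", per the AnandTech ratio above.
epyc_vs_one_m1_max = 7.4    # SPEC 2017 INT+FP average, Epyc 7763 / M1 Max
dies_in_4x_max = 4
print(epyc_vs_one_m1_max / dies_in_4x_max)  # 1.85
```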

There are also clear software advantages and disadvantages to each, which aren't addressed here.
With unified memory and macOS, Apple Silicon destroys PCs that have more RAM. Even in RAM stress tests, Apple Silicon delivers incredible results, while Windows slows to a crawl.

 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
I was talking about the difference between utility and an almost meaningless technical measurement. 128 GB is twice as much as 64 GB in the latter sense, but it's very unlikely that 128 GB is enough for something that can't be done in 64 GB.

The difference between 36 units and 37 units of RAM gives a much better idea of the actual impact of the upgrade. If a task needs 12 units of RAM, you won't see any difference. If it requires 26 units, 33 units, or 42 units, you won't see the difference either. There is only a narrow window of tasks in the neighborhood of 36-37 units of RAM that benefit from 128 GB over 64 GB.
Every time you assert there is some arbitrary dividing line above which everyone who wants more memory needs at least a terabyte if not more, my eyes roll. Your imagination and experience are both lacking if you think that.

Every time you start talking about "units" of memory or compute power my eyes glaze over. That last paragraph I quoted is ludicrous. What does it even mean? What are these arbitrary, imaginary scales you're inventing? According to you, M1 has "37 units" of CPU, M1 Max has "38 units", and the hypothetical 4-die 40-core Mac Pro has "40 units". I want to know what color the sky is in a world where 40 cores only provide 8% more compute power than 10 cores (assuming everything but core count is equal).

Marginal utility is a useful concept, but you've twisted it into this weird belief that once a computer gets big enough for the general public, anyone who wants more must need 100 times more.

Back in the real world, the point of thinking about marginal utility is acknowledging that value depends on context. If the working set of your simulation's data is 80GB (working set ~= the set of bytes which are frequently accessed, and thus should remain resident in physical RAM for best performance), you will be very unhappy with just 64GB RAM, you'll love an upgrade to 128GB, and you'll get nothing out of 1TB. There are many real simulations with working sets in that range.
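
To make that concrete, here's a toy throughput model; the page-miss penalty is an arbitrary illustrative constant, not a measurement:

```python
# Toy model: throughput collapses once the working set exceeds physical RAM.
def relative_throughput(working_set_gb, ram_gb, miss_penalty=100.0):
    if working_set_gb <= ram_gb:
        return 1.0                          # everything stays resident
    resident = ram_gb / working_set_gb      # fraction served from RAM
    return 1.0 / (resident + (1.0 - resident) * miss_penalty)

for ram in (64, 128, 1024):
    print(ram, round(relative_throughput(80, ram), 3))
# 64 -> 0.048 (thrashing); 128 -> 1.0; 1024 -> 1.0
# For an 80 GB working set, 64 -> 128 GB is the upgrade that matters.
```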
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Do we even know how much memory an M1-based chip (or maybe M2, by the time this is released) can address? IIRC 16 GB was a hard limit for the M1, and the next tiers may only be able to address 64 GB total.

Personally, I don't believe the memory will be on DIMMs. I'd predict it will max out at 256 GB, or 512 GB if we reeealy stretch it. I feel like having off-die RAM would be too much of a departure for Apple Silicon. Prepare the pitchforks.

I think what I'm most curious about is internal expansion, which I believe will make a return. It'll likely be in the form of MPX modules, though, maybe “compatible” with PCI. The M-series already has the capability for Thunderbolt 4 to PCIe, so it might just be simple. I think people who use PCI cards for music production screamed at Apple enough to warrant the inclusion of slots.

Personal wishlist:
Socketed SoC; think Power Mac G4. Having the SoC on a daughterboard would be a compromise, but better than soldering it down.

Dedicated slots for more storage. Maybe just “dumb memory” since the controller is built into the SoC. They already sell the SSDs for the current Mac Pro on their site.

Some form of expandable memory.

The only 100% guaranteed prediction that I can make, is that the Apple Silicon Mac Pro is going to piss off a lot of people on MR.
 

jtopp

macrumors regular
Apr 27, 2010
132
104
Raptor Lake will double the E-cores, adding even more multithreaded performance for the top i9 Raptor Lake 13900K. Intel isn't standing pat like it has for the past 5 years, and Raptor Cove will add IPC for the big cores as well. AMD isn't standing pat either; Zen 4 ought to be a beast.

ARM vs x86!! x86 has more mileage left in the tank! And even if Apple Silicon comes close to or beats top x86, x86 is still more open than Apple Silicon, and gamers will choose x86 over Apple Silicon any day of the week. Macs, while awesome machines, only command 7-10% of the market. If Apple wants to kill x86, it will need to supply the Lenovos, Dells, and HPs of the world with Arm (Apple Silicon) chips... such a move would be a serious blow to x86. But until then, x86 will remain dominant (in terms of market share).
Why do people think that gamers are the target market for Apple's pro-level products? Sure, you CAN game on them, but I don't personally know anyone with a Mac who cares about running an FPS. Yeah, the target demo of MacRumors forum users is the type that might try to run a game or two, but I don't think the average customer cares that much. If you're a gamer, you want a machine that's upgradable so you can pop in different video cards, upgrade the RAM, etc., and that's just not possible unless you have a Mac Pro. And who buys a Mac Pro and a $6,000 display to run Crysis?
 
  • Like
Reactions: JMacHack

Yebubbleman

macrumors 603
May 20, 2010
6,024
2,617
Los Angeles, CA
I still feel bad for 2019 Mac Pro owners. That desktop should have debuted in 2017 instead of the iMac Pro, so that owners would have enjoyed over 5 years of use before being phased out.
I don't think you understand the target market of the Mac Pro.

Those who buy them are not looking at it from the standpoint of "is my computer the latest and greatest"; they look at it strictly as a matter of "will this tool help me do what I need to do, as quickly as I need to do it" and "how long will I have support for my hardware, Apple's software, and the third-party tools I need to run on it".

Mac Pros last WAY longer than MacBook Pros and, for the most part, longer than most iMacs as well. They're tanks. The Xeons that have gone into them are tanks. Most Intel processor technology isn't designed to last as long as your average Xeon, nor is it rated to be supported as long. That's why Mac Pros have historically had the longest support life of any Mac product. Someone who bought a 2019 Mac Pro will get 10 years of use out of it, easily. There will surely be those who try, and even succeed at, getting 20 (despite losing OS support well before that 20-year mark).

Incidentally, there's a LOT of professional-grade software that probably shouldn't be run in Rosetta 2 (despite likely being usable in it) that STILL isn't native and probably won't be for a few more years to come. Someone who bought a 2019 Mac Pro even as recently as two weeks ago (and really needed one over a 27" iMac or anything with the M1/M1 Pro/M1 Max) probably still made the right call in doing so.

Furthermore, something doesn't become useless just because something better than it comes out.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
Every time you start talking about "units" of memory or compute power my eyes glaze over. That last paragraph I quoted is ludicrous. What does it even mean? What are these arbitrary, imaginary scales you're inventing? According to you, M1 has "37 units" of CPU, M1 Max has "38 units", and the hypothetical 4-die 40-core Mac Pro has "40 units". I want to know what color the sky is in a world where 40 cores only provide 8% more compute power than 10 cores (assuming everything but core count is equal).
The units are base-2 logarithms of the relevant quantity. For example, the M1 has 4 performance cores running at 3.2 GHz. Assuming 8 instructions/cycle and some marginal contribution from the efficiency cores, we are at ~100 billion instructions per second, which is close enough to 2^37.
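
A minimal sketch checking those numbers, using the post's own assumptions (4 P-cores at 3.2 GHz, ~8 instructions/cycle, efficiency cores ignored):

```python
import math

# "Units" = log2 of the raw quantity.
ips = 4 * 3.2e9 * 8              # ~1.0e11 instructions per second
print(math.log2(ips))            # ~36.6, i.e. close to 2^37

# On the same scale, 64 GB and 128 GB land at 36 and 37 "units":
print(math.log2(64 * 2**30))     # 36.0
print(math.log2(128 * 2**30))    # 37.0
```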

Making something use 10x more CPU time or RAM is often as easy as replacing a number with a bigger number. In many cases, that increase only yields marginal benefits, for example changing the fourth decimal of precision or recall.

If the working set of your simulation's data is 80GB (working set ~= the set of bytes which are frequently accessed, and thus should remain resident in physical RAM for best performance), you will be very unhappy with just 64GB RAM, you'll love an upgrade to 128GB, and you'll get nothing out of 1TB. There are many real simulations with working sets in that range.
But what if the working set isn't 80 GB? If it's 8 GB, the RAM upgrade would be meaningless. If it's 800 GB, the upgrade to 128 GB would not help, but the upgrade to 1 TB would. If it's 8 TB, even 1 TB of RAM would not be enough.

It's very unlikely that the working set happens to be in the range where a mere 2x increase in RAM would give substantial benefits. If that happens in your application, good for you. But it's more likely that you need much bigger increases before you start seeing the benefits.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,474
California
The units are base-2 logarithms of the relevant quantity. For example, the M1 has 4 performance cores running at 3.2 GHz. Assuming 8 instructions/cycle and some marginal contribution from the efficiency cores, we are at ~100 billion instructions per second, which is close enough to 2^37.

Making something use 10x more CPU time or RAM is often as easy as replacing a number with a bigger number. In many cases, that increase only yields marginal benefits, for example changing the fourth decimal of precision or recall.


But what if the working set isn't 80 GB? If it's 8 GB, the RAM upgrade would be meaningless. If it's 800 GB, the upgrade to 128 GB would not help, but the upgrade to 1 TB would. If it's 8 TB, even 1 TB of RAM would not be enough.

It's very unlikely that the working set happens to be in the range where a mere 2x increase in RAM would give substantial benefits. If that happens in your application, good for you. But it's more likely that you need much bigger increases before you start seeing the benefits.
Back in the day when I was writing EDA tools (and running them), I always seemed to need 10% more than available physical memory to hold the chip database. It was always like that. We’d always struggle to break up the design flow to keep it just under the thrashing threshold. A “mere” doubling of RAM is something that likely helps a lot of people.
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
Do we even know how much memory an M1-based (or maybe M2 by the time this is released) can address? iirc 16 gb was a hard limit for the M1, and the next levels may only be able to address 64 gb total.
64-bit addressing can access more than 18 exabytes (2^64 bytes, a 20-digit number) of physical memory space. I don't think anyone buying a consumer system can afford that amount of RAM anytime soon. You'd probably need a power generator to keep it running.

It is up to Apple to decide how much physical memory each AS SoC can access. Every new address line added to the SoC doubles the amount of addressable physical memory. Other than Apple, nobody knows how many address lines come out of the AS SoC package, since there's no public documentation to refer to. But it's a safe bet that the AS Mac Pro will need to support at least as much RAM as the Intel Mac Pro does.
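
A quick illustration of that doubling, with arbitrary example line counts (Apple's actual pin-out isn't public):

```python
# Each extra physical address line doubles the addressable space.
GIB = 2**30
for lines in (38, 40, 48, 64):
    print(lines, 2**lines / GIB, "GiB")
# 38 -> 256 GiB, 40 -> 1024 GiB (1 TiB), 48 -> 256 TiB worth, 64 -> 16 EiB worth
```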

I would think that the AS Mac Pro will have ECC DIMM slots.
 