
blackadde

macrumors regular
Dec 11, 2019
165
242
With the way the CPU space is moving, it's difficult to imagine someone in their right mind using Cascade Lake CPUs for bleeding-edge, memory-intensive tasks in 15 (or even 5) years.
 
  • Like
Reactions: th0masp

Snow Tiger

macrumors 6502a
Dec 18, 2019
854
634
When I first laid my hands on a Cray-1 supercomputer back in the late 1970s, it had just 8 MB of memory; one million 64-bit words in Cray terms. At first we could only afford to rent 1/4 of this 8 MB while we developed additional code for the OS, and over several months we rented more and more until we accepted the Cray-1 with all of its 8 MB of memory. It cost our company in the region of 12 million dollars.

Today, with clusters and their distributed memory, the amount of memory available for tough, complex problems is simply enormous, and it's a lot less expensive.

"Alex , I'll take worst possible investments for the year 2020 , for 12 million , please ."

Alex , reading from card : "This thing is rusting away in the corner of your server room ."

Beep beep beep .

"Anyone ?"
 

throAU

macrumors G3
Feb 13, 2012
9,262
7,427
Perth, Western Australia
I can't think of any work that uses 1.5TB of RAM, especially with macOS. Can anyone explain which software and what kinds of work would need that much RAM, up to 1.5TB, with macOS?

Big data analysis

Not generally a typical Mac Pro workload, but hey, the CPU supports it (and the CPU is, or may be, used for exactly that in other environments), so they may as well add the slots for it.
One thing I wish is that there were some RAM disk tools.

But SSDs cannot handle super-fast random accesses like RAM can. Sequential throughput is great, but lots of random reads/writes are significantly slower than RAM.

So if you have a lot of random accesses, like for database work, having a giant RAM disk for that, with some kind of write-out caching to a fast SSD, would be pretty great.

It would be cool to have like a UPS battery backup for the motherboard itself too.

This is what OS caching is for, and it has existed for decades.

RAM disks suck, vs. a modern, intelligent cache. Why? Because before they work, you need to wait for the entire amount to be copied into RAM. You wanna wait for 1+ TB to read into memory off your SSD before you start work? Really?

And after you're done, you need to write any changes to disk before the power goes out. All you're doing is manually using RAM as cache with less intelligence/granularity and more complexity for the end user than a proper cache.

This is why we have disk caches. Or, if SSD isn't fast enough on initial read - optane. Most proper databases also have their own configurable internal RAM cache size anyway (which they can use more intelligently than copying a bunch of potentially "cold" data into RAM via a RAM disk).
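To make that last point concrete (SQLite is just a convenient stdlib example here, not something from the thread): most databases let you size their internal cache directly, which gets you the "hot data in RAM" effect without staging anything onto a RAM disk. A minimal sketch, assuming a hypothetical database file:

```python
# Minimal sketch: size SQLite's internal page cache instead of using a RAM disk.
# The path and the 8 GiB figure are arbitrary choices for illustration.
import sqlite3

conn = sqlite3.connect("/path/to/big.db")      # hypothetical database file
conn.execute("PRAGMA cache_size = -8388608")   # negative value = size in KiB, so ~8 GiB of page cache
print(conn.execute("SELECT count(*) FROM sqlite_master").fetchone())
conn.close()
```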


However, if you're still set on the RAM disk idea, there's a company that does RAM SANs. But they're actually properly done, not some dinky OS ramdisk setup. Battery backup, etc. $$ tho
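And if you want to watch the OS cache do that job for you, here's a rough stdlib micro-benchmark (Unix-only since it uses os.pread; the path, file size, and read count are arbitrary). It times two passes of random 4 KiB reads over a scratch file; the repeat pass is typically served almost entirely from the page cache, no RAM disk involved:

```python
# Rough illustration of the OS page cache: random 4 KiB reads over a scratch file.
# Pass 2 (and, on a freshly written file, often pass 1 too) is served from RAM
# by the cache automatically.
import os, random, time

PATH = "/tmp/pagecache_demo.bin"   # arbitrary demo location
SIZE = 256 * 1024 * 1024           # 256 MiB test file
BLOCK = 4096

if not os.path.exists(PATH) or os.path.getsize(PATH) != SIZE:
    with open(PATH, "wb") as f:
        for _ in range(SIZE // (4 * 1024 * 1024)):
            f.write(os.urandom(4 * 1024 * 1024))

def random_read_pass(reads=20000):
    fd = os.open(PATH, os.O_RDONLY)
    try:
        start = time.perf_counter()
        for _ in range(reads):
            os.pread(fd, BLOCK, random.randrange(0, SIZE - BLOCK))
        return time.perf_counter() - start
    finally:
        os.close(fd)

print(f"pass 1: {random_read_pass():.3f} s")
print(f"pass 2: {random_read_pass():.3f} s  (mostly page-cache hits)")
```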
 
  • Like
Reactions: ZombiePhysicist

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,677
The Peninsula
I think the 1.5 GB capacity was designed into the system to future proof it for the next 15 years.
LOL

A Z8 has 24 DIMM slots and supports 3 TiB of RAM TODAY!

A DL580 has 48 DIMM slots and 6 TiB of RAM TODAY!

In fifteen years "1.5 GiB" [sic] of RAM will be as ridiculous as "nobody needs more than 640 KiB". Probably in three years.

For a snapshot from 14 years ago - 1 GiB RAM was a big thing!

[attached image: 2006.jpg]
 
  • Like
Reactions: bxs

th0masp

macrumors 6502a
Mar 16, 2015
851
517
The way this and the original post are written seems to suggest doubt that Macs have applications outside video/audio that benefit from having lots of RAM, even with the examples from users above. Apple's website actually advertises Mac Pro performance for Matlab, Mathematica, and development build times in addition to video/audio/photo. This shouldn't be surprising, since Macs have long been popular in many scientific and technical fields; once Apple switched to a Unix-based OS that also gained mainstream application support, they became a no-brainer. The latest Mac Pro just means there's now less of a need to use additional machines (cloud or local) for some high-memory jobs as well.

I wasn't doubting that there are applications outside of those commonly associated fields - just doubting that, for those, there's any real need for that kind of memory, even for high-end work, for the next five years at least. By which point the thing will be outdated and made fun of.
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
.....

RAM disks suck, vs. a modern, intelligent cache. Why? Because before they work, you need to wait for the entire amount to be copied into RAM. You wanna wait for 1+ TB to read into memory off your SSD before you start work? Really?

And after you're done, you need to write any changes to disk before the power goes out. All you're doing is manually using RAM as cache with less intelligence/granularity and more complexity for the end user than a proper cache.

This is why we have disk caches. Or, if SSD isn't fast enough on initial read - optane.

If the RAM disk is the scratch disk, then you don't need to wait for a full load of data before you start. A scratch disk filled with intermediate results is always going to have an initial copy overhead, so a RAM disk doesn't lose much there.

Likewise at the end: it is just intermediate data. It is either going into the trash can or tagged as 'final' and copied out anyway. Same difference.

A RAM disk with APFS allows snapshotting. So even on a 2-5 hour run you could allocate some threads to copy a snapshot out to more nonvolatile storage. [A snapshot coordinated by the app would likely capture something highly coherent in that copy. You're not necessarily going to end up with something coherent even on nonvolatile storage if you don't put the scratch intermediate data into a coherent state.]
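A hypothetical sketch of that "allocate some threads to copy a snapshot" idea (the paths, interval, and locking scheme are all assumptions, and it uses a plain directory copy rather than an actual APFS snapshot): a background thread periodically quiesces the writers and copies a coherent view of the RAM-disk scratch area out to persistent storage.

```python
# Illustrative only: periodically checkpoint a RAM-backed scratch directory
# to persistent storage while the main computation keeps running.
import os, shutil, threading, time

SCRATCH_DIR = "/Volumes/RAMScratch/job"        # assumed RAM-disk scratch area
CHECKPOINT_BASE = "/Volumes/FastSSD/job_ckpt"  # assumed persistent destination
INTERVAL_S = 900                               # checkpoint every 15 minutes

scratch_lock = threading.Lock()  # writers hold this while mutating scratch data

def checkpoint_loop(stop: threading.Event) -> None:
    while not stop.wait(INTERVAL_S):
        dest = f"{CHECKPOINT_BASE}.{time.strftime('%H%M%S')}"
        with scratch_lock:                     # quiesce writers so the copy is coherent
            shutil.copytree(SCRATCH_DIR, dest)
        tmp = CHECKPOINT_BASE + ".latest.tmp"  # then atomically mark the newest good copy
        os.symlink(dest, tmp)
        os.replace(tmp, CHECKPOINT_BASE + ".latest")

stop_flag = threading.Event()
threading.Thread(target=checkpoint_loop, args=(stop_flag,), daemon=True).start()
# ... long-running computation writes to SCRATCH_DIR under scratch_lock ...
```

Whether it's a plain copy like this or a real filesystem snapshot, the point from the post stands: the app has to bring the scratch data into a coherent state before the copy means anything.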

If the app is smarter about caching, then yes, giving more RAM to a task-specific allocator is probably better. But OS disk caches are also a "punt" to a more general mechanism, and yet they're still present in every modern OS. (Some apps can't handle > 1TB any better than they can handle > 10-12 cores.)


There are still some narrower spaces where a host-memory RAM disk is useful. Optane drives are more cost effective in a larger group of cases than they were several years back. They do have lower capacity limits than other storage media, though (an order of magnitude better than RAM, but still well below what other drives offer).


All that said, even for disk caches you need something in the 10% range of the data set to be highly effective across a wide variety of workloads. So 1TB / 0.10 --> data sets over 10TB would be enough to warrant it. That isn't going to be a common active, concurrent data set for most of the workflows the Mac Pro is likely to be put into, but there are some that are that large.
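Spelling out that arithmetic (the 10% figure is the rule of thumb from the post, not a hard law):

```python
# ~10% rule of thumb: a cache wants to hold roughly a tenth of the active data.
cache_ram_tb = 1.0        # RAM devoted to caching
coverage = 0.10           # assumed effectiveness threshold
print(cache_ram_tb / coverage)  # -> 10.0 TB of active data before 1 TB of cache RAM is warranted
```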
 
  • Like
Reactions: ZombiePhysicist

mward333

macrumors 6502a
Jan 24, 2004
574
33
I can't think of any work that uses 1.5TB of RAM, especially with macOS. Can anyone explain which software and what kinds of work would need that much RAM, up to 1.5TB, with macOS?

I'm running my Mac Pro full throttle right now. I have 28 processes running day and night. Having 1.5 TB of RAM only translates to a little more than 50 GB per process. I'm running a mathematical research project. (I'm a professor.) The RAM in my program fluctuates according to the portion of the space that I am investigating. The processes can routinely use this much RAM each; they are all running independently. Having 1.5 TB of RAM is definitely not overkill.

This isn't simulation work either: it is all mathematical modeling. Indeed, we routinely run thousands of jobs like this on our computational clusters on campus. (I just finished another load of 3000 independent jobs on our clusters this morning.) The nodes on our clusters frequently have large amounts of RAM as well. Right now, I'm investigating a mathematical structure with almost 1 trillion attributes; each of these attributes corresponds to a long sequence itself, which needs to be calculated.

For me, the Mac Pro is usually a testing environment for setting up large jobs to be run on our computational clusters, and/or an environment in which I can be sure to have a large, dedicated amount of RAM, to run processes that do not need to adhere to the 4-hour time window that is imposed on our computational clusters. I hope that is somewhat insightful.
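For anyone curious about the arithmetic, here is a toy sketch of that kind of setup (the worker body and sizes below are made-up placeholders, not the actual research code): 28 independent processes splitting 1.5 TB leaves roughly 54 GB each.

```python
# Illustrative only: N independent workers, each with its own private working set.
from multiprocessing import Pool

TOTAL_RAM_GB = 1536   # 1.5 TB
WORKERS = 28

def explore(region_id: int) -> int:
    # placeholder workload: each process builds a large private in-memory structure
    working_set = bytearray(256 * 1024 * 1024)   # 256 MiB here; tens of GB in real runs
    return region_id % 7  # stand-in for a computed result

if __name__ == "__main__":
    print(f"~{TOTAL_RAM_GB / WORKERS:.1f} GB of RAM available per process")
    with Pool(WORKERS) as pool:
        print(pool.map(explore, range(WORKERS)))
```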
 