
Codpeace
macrumors regular · Original poster · May 13, 2011 · NYC
Hey, so I’m an enthusiast and super excited about the new Mac Pro. I’m reading all I can about it because it’s fascinating to me how much creative folks (who are my clients; I’m a lawyer who advises artists, filmmakers, musicians, etc.) are craving this machine. I have no need for a machine like this, but I still think it’s cool and want to learn more.

One thing I’d love some insight on: how would someone make good use of the vast amount of RAM this machine can accommodate, up to 1.5TB? What possibilities does it open up? How do you “fill” it productively?

I can imagine things like multiple virtual drives for scratch work, or virtual machines running various software on multiple OS platforms, but there must be more exotic yet efficient ways to use those resources than just speed-scaling, right? I’m particularly interested in how these capabilities might be used by creative professionals (as opposed to, say, bankers or bitcoin miners).

I get how multiple cores and greater CPU and GPU capacity can be useful (though comments on that would be really welcome too, so I can learn), but how increased processor power integrates with massive amounts of RAM is beyond my current understanding.

Thanks for anything you’re willing to share!
 

arock
macrumors member · Apr 29, 2005
My use case is audio production. Sample-based virtual instruments can be fully loaded into RAM for lowest latency playback. For example, you send MIDI note data to a virtual violin and it plays the correct sample for that note. These libraries can be massive since each note could be one of many different samples - legato, staccato, pizzicato, etc. My orchestral sampled instruments have over 1TB of samples on disk. If I can keep the most commonly used 150-200 GB in memory, that is a huge win for realtime performance/playback.
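To make that concrete, here's a very rough sketch of the idea in Python (the directory layout, file names, and preload list are all made up; real samplers like Kontakt handle this internally): keep the most commonly used samples resident in RAM, keyed by note and articulation, and only touch the disk on a cache miss.

    # Toy sketch of a RAM-resident sample cache for a virtual instrument.
    # Paths and naming below are hypothetical.
    import os

    class SampleCache:
        def __init__(self, sample_dir, preload_keys):
            self.sample_dir = sample_dir
            self.ram = {}  # (midi_note, articulation) -> raw audio bytes
            for key in preload_keys:
                self.ram[key] = self._read_from_disk(key)

        def _read_from_disk(self, key):
            note, articulation = key
            path = os.path.join(self.sample_dir, f"{articulation}_{note}.wav")
            with open(path, "rb") as f:
                return f.read()

        def get(self, midi_note, articulation):
            key = (midi_note, articulation)
            if key in self.ram:                   # served from memory: microseconds
                return self.ram[key]
            return self._read_from_disk(key)      # cache miss: milliseconds from disk

    # e.g. preload every legato sample across a violin patch's playable range:
    # cache = SampleCache("/Volumes/Samples/violin", [(n, "legato") for n in range(55, 104)])
    # audio = cache.get(60, "legato")  # middle C, already sitting in RAM

With 1.5TB available, the "preload" set can simply be the entire library, which is the whole appeal.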
 

bsbeamer
macrumors 601 · Sep 19, 2012
Creating a RAM disk was fairly common several years ago, but with the speeds achievable from RAID'd NVMe, the "need" for 1TB+ of RAM for that use case has trailed off a lot.
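For anyone curious, a RAM disk on macOS still only takes a couple of commands. Here's a sketch (in Python, shelling out to hdiutil and diskutil) that creates a 2GB scratch volume; the size and the volume name are arbitrary:

    # Sketch: create and mount a 2 GB RAM disk on macOS as scratch space.
    # ram:// sizes are given in 512-byte sectors.
    import subprocess

    sectors = 2 * 1024**3 // 512
    device = subprocess.run(
        ["hdiutil", "attach", "-nomount", f"ram://{sectors}"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    subprocess.run(["diskutil", "erasevolume", "HFS+", "ScratchRAM", device], check=True)
    # The volume now appears at /Volumes/ScratchRAM. Contents vanish on detach/reboot:
    # subprocess.run(["hdiutil", "detach", device], check=True)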

Servers running multiple VMs can also utilize tremendous amounts of RAM.

RAM previews are common in After Effects. I've never personally been on a machine with more than 384GB, and can only imagine how much more it could hold in the cache, allowing faster previews, faster workflows, and faster renders (if working 1:1).
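Some rough arithmetic (assumed numbers: 8-bit UHD frames; 16- or 32-bpc projects multiply this by 2x or 4x) shows why previews chew through memory so quickly:

    # Back-of-envelope: how fast uncompressed frames fill a RAM preview cache.
    width, height, bytes_per_pixel, fps = 3840, 2160, 4, 24  # UHD, 8-bit RGBA, 24 fps

    frame_bytes = width * height * bytes_per_pixel       # ~33.2 MB per frame
    gb_per_minute = frame_bytes * fps * 60 / 1e9         # ~47.8 GB per minute of preview

    print(f"{frame_bytes / 1e6:.1f} MB/frame, {gb_per_minute:.1f} GB per cached minute")

At those rates even 384GB fills up fast, and 1.5TB starts to look a lot less absurd.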

The 1.5TB RAM configuration on the MP7,1 requires the 24- or 28-core processor. If you're getting into that setup, you can also likely run multiple instances of applications (or multiple instances of a renderer) in parallel if they can't utilize every core effectively on their own. Then your calculation becomes splitting the total RAM across individual cores or pairs of cores for that purpose. Well-written applications don't require this at all.
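As a generic illustration of that kind of splitting (render_frame below is just a placeholder for whatever renderer invocation you'd actually make), something like this fans a frame range out across a fixed number of worker processes, each of which ends up with its own share of cores and memory:

    # Sketch: fan a frame range out across several worker processes when a single
    # instance of an application can't keep every core busy on its own.
    from multiprocessing import Pool

    def render_frame(frame):
        # placeholder for e.g. shelling out to a command-line renderer for one frame
        return f"frame {frame:04d} done"

    if __name__ == "__main__":
        frames = range(1, 241)   # a 10-second shot at 24 fps
        workers = 14             # e.g. half the cores of a 28-core machine
        with Pool(processes=workers) as pool:
            for result in pool.imap_unordered(render_frame, frames):
                print(result)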

A lot of the software from ILM, Maxon, Autodesk, and eventually Avid and Adobe will (hopefully) take better advantage of the machine's potential in the future. You're not purchasing a maxed-out machine today for your needs today; if you are, you're probably doing something wrong. This is more about future expandability. Let's hope the lineage continues (like MP1,1 > MP5,1) and there are frequent updates, swappable parts, etc.
 

Codpeace
macrumors regular · Original poster · May 13, 2011 · NYC
arock said:
My use case is audio production. Sample-based virtual instruments can be fully loaded into RAM for lowest latency playback. If I can keep the most commonly used 150-200 GB in memory, that is a huge win for realtime performance/playback.
Very cool, that's exactly the kind of thing I was interested in. I can imagine a virtual pipe organ, for example, that would demand massive amounts of RAM to accommodate the various manuals, ranks of pipes, etc.

This might be a technical hardware question, but is all of that memory accessible at once? (Perhaps that is why 24-core+ processors are required for the really huge memory configurations.) I would imagine not, that as with most things digital it's accessed sequentially, but presumably the latency involved is so low that humans can't perceive it.
 