
don.vito
macrumors newbie, Original poster
Aug 15, 2019
I am curious how you can justify, say, 1.5 terabytes of RAM, among so many other little qualifiers that make it seem like a bespoke computer for NASA. I don't even think 8K video requires more than 64GB of RAM.

For people more knowledgeable about this product than me: could you explain some real use cases in a variety of industries, and how this power will give them computing advantages?
 
I am not even in the industry and I can tell you how I would use it.

RAM Disk. I use a 32GB RAM disk all the time. Love it.
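For anyone who wants to try this on macOS, a RAM disk can be scripted with the stock hdiutil/diskutil tools. A minimal sketch (the 32GB size and the "RAMDisk" volume name are just examples):

[CODE=python]
# Create and mount a macOS RAM disk using the built-in hdiutil/diskutil tools.
import subprocess

SIZE_GB = 32
sectors = SIZE_GB * 1024**3 // 512   # hdiutil sizes RAM disks in 512-byte sectors

# Allocate the RAM-backed device; hdiutil prints a device node like "/dev/disk3"
dev = subprocess.check_output(
    ["hdiutil", "attach", "-nomount", f"ram://{sectors}"]
).decode().strip()

# Format and mount it as an HFS+ volume named "RAMDisk"
subprocess.run(["diskutil", "erasevolume", "HFS+", "RAMDisk", dev], check=True)
print(f"RAM disk mounted at /Volumes/RAMDisk ({dev})")
[/CODE]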
 
Compiling from a RAM disk!
Running multiple OS configurations in VMs simultaneously to catch bugs
Scientific simulations of very large datasets (weather models, etc.)
 
Fluid dynamics simulation with a non-homogeneous fluid that allows for vapor cavitation.

Multiple dynamic stress-strain analyses, including frictional heating from the dynamic stresses, and calculating the resulting temperature increase’s effect on material characteristics over multiple cycles with varying periods of time.

Basically, I can stress-test to failure without having to use $$$$ materials or spend $$$$ on machining and fabrication. It’s also days or weeks faster per test cycle. For that matter, larger parts, assemblies, and modules aren’t really even possible to test in reality.

This is where I would use it.
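To put rough numbers on why this kind of simulation eats RAM, a back-of-the-envelope sketch; the grid size and field count below are illustrative assumptions, not from any particular solver:

[CODE=python]
# Back-of-the-envelope RAM estimate for one time step of a 3D simulation.
cells = 2000**3          # a 2000 x 2000 x 2000 grid (assumption)
fields = 8               # e.g. pressure, temperature, stress components (assumption)
bytes_per_value = 8      # float64

total_gib = cells * fields * bytes_per_value / 1024**3
print(f"~{total_gib:.0f} GiB per time step")   # ~477 GiB
[/CODE]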
 
Slash: You're just showing off now with all those big words... :p

To the OP: The very high specs also provide many years of expansion headroom.
Looking back at PPC Macs: my G4 PowerBook maxed out at 1.25GB of RAM and a 32MB GPU. Specs change rapidly; 1.5TB today may seem excessive for most people, but in 5 years it may not be enough.
 
It's exactly what those $8k–$150k Win10 Autodesk Maya rendering stations, which have been in use for nearly 4 years now, are for.

As Apple told us in the TechCrunch articles, it's for bleeding-edge audio and video apps that didn't yet exist. The customer is the film and animation industry.

Logic runs about 150 tracks of VIs with plugins on each track. The 10-core iMac Pro runs 300. At the Keynote, Apple showed the 7.1 running 1,000; the update had been released the week before.

I don't know if FCPX will get a similar update or if Apple will debut new apps.

Otherwise, it's

https://www.apple.com/mac-pro/
 
You could open 10 tabs in Chrome with 1.5TB of RAM. :p

But seriously, all valid points in this thread. I'll add another one: our database research group is currently working with in-memory databases, so whatever is usually written to disk is stored in memory here. That's where lots and lots of RAM comes in handy.
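As a toy illustration of the principle, using Python's built-in sqlite3 rather than a real research-grade in-memory engine:

[CODE=python]
# An in-memory database: the entire store lives in RAM, never touching disk.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
con.executemany(
    "INSERT INTO events (payload) VALUES (?)",
    [(f"event-{i}",) for i in range(100_000)],
)
print(con.execute("SELECT COUNT(*) FROM events").fetchone()[0])  # 100000
[/CODE]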

Data science projects have been mentioned, but it depends on the use case. I'm taking delivery of a new Dell box this week for my deep learning projects. I need GPUs to run my stuff, so I got 4x NVIDIA V100 SXM2 GPUs with a total of 128GB on the GPUs. I need regular RAM for a few things (really not that much) and ended up with 384GB RAM in the box. You can see how someone who relies on RAM will require much more.
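A rough sketch of why the budgets split that way; the 1B-parameter model and the 3x training overhead are illustrative assumptions, not my actual workload:

[CODE=python]
# Rough GPU-memory budget for training a model (illustrative numbers).
params = 1_000_000_000      # a 1B-parameter model (assumption)
bytes_per_param = 4         # float32
overhead = 3                # weights + gradients + optimizer state, very roughly

gpu_gib = params * bytes_per_param * overhead / 1024**3
print(f"~{gpu_gib:.0f} GiB of GPU memory, before activations")   # ~11 GiB

# System RAM mostly buffers the input pipeline (prefetching, augmentation),
# so it scales with batch/prefetch depth rather than model size.
[/CODE]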
 
My rig? It isn't that bad. It's a rack server that goes for about $125k. The rest is up to having a good contract with the supplier. We (a university, so we do teaching and research) buy a lot of hardware all the time, so if you stick to a single manufacturer for the big boxes, they will usually offer a good contract with excellent discounts.

I'm pretty sure if you place an order for 100 top-spec Mac Pros, Apple will do the same.
 
I am curious how you can justify, say, 1.5 terabytes of RAM, among so many other little qualifiers that make it seem like a bespoke computer for NASA. I don't even think 8K video requires more than 64GB of RAM.

For people more knowledgeable about this product than me: could you explain some real use cases in a variety of industries, and how this power will give them computing advantages?

64GB of RAM is the minimum you need to edit in 6-8K, and I think the requirements also go up the more cores you have. So I'd personally start with 96GB.
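For a sense of scale, a rough sketch of the arithmetic; the 16-bit-per-channel RGBA working format is my assumption, and editors' internal formats vary:

[CODE=python]
# Size of one uncompressed 8K frame and how many fit in 64 GB
# (the 16-bit RGBA working format is an assumption; NLEs vary).
width, height = 7680, 4320
channels, bytes_per_channel = 4, 2   # RGBA, 16 bits per channel

frame_mib = width * height * channels * bytes_per_channel / 1024**2
frames_in_64gb = 64 * 1024 / frame_mib
print(f"{frame_mib:.0f} MiB/frame, ~{frames_in_64gb:.0f} frames in 64 GB")
# -> 253 MiB/frame, ~259 frames: barely 10 seconds of 8K at 24 fps
[/CODE]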

Q. Will it matter for speed if you don't have the other 6 slots populated, or is this an old thing a la the 2008 Mac Pros?
The end of any need for Vienna Ensemble Pro.

So true, but there's too much latency anyway, and it's way cheaper on power, cables, and computers to just get one big beast to handle everything!
 
RAM is the fastest form of storage an application has access to (cache is faster, but normally an application can't control that). If it has to be fast, you don't want to use a mass storage device (not even an SSD) as swap space.
So, 1.5 TB of RAM is future-proofing the Mac Pro.
 
RAM is the fastest form of storage an application has access to (cache is faster, but normally an application can't control that). If it has to be fast, you don't want to use a mass storage device (not even an SSD) as swap space.
So, 1.5 TB of RAM is future-proofing the Mac Pro.

RAM is the fastest storage, but it is also the most expensive storage. The more money tossed into RAM, the smaller the budget left for other stuff. 1TB (or more) of RAM can cost as much as the whole rest of the system.

1.5TB isn't really the best path to future-proofing if you currently have a 100GB-sized workload. If the future workload's RAM footprint grew at 25% per year for 10 years, it would still be less than 950GB. At 10% per year for 10 years, it would be less than 300GB.

Intel charges a very stiff tax on processors that can address more than 1TB. The new Mac Pro is only going to offer processors with that tax included. If the working set never crosses the 1TB limit over the service lifetime, then that extra 'tax' bought a whole lot of nothing.

There are lots of workloads that are growing at less than 25% per year. And if fastest matters, then the rest of the system will also be faster in 5-6 years. So if fastest matters, substantive parts of the system will need to be refreshed in the future anyway.

There is a much smaller set of workloads whose RAM footprint is reasonably close to crossing 1TB in the next handful of years at the current growth rate. In that context (the near, non-nebulous future), it is more a rational forecast than a "so big I'll always be under the limit" proof against the future.
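The compounding arithmetic, for anyone who wants to check it (a quick sketch; the 100GB starting point is taken from the example above):

[CODE=python]
# Compound growth of a 100 GB working set over 10 years.
start_gb = 100
years = 10
for rate in (0.25, 0.10):
    final = start_gb * (1 + rate) ** years
    print(f"{rate:.0%}/year -> {final:.0f} GB after {years} years")
# 25%/year -> 931 GB  (still under 1 TB)
# 10%/year -> 259 GB
[/CODE]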
I am not even in the industry and I can tell you how I would use it.

RAM Disk. I use a 32GB RAM disk all the time. Love it.

Even if you doubled the RAM disk to 64GB and used 5x that amount for application working space, you'd still be at 'only' 384GB of RAM. That's only 38.4% of the way to a 1TB RAM footprint.

And yet you would be paying the over-1TB RAM tax, due to Apple's choices on BTO processors (if you buy the processors from them). Apple's selection is about making more money about as much as it is about being useful for a wide variety of historical macOS workloads.
 
RAM is the fastest storage, but it is also the most expensive storage. The more money tossed into RAM, the smaller the budget left for other stuff. 1TB (or more) of RAM can cost as much as the whole rest of the system.

1.5TB isn't really the best path to future-proofing if you currently have a 100GB-sized workload. If the future workload's RAM footprint grew at 25% per year for 10 years, it would still be less than 950GB. At 10% per year for 10 years, it would be less than 300GB.

Intel charges a very stiff tax on processors that can address more than 1TB. The new Mac Pro is only going to offer processors with that tax included. If the working set never crosses the 1TB limit over the service lifetime, then that extra 'tax' bought a whole lot of nothing.

There are lots of workloads that are growing at less than 25% per year. And if fastest matters, then the rest of the system will also be faster in 5-6 years. So if fastest matters, substantive parts of the system will need to be refreshed in the future anyway.

There is a much smaller set of workloads whose RAM footprint is reasonably close to crossing 1TB in the next handful of years at the current growth rate. In that context (the near, non-nebulous future), it is more a rational forecast than a "so big I'll always be under the limit" proof against the future.
Yes, today most people will not order 1.5 TB. But it's a nice checkbox to tick for a long-term investment.
Sure, RAM is the most expensive storage, but that doesn't matter if your workload requires 1.5 TB; you'll be glad you have a machine capable of handling that much memory, and you'll gladly pay the money.
All you are saying is: nobody needs that much memory. That's where Bill Gates also went wrong.
 
It's exactly what those $8k–$150k Win10 Autodesk Maya rendering stations, which have been in use for nearly 4 years now, are for.

As Apple told us in the TechCrunch articles, it's for bleeding-edge audio and video apps that didn't yet exist. The customer is the film and animation industry.

Logic runs about 150 tracks of VIs with plugins on each track. The 10-core iMac Pro runs 300. At the Keynote, Apple showed the 7.1 running 1,000; the update had been released the week before.


4 years ago there wasn't a mainstream-priced desktop processor that had more than 4 cores. In September, there will be a 16-core one priced at less than $900. Intel will have a 10-core (and probably less than $900) solution by mid-2020. One reason the Mac Pro is going "up" in spec chasing is that where it was (during its first five iterations) is now covered by other parts of the Mac lineup.

As for Apple's 1,000 VI demo, that was mainly a demo of stuffing 5-6 Avid HDX cards into a single box. And that is somewhat more about having over 90 DSP processors in the box than about the Apple-provided stuff. Those cards happen to have relatively low bandwidth requirements, so the slot bandwidth allocation works out. 5 x $3K cards... $15K.
Apple's 1,000-track, audio-focused demo also used the Pro Display XDR, which was spectacle-worthy too.

The TechCrunch articles did not talk about non-existent apps. The 2018 one talked about directed, incremental improvements to FCPX and LogicX. The improvements didn't exist before they worked on them, but the apps did.
 
4 years ago there wasn't a mainstream-priced desktop processor that had more than 4 cores.
Depends on your definition of mainstream, but that doesn't matter because mainstream isn't the issue here.

The $150k Maya rendering station has 56 cores, 1TB RAM, and an 8TB SSD. I'm surprised that the 7.1 isn't being offered with 56 cores; Apple is using a less expensive 28-core CPU. Although it specs the same in most applications, ganging with a second CPU to create 56 cores has been disabled. This was done to force server farms to go with the more expensive unit. If performance in Autodesk Maya is the same with the 28-core 7.1, then it will be a bargain.
As for Apple's 1,000 VI demo, that was mainly a demo of stuffing 5-6 Avid HDX cards into a single box. And that is somewhat more about having over 90 DSP processors in the box than about the Apple-provided stuff. Those cards happen to have relatively low bandwidth requirements, so the slot bandwidth allocation works out. 5 x $3K cards... $15K.
Nope, it was Logic: 1,000 VIs with plugins on each track. Essentially the same setup that runs 300 tracks easily on a 10-core iMP and chokes a 12-core 6.1 at 150 tracks. I didn't see a bunch of Avid cards connected to the demo machine at WWDC.
The end of any need for Vienna Ensemble Pro.
Yeah, they don't mention VEP by name, but they might as well have. Plogue Bidule, too. Good riddance.
 
This reminds me of the Power Macintosh 9500 with its 12 RAM slots; it could be maxed out at 1.5GB of RAM at a time when most computers shipped with around 4MB. That system can run 10.5 fairly well with an upgraded CPU and GPU, while most other systems of that generation can't even run OS X.

I imagine that in the far future this Mac Pro could easily run whatever OS X versions come along when 2TB of RAM is standard, just as we now look back on the days when 4MB was standard and 2GB of RAM seemed like an insane amount.
 
I'd hope they would, but could you imagine it being anywhere near what Dell/HP do?
Even ordering mid- to lower-spec Dell Precisions, the discounts from Dell were quite considerable.
The Dell discount is the same for us no matter what we order, as long as it's server gear, Precision workstations, or mobile workstations. From Apple, we're getting 20% off on computers, mobile and desktop, no matter what it is or how many we order. This is in line with the old Apple developer hardware discount. And while that is far from what Dell offers, if you order more, Apple or a reseller will probably give a bigger discount.

Before I retired from the industry and switched to "university life", I used to offer full Mac solutions for the software I developed, and if a client ordered 10 desktops and 10 MacBooks, plus some iPads, the reseller would usually offer between 30% and 40% discount, depending on configuration, displays, accessories, etc. So if you place an order for 100 top-spec Mac Pros, I'd expect a 40% to 50% discount. That is still not quite what Dell offers, but pretty good and in line with my experience from a couple of years ago.
 
I'd wager universities would need them; I know this for a fact. When I was in postgrad many years ago, we had server-grade Dell computers that took weeks to run simulations. With a humongous amount of RAM, those simulations would take drastically less time.

With PCIe 4.0, SSDs doing 7GB/s reads/writes, and 56-core Xeons slated for next year, I'm torn whether to spend the money this year or next year. Even if I wait, there's no guarantee that Apple will update the Mac Pro to the latest and greatest; the Mac Pro doesn't get yearly refreshes like the laptops and iMacs. My 6-year-old PC is due for an upgrade and I'm going back and forth between the Mac Pro and iMac Pro.
 
My 6-year-old PC is due for an upgrade and I'm going back and forth between the Mac Pro and iMac Pro.
Again, brilliant ad placement....

[attached image: ad2.jpg]
 
Then clearly, you didn't read the second one.

I have read it. Multiple times (about every time someone pops up and swears it says something it does not, I at least skim it, if not re-read it, again). I haven't, though, read "into" it what I want to be there. I read to comprehend what's been said.

It is extremely telling that all you have is some variation on an ad hominem attack (a swipe at my ability to read and comprehend) as opposed to a quote from the article. The closest thing to a handwaving reference to something non-existent and future in terms of products is:

"... But the Pro Workflow Team isn’t just there to fix current bugs. It’s also empowered to make improvements on future products, like the Mac Pro. ..."
https://techcrunch.com/2018/04/05/apples-2019-imac-pro-will-be-shaped-by-workflows/

The first part is telling because the focus of the article is on how one of this team's primary missions is getting more performance and fewer bugs out of current applications. However, "improvements on future products" is inconsistent with apps that don't exist yet. Future versions of apps/hardware don't exist yet, perhaps, but there is no skew toward "audio and video apps that didn't yet exist". The next version of LogicX doesn't exist yet, but LogicX does exist.

This group is primarily looking at Apple's stack, but they are also looking at a subset of 3rd-party stuff.

"... we find it and we go into our architecture team and our performance architects and really drill down and figure out where is the bottleneck. Is it the OS, is it in the drivers, is it in the application, is it in the silicon, and then run it to ground to get it fixed.” ...
...He stresses that it’s not just Apple’s applications that they’re testing and working to help make better. Third-party relationships on this are very important to them and the workflow team is helping to fix their problems faster too. ...
"

The outside consultants they hire to do directed projects are working on things that probably need to be completed.

"... They sit doors away from the engineering team running through real footage and mixing real tracks to figure out what’s working and what’s not. And they use a mixture of software, not just Apple’s first-party stuff. ..."


But it is a huge leap to assume all of these projects require only a Mac Pro (especially when the contracted projects probably needed to be completed by a certain point in time). The "Pro Workflow" group generally works on things that better enable the whole product line. A bug fix is going to help on any Mac. Some chokepoint in a system library is probably a chokepoint on multiple Macs.

While they do work with other folks' stuff, it is probably delusional to think that the folks selected don't significantly use at least some Apple apps in their workflow. Apple isn't going to pay someone a lot of money to primarily sit around optimizing Avid, Adobe, or someone else's stuff. If the bug/throttle point is in someone else's code (or someone else's hardware), Apple brings nothing special to the solution path in those cases. Where there is a nexus of stuff primarily owned by Apple, that's the subset where this workflow team will most likely spend most of their time. That is where they can make a difference.

For some new application spinning up where most of the bugs/changes need to be made in 3rd-party code, there is little impact that team is going to have.
 
Again, brilliant ad placement....
....

Google and other ad placements are often based on what you (the individual page reader) have browsed (and perhaps sometimes on the page content). It isn't particularly brilliant if your browser history is driving it. Nor is it particularly aligned with forum guidelines to post ads in the main content stream.
 