
Fomalhaut

macrumors 68000
Oct 6, 2020
1,993
1,724
Now more than ever, we no longer need a desktop computer with multiple graphics cards or terabytes of memory, because of virtualization and cloud-based compute resources. If you truly need absurd amounts of computation, you can just run it remotely; you don't need four graphics cards in your desktop.
Sure, you can run large compute jobs on cloud resources, but how do you get your data there and back?

The network is the limiting factor here. Try uploading and downloading terabytes of data on your typical 50-500Mbps broadband network and you'll quickly find a good reason for fast local storage and networks.
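
To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python (the 2 TB size and the link speeds are just illustrative, not from any benchmark):

Code:
def transfer_hours(size_tb, link_mbps):
    """Hours to move size_tb terabytes over a link_mbps megabit/s link (ideal, no protocol overhead)."""
    bits = size_tb * 1e12 * 8            # terabytes -> bits
    seconds = bits / (link_mbps * 1e6)   # bits / (bits per second)
    return seconds / 3600

# Example: moving 2 TB of footage over typical broadband vs. a 10 GbE local link
for label, mbps in [("50 Mbps", 50), ("500 Mbps", 500), ("10 GbE", 10_000)]:
    print(f"{label:>9}: {transfer_hours(2, mbps):6.1f} hours")
# 50 Mbps -> ~88.9 h, 500 Mbps -> ~8.9 h, 10 GbE -> ~0.4 h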
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
AWS is investing in enticing studios to move to the cloud.

Working from home seems to have changed companies' workflows.
Your post nicely illustrates a key limitation of the simplistic "just use the cloud" solution. As your link explains, AWS found that, for many key applications, in order for users to get the performance they need (in particular, for latency to be reduced to acceptable levels), the servers and clients need to be located in the same city.
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
Wouldn't it be more about the number of cores than the number of cards? The M1 Ultra is just a number of smaller M1 chips stitched together.
I don't think they're going to just increase the core counts instead of re-using M1 Max or Ultra chips.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
Sure, you can run large compute jobs on cloud resources, but how do you get your data there and back?
Why would you move your data back and forth between your machine and the cloud? Moving your data back out is not only slow but also very expensive.

The network is the limiting factor here. Try uploading and downloading terabytes of data on your typical 50-500Mbps broadband network and you'll quickly find a good reason for fast local storage and networks.
for many key applications, in order for users to get the performance they need (in particular, for latency to be reduced to acceptable levels), the servers and clients need to be located in the same city.
If you need a very fast connection, you can physically link your network to your cloud provider's; AWS offers this as a service (Direct Connect), for instance.
 

altaic

macrumors 6502a
Jan 26, 2004
711
484
Why would you move your data back and forth between your machine and the cloud? Moving your data back out is not only slow but also very expensive.
Are you proposing that people put their data in the cloud for computation and distribute the result via the cloud without ever verifying it themselves?
 
  • Like
Reactions: singhs.apps

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
Are you proposing that people put their data in the cloud for computation and distribute the result via the cloud without ever verifying it themselves?
No, you need to verify the results.

Cloud computing is more convenient when you can create and modify your data on servers, and reduce the amount of data you need to retrieve.
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
No, you need to verify the results.

Cloud computing is more convenient when you can create and modify your data on servers, and reduce the amount of data you need to retrieve.
You are totally ignoring the security issues of putting data on a server or in the cloud, which defeats the purpose of using a workstation. Also, you still need to transfer tons of files to the server, so it's the same problem.
 

innerproduct

macrumors regular
Jun 21, 2021
222
353
Let's use an actual example. Redshift is a 3D renderer used for stills and animation sequences. It is a critical component for motion graphics (there are alternatives like Octane, but that is not in good shape for Macs right now).
On my MBP 16 Max the basic benchmark renders in 10 minutes; on the M1 Ultra, in about 7 minutes (bad scaling).
On my PC with a single 3090 it renders in 2:30, and on my old MBP with a 6900 in an eGPU in about 5 minutes, IIRC.
Dual 6900s in eGPUs scale well, and inside a Mac Pro even better, so a dual 6800/6900 solution inside a Mac Pro will give you an acceptable setup that is only about half as bad as a PC for a similar investment. That is an acceptable tradeoff for using macOS, for me. But today the 4090 will be announced, and later this year the AMD 7000 series, both roughly doubling performance.
So a single-GPU PC will be rendering the benchmark in about 1:30 and a dual-GPU one in under a minute. Meanwhile, if Apple releases a Mac Pro with an M2 Extreme, we will at best be in single-3090 land, probably a lot worse. That is just not competitive. So Apple really needs to have something up their sleeve…
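
(For what it's worth, here's a quick Python sketch that turns those reported times into relative speed factors; the render times are just the ones quoted above, and the next-gen figures are my assumption that the new cards roughly double a 3090.)

Code:
# Redshift benchmark times reported above, in seconds
times = {
    "M1 Max (MBP 16)": 10 * 60,
    "M1 Ultra":         7 * 60,
    "RX 6900 eGPU":     5 * 60,
    "RTX 3090 (PC)":    2 * 60 + 30,
    "single next-gen GPU (assumed)": 90,   # assumption: ~2x a 3090
    "dual next-gen GPU (assumed)":   55,   # assumption: sub-1-minute
}

baseline = times["M1 Max (MBP 16)"]
for name, t in sorted(times.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:32s} {t / 60:5.1f} min  ({baseline / t:.1f}x the M1 Max)")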
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
Let's use an actual example. Redshift is a 3D renderer used for stills and animation sequences. It is a critical component for motion graphics (there are alternatives like Octane, but that is not in good shape for Macs right now).
On my MBP 16 Max the basic benchmark renders in 10 minutes; on the M1 Ultra, in about 7 minutes (bad scaling).
On my PC with a single 3090 it renders in 2:30, and on my old MBP with a 6900 in an eGPU in about 5 minutes, IIRC.
Dual 6900s in eGPUs scale well, and inside a Mac Pro even better, so a dual 6800/6900 solution inside a Mac Pro will give you an acceptable setup that is only about half as bad as a PC for a similar investment. That is an acceptable tradeoff for using macOS, for me. But today the 4090 will be announced, and later this year the AMD 7000 series, both roughly doubling performance.
So a single-GPU PC will be rendering the benchmark in about 1:30 and a dual-GPU one in under a minute. Meanwhile, if Apple releases a Mac Pro with an M2 Extreme, we will at best be in single-3090 land, probably a lot worse. That is just not competitive. So Apple really needs to have something up their sleeve…
Agreed, but part of this, as your post suggests, is a problem with Redshift not yet being fully optimized for AS ( https://www.reddit.com/r/RedshiftRenderer/comments/tnpjqy ). If it were, the M2 Extreme still wouldn't be competitive, but it would at least far exceed a 3090 (should be closer to a single 4080 Ti which, yes, is still pretty weak for a machine in the Mac Pro's likely price range, but it's at least better than a 3090).
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
What security issues do cloud service providers have?
The cloud itself is not safe. Even Apple has had many issues with its cloud services, which run on AWS. Do you really want to put important data in the cloud just to use cloud computing? What if data leaks by accident? Hackers are a great example. That's the reason the Mac Pro 2019 has an SSD encrypted by the T2 chip. This is also why you don't put important data in the cloud.
 
  • Like
Reactions: singhs.apps

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
The cloud itself is not safe.
The whole Internet, not only the cloud, is unsafe. What makes you believe that on-premises servers are more secure than the cloud? What is the weakness of cloud security?

That's the reason the Mac Pro 2019 has an SSD encrypted by the T2 chip.
You can get that security in the cloud too. You can encrypt data at rest and in transit with any cloud service provider.
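
For instance, here's a minimal sketch of what that looks like with S3 and boto3: the upload goes over HTTPS (encryption in transit) and the object is stored with server-side encryption (encryption at rest). The bucket and key names are made up for illustration.

Code:
import boto3

# boto3 talks to S3 over HTTPS by default, so the data is encrypted in transit.
s3 = boto3.client("s3")

# Ask S3 to encrypt the object at rest (AES-256 with S3-managed keys here;
# SSE-KMS with your own key is the other common option).
with open("scene42.exr", "rb") as f:
    s3.put_object(
        Bucket="my-example-bucket",   # hypothetical bucket name
        Key="renders/scene42.exr",    # hypothetical object key
        Body=f,
        ServerSideEncryption="AES256",
    )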
 
  • Haha
Reactions: sunny5

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
What this whole argument misses is that, just because something can be done in the cloud doesn't mean people actually want to do it there. If you really wanted to do stuff in the cloud rather than locally, all you'd need would be the most minimal of laptops, so long as it had sufficiently fast Ethernet and the ability to drive as many monitors as you need. And there'd be no point in getting MacOS, since you're working in the cloud, so you don't see your OS, you see the cloud's OS.

But that's not how things work. People want, and pay for, laptops, desktops, and workstations sufficiently powerful to enable them to complete their tasks locally.

I'm not a video pro, but I do some of my work in the cloud, and it's always less nice than doing it locally (e.g., compare using the web-based version of Outlook to the locally-installed client; the difference is night-and-day). Plus whenever I need to use a cloud-based app I have to login, typically with two-factor authentication. So if I need to use it five times each day, that's five logins with authentication. With the cloud there's more delay, more friction, and I don't get to work within my customized MacOS environment. In practice, I do my development work locally, and then send large jobs to a server for processing. That doesn't mean I don't want and need a fast machine for my local work.

Thus, for many of us, the common-sense setup is to have a fast machine for local work, combined with the cloud for large jobs that require more processing power than can reasonably be obtained from a single laptop/desktop/workstation.
 
Last edited:

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,837
1,706
The whole Internet, not only the cloud, is unsafe. What makes you believe that on-premises servers are more secure than the cloud? What is the weakness of cloud security?


You can get that security in the cloud too. You can encrypt data at rest and in transit with any cloud service provider.
No, you can't. Even Amazon has had several security incidents. It's common sense that storing important data in the cloud is a huge mistake and a problem. Local storage is far safer and faster. Like I said, Internet speed is a huge bottleneck when using cloud compute on masses of files, which already contradicts the purpose of the cloud. So how are you going to deal with tons of files over the Internet, which is far slower than an internal SSD?
 

vinegarshots

macrumors 6502a
Sep 24, 2018
982
1,349
Agreed, but part of this, as your post suggests, is a problem with Redshift not yet being fully optimized for AS ( https://www.reddit.com/r/RedshiftRenderer/comments/tnpjqy ). If it were, the M2 Extreme still wouldn't be competitive, but it would at least far exceed a 3090 (should be closer to a single 4080 Ti which, yes, is still pretty weak for a machine in the Mac Pro's likely price range, but it's at least better than a 3090).

I really doubt that "full optimization" for AS would be enough to even get it close to a 3090, let alone a 4080 TI. Nothing that I've seen since AS launched has given me any confidence of its utility for achieving competitive performance with anything Nvidia will have available in the foreseeable future.
 
  • Like
Reactions: AAPLGeek

Kimmo

macrumors 6502
Jul 30, 2011
266
318
Pure curiosity before Mac Pro shows up later this year. Let's say a new chip's name for Mac Pro is M2 Extreme.

1. The M2 Extreme will be a powerful chip, but not on the GPU side, because many workstations can already use more than one graphics card, some up to four or more. The Mac Pro 2019 supports up to four graphics cards, which means a new Mac Pro would need up to 4x M2 Extreme chips. If you are fine with just one M2 Extreme, then that's fine, but what about people who need more than one graphics card? Are you going to say you don't need more than one graphics card, lol? There are reasons why many workstations support multiple GPUs, and you never know if the Mac Pro becomes a beast in the 3D market.
Adding an external GPU might be an option, but Apple Silicon takes advantage of unified memory and the SoC design, so I doubt it.

2. The maximum RAM size is in question. With only one M2 Extreme, it might support up to 256 GB, or maybe up to 384 GB based on the M2's memory. Well, that's far below the current Mac Pro's maximum of 1.5 TB, and other workstations can support up to 4 TB. Even with 4x M2 Extreme it's barely 1.5 TB, and yes, that's only IF they use four chips. And yes, they might be able to add a lot of memory chips alongside one M2 Extreme, but can they really fit that much memory around the chip?

3. Will the Mac Pro support PCIe slots? What about upgradability and expandability? Currently, Apple is very hostile toward both areas, even for Mac desktops. If Apple wants the Mac Pro to be another Mac Studio, what's the point? So far, I'm quite skeptical that Apple is willing to support PCIe slots and other upgradable parts.

I have no idea what the AS Mac Pro will look like, but it shouldn't be another Mac Studio.
I'm getting the sense that you might not be the only skeptic.

Personally, I'm hopeful, but we'll see. :)

What this whole argument misses is that, just because something can be done in the cloud doesn't mean people actually want to do it there. If you really wanted to do stuff in the cloud rather than locally, all you'd need would be the most minimal of laptops, so long as it had sufficiently fast Ethernet and the ability to drive as many monitors as you need. And there'd be no point in getting MacOS, since you're working in the cloud, so you don't see your OS, you see the cloud's OS.

But that's not how things work. People want, and pay for, laptops, desktops, and workstations sufficiently powerful to enable them to complete their tasks locally.

I'm not a video pro, but I do some of my work in the cloud, and it's always less nice than doing it locally (e.g., compare using the web-based version of Outlook to the locally-installed client; the difference is night-and-day). With the cloud there's more delay, more friction, and I don't get to work within my customized MacOS environment. In practice, I do my development work locally, and then send large jobs to a server for processing. That doesn't mean I don't want and need a fast machine for my local work.

Thus, for many of us, the common-sense setup is to have a fast machine for local work, combined with the cloud for large jobs that require more processing power than can reasonably be obtained from a single workstation.
Absolutely right.

Adobe keeps trying to push me into the cloud for critical work, but that's not where I want to do it.

When I finish a job locally, and am completely satisfied with the quality, that's the time to move the project to outside servers.
 

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
I really doubt that "full optimization" for AS would be enough to even get it close to a 3090, let alone a 4080 TI. Nothing that I've seen since AS launched has given me any confidence of its utility for achieving competitive performance with anything Nvidia will have available in the foreseeable future.
I'm basing what I wrote on this, where I'm using single-precision (FP 32) performance as a general measure of GPU compute, since it eliminates optimization as a factor. Where specifically do you disagree with my numbers?

If anything, using this metric to assess M-series performance relative to NVIDIA seems fairly conservative, since it puts the M1 Ultra at the same level as the 3070 Ti, and I don't know of anyone who argues it's slower than that.

Specifically:
If:
a) The ratio of M2 Ultra: M1 Ultra GPU performance is the same as that of M2:M1
b) The M2 Extreme is 2 x M2 Ultra
c) These projections for the NVIDIA 4000 series are correct
Then:
The M2 Extreme's general GPU compute performance, as measured by FP 32 TFLOPS, should be about on the level of the 4080 Ti.

TFLOPS, SINGLE-PRECISION (FP 32)
M1: 2.6
M2: 3.6
M1 MAX: 10.4
M2 MAX: 14 (?) (EXTRAPOLATING FROM M2/M1 x M1 MAX)
4050: 14 (?) (entry-level, ~$250?)
M1 ULTRA: 21
3070 TI: 22
M2 ULTRA: 29 (?) (EXTRAPOLATING FROM M2/M1 x M1 ULTRA)
3080: 30
4060: 31 (?) (entry-level, ~$330?)
3080 TI: 34
3090: 36
3090 TI: 40
4070: 43 (?) (mid-level, ~$550?)
4080: 49 (?)
4080 TI: 56 (??) (EXTRAPOLATING FROM 4080 x 3080 TI/3080)
M2 EXTREME: 58 (?) (EXTRAPOLATING FROM 2 x M2 ULTRA)
4090: 83 (?)
4090 TI: 92 (??) (EXTRAPOLATING FROM 4090 x 3090 TI/3090)
M2 2X EXTREME: 116 (?)

Projections for the 4000-series are based on online articles like this one:

[attached image: RTX 40-series projections]
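
Here's the arithmetic behind the extrapolated rows, as a small Python sketch (the known FP 32 numbers are the ones listed above; everything computed here follows assumptions a-c):

Code:
# Known FP32 TFLOPS from the table above
m1, m2 = 2.6, 3.6
m1_max, m1_ultra = 10.4, 21.0
gen_scale = m2 / m1                    # assumption (a): M2 generation scales like M2/M1

m2_max     = gen_scale * m1_max        # ~14
m2_ultra   = gen_scale * m1_ultra      # ~29
m2_extreme = 2 * m2_ultra              # assumption (b): Extreme = 2x Ultra, ~58

rtx_4080, rtx_3080, rtx_3080_ti = 49.0, 30.0, 34.0   # assumption (c): projected/known values
rtx_4080_ti = rtx_4080 * rtx_3080_ti / rtx_3080      # ~56, reusing the 3000-series Ti scaling

print(f"M2 Max ~{m2_max:.0f}, M2 Ultra ~{m2_ultra:.0f}, "
      f"M2 Extreme ~{m2_extreme:.0f}, 4080 Ti ~{rtx_4080_ti:.0f} TFLOPS")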
 
Last edited:
  • Like
Reactions: Xiao_Xi

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
No, you can't.
What can't you do - encrypt your data at rest and in transit? Show me an AWS service that doesn't allow you to encrypt your data at rest and in transit.

What this whole argument misses is that,
I thought the discussion was whether an Apple Silicon based Mac Pro is technically feasible and whether it makes financial sense to do so. If companies move to the cloud, workstations make less sense. In fact, PC workstations are slowly dying. For instance, Intel doesn't sell workstation CPUs anymore.

4070: 43 (?) (mid-level, ~$550?)
4080: 49 (?)
4080 TI: 56 (??) (EXTRAPOLATING FROM 4080 x 3080 TI/3080)
M2 EXTREME: 58 (?) (EXTRAPOLATING FROM 2 x M2 ULTRA)
4090: 80 (?)
4090 TI: 89 (??) (EXTRAPOLATING FROM 4090 x 3090 TI/3090)
M2 2X EXTREME: 116 (?)
You can update the Nvidia RTX 40 Series data.
 

belvdr

macrumors 603
Aug 15, 2005
5,945
1,372
So how are you going to deal with tons of files over the Internet, which is far slower than an internal SSD?
Depends on a lot of variables, but for many, you don't bring it on-premise. Leave it in the cloud. For others, you download infrequently. Again, lots of variables that are specific to each environment. If you require frequent file movement, then it's likely not a good candidate for you.
 

dawnrazor

macrumors 6502
Jan 16, 2008
424
314
Auckland New Zealand
I think you have to look at the pro user that Apple wants to use the Mac Pro… so the people they are targeting, rather than the entire pro industry. Apple does not care about 3D workstations, which have been the domain of PCs for years. They will target high-end editors, high-end colourists, high-end Flame artists, high-end photographers' studios. High-profile individuals, not render farms - where the MP will take centre stage in a suite and be the hero of the room. Not to say other people won't buy a Mac Pro - they will, of course - but like the Mac Studio, Apple have a clear idea of their end user and will tailor the experience to that niche crowd, and others will either follow or stick with their PCs… Apple didn't pander to the PC crowd with the 2019 MP, they just built a powerful, expandable machine to serve the market at the time… this time around I don't see it being different. It will have to be expandable, otherwise it'll just be a mega Mac Studio.
 

vinegarshots

macrumors 6502a
Sep 24, 2018
982
1,349
I'm basing what I wrote on this, where I'm using single-precision (FP 32) performance as a general measure of GPU compute, since it eliminates optimization as a factor. Where specifically do you disagree with my numbers?

If anything, using this metric to assess M-series performance relative to NVIDIA seems fairly conservative, since it puts the M1 Ultra at the same level as the 3070 Ti, and I don't know of anyone who argues it's slower than that.

Specifically:
If:
a) The ratio of M2 Ultra: M1 Ultra GPU performance is the same as that of M2:M1
b) The M2 Extreme is 2 x M2 Ultra
c) These projections for the NVIDIA 4000 series are correct
Then:
The M2 Extreme's general GPU compute performance, as measured by FP 32 TFLOPS, should be about on the level of the 4080 Ti.

TFLOPS, SINGLE-PRECISION (FP 32)
M1: 2.6
M2: 3.6
M1 MAX: 10.4
M2 MAX: 14 (?) (EXTRAPOLATING FROM M2/M1 x M1 MAX)
4050: 14 (?) (entry-level, ~$250?)
M1 ULTRA: 21
3070 TI: 22
M2 ULTRA: 29 (?) (EXTRAPOLATING FROM M2/M1 x M1 ULTRA)
3080: 30
4060: 31 (?) (entry-level, ~$330?)
3080 TI: 34
3090: 36
3090 TI: 40
4070: 43 (?) (mid-level, ~$550?)
4080: 49 (?)
4080 TI: 56 (??) (EXTRAPOLATING FROM 4080 x 3080 TI/3080)
M2 EXTREME: 58 (?) (EXTRAPOLATING FROM 2 x M2 ULTRA)
4090: 83 (?)
4090 TI: 92 (??) (EXTRAPOLATING FROM 4090 x 3090 TI/3090)
M2 2X EXTREME: 116 (?)

Projections for the 4000-series are based on online articles like this one:

[attached image: RTX 40-series projections]

TFLOPS don't tell the whole story. For example, Nvidia hardware has its ray tracing cores, OptiX, etc. That stuff creates huge performance advantages that Apple doesn't offer a solution for...
 
  • Like
Reactions: Xiao_Xi

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
I thought the discussion was whether an Apple Silicon based Mac Pro is technically feasible and whether it makes financial sense to do so.
If you reread the original post, you'll see the OP was asking if the AS Mac Pro will be competitive with other workstations. Others are trying to invalidate that question, and thus derail the entire discussion, by saying either everyone is moving into the cloud for heavy-duty workstation tasks (a false claim), or should do so (a questionable opinion), and thus it's irrelevant how powerful the AS Mac Pro is. My statement was in response to that.

Indeed, it seems you're attempting to invalidate this discussion yourself:
In fact, PC workstations are slowly dying. For instance, Intel doesn't sell workstation CPUs anymore.
I'm not saying this isn't a legitimate issue to raise, but if you want to do so, you and others who want to question whether workstations are valid at all should start your own thread to discuss that topic; don't try to throw a wrench into this one.

[And you'll see from the reaction below that OP agrees with me.]

You can update the Nvidia RTX 40 Series data.
Already did so a few minutes prior to your post.
 
Last edited:

theorist9

macrumors 68040
May 28, 2015
3,880
3,060
TFLOPS don't tell the whole story. For example, Nvidia hardware has its ray tracing cores, OptiX, etc. That stuff creates huge performance advantages that Apple doesn't offer a solution for...
That's certainly true, but that cuts both ways. You can always find system-specific tasks that one or another system excels at, like ML and RT* for NVIDIA, or being able to process multiple 8k video streams for AS (the Ultra can do 18 simultaneously at 30 FPS).

And if, in your claim that an M2 Extreme (= 2 X M2 Ultra) AS Mac Pro wouldn't even match the 3090, you had said you meant for RT or for ML, then I wouldn't have had an issue. But that's not the statement you made:
I really doubt that "full optimization" for AS would be enough to even get it close to a 3090, let alone a 4080 TI. Nothing that I've seen since AS launched has given me any confidence of its utility for achieving competitive performance with anything Nvidia will have available in the foreseeable future.
Instead, you claimed that an M2 Extreme Mac Pro wouldn't get close to a 3090 broadly and without qualification, which simply doesn't seem likely to be true.


[*There's informed speculation that Apple will introduce hardware RT with the 3 nm M3, since RT takes up a lot of die space, and the process shrink will allow the room for it.]
 
Last edited:
  • Like
Reactions: Xiao_Xi and sunny5