
leman

macrumors Core
Oct 14, 2008
19,302
19,284
Not exactly the performance bump I was hoping for with the M3 Max, but it is faster, and more importantly it is much faster at most of the stuff I actually need to do; I live in After Effects a lot more than any 3D app.

I think this is a sign that the software is not well optimized. The M3 Max has more FP throughput, more cache, and more memory bandwidth. Or maybe the bottleneck is elsewhere.
 

avkills

macrumors 65816
Jun 14, 2002
1,182
985
I think this is a sign that the software is not well optimized. The M3 Max has more FP throughput, more cache, and more memory bandwidth. Or maybe the bottleneck is elsewhere.
Considering that the fellows who are now the developers have had the source code for less than a year, the fact that we have a native Apple Silicon version so soon is fairly remarkable.

I was really close to finding something else, but as I've said before, I just like the way Lightwave works more than most other packages.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,509
945
Apple is scheduled to add MetalFX upscaling and OpenSubdiv support in Metal to Blender this year.
 

sirio76

macrumors 6502a
Mar 28, 2013
571
405
Redshift now supports M3 RT hardware acceleration.
According to the RS benchmark, the top-of-the-line Mac laptop is now about three times slower than a desktop RTX 4090; in theory an M3 Ultra should be close enough to it.
 
  • Like
Reactions: aytan and PaulD-UK

aytan

macrumors regular
Dec 20, 2022
159
109
Redshift now supports M3 RT hardware acceleration.
According to the RS benchmark, the top-of-the-line Mac laptop is now about three times slower than a desktop RTX 4090; in theory an M3 Ultra should be close enough to it.
The M3 Max MBP Redshift demo-scene result is 3:16 right now (between 3070 Ti and 3080 render times); it was around 4:20 last week, before the latest Redshift M3 hardware upgrade (from the Redshift forums).
With perfect scaling, an M3 Ultra could reach the same render time as or better than a 4080 (1:47), or in the worst case better than a 4070 Ti (2:10). I think an M3 Ultra with 128 GB or more unified memory will be critical for comparison with a 4080 or 4090. We will see when it becomes available.
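The perfect-scaling estimate above is easy to sanity-check. A minimal sketch, using only the render times quoted in this post (assuming an M3 Ultra is exactly two M3 Max GPUs with ideal 2x scaling, which real chips rarely achieve):

```python
# Back-of-envelope check of the "perfect scaling" claim,
# using the Redshift demo-scene times quoted above.

def to_seconds(mmss: str) -> int:
    """Convert an 'mm:ss' render time to seconds."""
    m, s = mmss.split(":")
    return int(m) * 60 + int(s)

m3_max = to_seconds("3:16")      # M3 Max MBP result: 196 s
m3_ultra_est = m3_max / 2        # hypothetical ideal 2x scaling
rtx_4080 = to_seconds("1:47")    # 107 s
rtx_4070ti = to_seconds("2:10")  # 130 s

print(m3_ultra_est, rtx_4080, rtx_4070ti)
```

With ideal scaling the estimated M3 Ultra time (98 s) would indeed land just ahead of the 4080, which is exactly the "best case" described above; any scaling loss pushes it back toward the 4070 Ti.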
 
  • Like
Reactions: Lone Deranger

jujoje

macrumors regular
May 17, 2009
234
272
Well, looks like no RenderMan for AS. From the release docs:

  • macOS 10.15 through 13.X. Apple Silicon is only supported with Rosetta 2. Intel version of DCCs required.

Bit disappointing given Pixar's history (and that pretty much every other major renderer supports AS).
 

richinaus

macrumors 68020
Oct 26, 2014
2,384
2,140
The M3 Max MBP Redshift demo-scene result is 3:16 right now (between 3070 Ti and 3080 render times); it was around 4:20 last week, before the latest Redshift M3 hardware upgrade (from the Redshift forums).
With perfect scaling, an M3 Ultra could reach the same render time as or better than a 4080 (1:47), or in the worst case better than a 4070 Ti (2:10). I think an M3 Ultra with 128 GB or more unified memory will be critical for comparison with a 4080 or 4090. We will see when it becomes available.
I think my setup is perfect for rendering: a 14" M1 MacBook Pro for portability [between studio and home] plus a PC render box next to me in the studio, utilising Parsec. This means I keep working on my MacBook Pro while all the heavy lifting happens externally on the 4090, and I can check on it anytime with Parsec [from anywhere].

I don't need to worry about whether Apple is going to release whatever anymore, as I can just upgrade my GPU [future-proofed the PC with a decent power supply, PCIe 5, etc.].

I also have the same setup with Studio Displays at the studio and at home, so it is all plug and play.

This setup works very well, and it was put together when I realised, at the announcement of the M chips, that Apple will always be behind PCs for GPU.
 
  • Like
Reactions: jinnyman and aytan

aytan

macrumors regular
Dec 20, 2022
159
109
Well, looks like no RenderMan for AS. From the release docs:



Bit disappointing given Pixar's history (and that pretty much every other major renderer supports AS).
Yes, disappointing :( I had hopes for this release...
 

bombardier10

macrumors member
Nov 20, 2020
56
41
The Nvidia RTX 4090 is nearly five times faster than the M2 Ultra graphics chip (3D rendering)...
Here is a video with the Cinebench 2024 benchmark. The RTX 4090 reaches 35,000 points, while the M1 Ultra gets only 5,900. Apple's best chip doesn't even come close to the performance of the RTX 4090.
Next generations of Mxxx Ultra chips that increase performance by 20% will not change much here.

 

avkills

macrumors 65816
Jun 14, 2002
1,182
985
The M3 Max is at 12,676 in Cinebench. So about 3x slower, with the entire system drawing less wattage than the 4090 alone would. Not so bad considering it is a laptop.

It would be interesting to see what Apple could achieve if they separated the GPU from the main part of the SoC.

These types of comparisons are getting kind of long in the tooth; even though I would love to run a 4090 in my 2019 Mac Pro under OSX, it just isn't ever going to happen.

My Cinebench image attached for M3 Max 128GB.

Cinebench2024-M3Max.png
 

leman

macrumors Core
Oct 14, 2008
19,302
19,284
The Nvidia RTX 4090 is nearly five times faster than the M2 Ultra graphics chip (3D rendering)...
Here is a video with the Cinebench 2024 benchmark. The RTX 4090 reaches 35,000 points, while the M1 Ultra gets only 5,900. Apple's best chip doesn't even come close to the performance of the RTX 4090.
Next generations of Mxxx Ultra chips that increase performance by 20% will not change much here.

I agree that Apple will have difficulty catching up to Nvidia in these kinds of applications, though I am not sure it is entirely hopeless. Just a few points to consider:

- The M3 Max is already as fast as the M2 Ultra for rendering. That's a 2x jump in a single generation. I'd say Apple is on a good trajectory here.

- The Cinebench GPU renderer does not seem to be particularly well optimized for Apple GPUs; e.g. in Blender 4.1 the difference between the M3 Max and the 4090 is "only" 3x in Nvidia's favor. That is a fairly impressive result for Apple, IMO, since we are comparing a 50-60 watt GPU with a nominal 13 TFLOPs to a 450 W behemoth with 82 TFLOPs.

- Apple has an advantage for more complex scenes that require more memory, since it has a larger "fast" GPU memory pool than most Nvidia GPUs.

Apple could build a faster GPU than a 4090 with the technology they have today; it's a question of cost and utility. There are some low-hanging fruits they can pursue to dramatically improve the performance of their GPUs. The big issue is going to be memory bandwidth.
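The efficiency point in the list above can be made concrete. A rough sketch using only the nominal figures quoted in the post (taking 55 W as the midpoint of the stated 50-60 W range; nominal TFLOPs rarely translate directly into rendering throughput):

```python
# Rough TFLOPs-per-watt comparison from the figures quoted above.

m3_max_tflops, m3_max_watts = 13.0, 55.0    # M3 Max GPU, midpoint of 50-60 W
rtx4090_tflops, rtx4090_watts = 82.0, 450.0 # RTX 4090 nominal figures

m3_eff = m3_max_tflops / m3_max_watts        # TFLOPs per watt
rtx_eff = rtx4090_tflops / rtx4090_watts

print(round(m3_eff, 3), round(rtx_eff, 3))
```

On these nominal numbers the M3 Max comes out ahead per watt (~0.24 vs ~0.18 TFLOPs/W), which is why a ~3x raw-performance deficit at ~8x lower power still reads as a strong showing.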
 

bombardier10

macrumors member
Nov 20, 2020
56
41
Apple could build a faster GPU than a 4090 with the technology they have today; it's a question of cost and utility.
I very much doubt it. Even absurd costs have never been a problem for Apple. Technology is a team game: if you can't be the best at something, you use solutions that are already ready. And Apple Silicon technology is neither cheap nor efficient for desktop computers. The integration of CPU and GPU in a single chip must have failed: Mac Pro or Studio owners must buy a whole new unit to increase graphics performance. Soon we will be replacing computers like smartphones, every two years :(
 

Regulus67

macrumors 6502
Aug 9, 2023
380
369
Värmland, Sweden
And Apple silicon technology is neither cheap nor efficient for desktop computers.
Apple doesn't have many desktop computers anymore, and they are not updated as often as the laptops.

The Mac mini hasn't got the M3 yet, even though the M1 launched with the first Apple Silicon machines.
There is only one iMac left, the 24", and it skipped a generation.
The Mac Studio hasn't got the M3 either, thus far.

As far as I can tell, there is no pattern in the releases for the desktops.
 

leman

macrumors Core
Oct 14, 2008
19,302
19,284
I very much doubt it. Even absurd costs have never been a problem for Apple. Technology is a team game: if you can't be the best at something, you use solutions that are already ready. And Apple Silicon technology is neither cheap nor efficient for desktop computers. The integration of CPU and GPU in a single chip must have failed: Mac Pro or Studio owners must buy a whole new unit to increase graphics performance. Soon we will be replacing computers like smartphones, every two years :(

I certainly would agree that the GPU performance in the current high-end Macs is not convincing. This might change in the future products though. You are right that there are scalability issues with the current approach, which is why Apple is actively exploring alternatives. For example, this is from a recent patent of theirs:

1712864371111-png.28988
 

ader42

macrumors 6502
Jun 30, 2012
426
378
If only Apple would release a chip for the Mac Pro that does take 450 watts while still being as efficient as their current offerings, we might get quiet from the poor boring Windows lovers for a while.

In the meantime, ZBrush runs fantastically on regular Apple Silicon, but then I don't merely render porn garbage with other people's models in/from Daz Studio like so many PC "3d artists".

Personally I spend less than 1% of my time rendering, so rendering speed doesn't matter much to me.
 
  • Haha
  • Like
Reactions: avkills and MRMSFC

sirio76

macrumors 6502a
Mar 28, 2013
571
405
Personally I spend less than 1% of my time rendering so rendering speed matters not much to me.
It's not just you; 90% of artists have a mixed workflow where rendering takes only a small part, but there will always be people (quite often people who don't even work in 3D, or are just hobbyists or gamers) who feel superior because meaningless benchmarks tell them they can render faster.

The reality is that today, for most 3D artists, any modern/decent machine is more than enough to produce great content in a reasonable time and generate revenue; the thing that matters most is the artist's skill, and you cannot buy that.

The only real limitation might be if you need software that strictly requires Windows/CUDA.
 
  • Like
Reactions: iPadified

MRMSFC

macrumors 6502
Jul 6, 2023
341
352
but there will always be people (quite often people who don't even work in 3D, or are just hobbyists or gamers) who feel superior because meaningless benchmarks tell them they can render faster.
Reminds me of the joke about people who spend beaucoup bucks building monster PCs and end up playing Stardew Valley.
 

Boil

macrumors 68040
Oct 23, 2018
3,286
2,899
Stargate Command
It's not just you; 90% of artists have a mixed workflow where rendering takes only a small part, but there will always be people (quite often people who don't even work in 3D, or are just hobbyists or gamers) who feel superior because meaningless benchmarks tell them they can render faster.
Reminds me of the joke about people who spend beaucoup bucks building monster PCs and end up playing Stardew Valley.

Or only really taxing their CPU & GPU resources when running benchmarks for epeen bragging rights...
 
  • Like
Reactions: vel0city

richard371

macrumors 68040
Feb 1, 2008
3,634
1,820
The Nvidia RTX 4090 is nearly five times faster than the M2 Ultra graphics chip (3D rendering)...
Here is a video with the Cinebench 2024 benchmark. The RTX 4090 reaches 35,000 points, while the M1 Ultra gets only 5,900. Apple's best chip doesn't even come close to the performance of the RTX 4090.
Next generations of Mxxx Ultra chips that increase performance by 20% will not change much here.

I have the 14" MBP M3 Max 16/40 and it's fast. Picked up a Corsair Vengeance i7500 with a 14900 and an RTX 4090. I bought it for some gaming, especially MSFS 2020. It's way faster than the Mac in 3D tests etc.; OpenCL in Geekbench 6 is 3x faster. Lightroom Denoise AI takes 20 seconds for a 50 MP raw on my M3 Max, which is fast. On the 4090 it takes 5 seconds. Crazy.
 

leman

macrumors Core
Oct 14, 2008
19,302
19,284
I have the 14" MBP M3 Max 16/40 and it's fast. Picked up a Corsair Vengeance i7500 with a 14900 and an RTX 4090. I bought it for some gaming, especially MSFS 2020. It's way faster than the Mac in 3D tests etc.; OpenCL in Geekbench 6 is 3x faster. Lightroom Denoise AI takes 20 seconds for a 50 MP raw on my M3 Max, which is fast. On the 4090 it takes 5 seconds. Crazy.

That is a much smaller difference than expected. The 4090 is nominally over 6x faster in GPU compute and at least an order of magnitude faster in ML.
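The observed-vs-nominal gap is easy to tabulate. A quick sketch using only the numbers quoted in these two posts (the 13 and 82 TFLOPs figures come from an earlier post in this thread; Lightroom denoise involves more than raw compute, so this is only a rough comparison):

```python
# Observed Lightroom Denoise gap vs. the nominal compute gap,
# using the figures quoted in the thread.

m3_denoise_s, rtx_denoise_s = 20, 5  # seconds for a 50 MP raw
observed = m3_denoise_s / rtx_denoise_s       # observed speedup on the 4090
nominal_compute = 82.0 / 13.0                  # nominal TFLOPs ratio

print(observed, round(nominal_compute, 1))
```

The 4x observed speedup falling well short of the ~6.3x nominal ratio is the "much smaller difference than expected" being described.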
 

Homy

macrumors 68020
Jan 14, 2006
2,137
1,994
Sweden
Maxon has a dedicated page for it saying it will come this year.

"ZBrush for iPad is undergoing a complete redesign, meticulously crafted from the ground up to optimize every feature for an unparalleled sculpting experience on the iPad."


 

vinegarshots

macrumors 6502a
Sep 24, 2018
947
1,310
Maxon has a dedicated page for it saying it will come this year.

"ZBrush for iPad is undergoing a complete redesign, meticulously crafted from the ground up to optimize every feature for an unparalleled sculpting experience on the iPad."



The marketing departments for these companies must be on extended vacation or something. What in the hell was that? A 23-second video of nothing, just to see a QR code at the end? 🤣
 

Homy

macrumors 68020
Jan 14, 2006
2,137
1,994
Sweden
The marketing departments for these companies must be on extended vacation or something. What in the hell was that? A 23-second video of nothing, just to see a QR code at the end? 🤣

It's a teaser, I think, since there's no release date, but I agree that it felt more like an ad for the iPad than for ZBrush. :)
 
  • Like
Reactions: Xiao_Xi

ader42

macrumors 6502
Jun 30, 2012
426
378
Yeah, I noticed straight away that the UI was different, and from your screengrab they have obviously put a lot of thought into it.

The big question will be what features will they leave out to start with?

I can see a 13” iPad Pro with M4, 16GB RAM/1TB and Magic Keyboard with Apple Pencil Pro in my future…
 