
jujoje

macrumors regular
May 17, 2009
247
288
You need to join the Maxon forum to see it. It's free. The OP on Ars Technica mentioned a Facebook group; I don't use it, so I can't point you to it.

https://redshift.maxon.net/topic/31258/moana-island-scene/27

One thing I was wondering about: Redshift on M1 makes significant gains by increasing the bucket size. I'm wondering whether the Blender benchmarks are hampered by being tuned to more traditional GPUs in terms of memory cache / bucket size. Pretty much all the people testing it just load the scene and press go...
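If anyone wants to actually test that rather than just pressing go, here's a rough sketch of the kind of thing I mean. It assumes Blender 3.x's bundled Python, Cycles, and a .blend that's already configured for GPU rendering; the script and scene names are just placeholders, and Redshift's "bucket size" isn't exactly the same knob as Cycles' tile size, so treat this as an experiment, not a recipe:

```python
# Rough sketch: time a Cycles render at a few tile sizes (Blender 3.x assumed,
# where scene.cycles.tile_size is the relevant setting; 2048 is the default).
# Run headless against a scene already set up for GPU rendering, e.g.:
#   blender -b your_scene.blend -P tile_size_test.py
import time
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'

for tile in (512, 1024, 2048, 4096):
    scene.cycles.tile_size = tile
    start = time.time()
    bpy.ops.render.render(write_still=False)  # render without saving the image
    print(f"tile_size={tile}: {time.time() - start:.1f}s")
```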


Well, if you do 3D work, then they're definitely going to use multiple GPUs, so in reality Apple Silicon won't be able to compete with them. A PC can have 4x RTX 3090s while the Mac Studio has only one GPU. I hope the Mac Pro allows multiple graphics cards with 128 GPU cores, though.

4x 3090 GPUs would be terrible value for money and pretty inefficient, as the scaling isn't linear; you'd probably be better off buying CPUs. It would also require a stupid amount of power to run.

Out of curiosity, does anyone know how long it takes for that GPU to do the same thing in Windows?

I think there's a thread in the Redshift forums on this if you're interested.
 

sunny5

macrumors 68000
Jun 11, 2021
1,838
1,706
4x 3090 GPUs would be terrible value for money and pretty inefficient, as the scaling isn't linear; you'd probably be better off buying CPUs. It would also require a stupid amount of power to run.
It's not replaceable. Or maybe just use a rendering farm with tons of GPUs.

Besides, if you are serious about 3D, then it doesn't really matter. They want results, not power consumption.
 
Last edited:

Analog Kid

macrumors G3
Mar 4, 2003
9,360
12,603
This was weird -- I opened this thread and there was no first post... I figured out why, clicked the link to see all the posts, and then remembered why I didn't see it in the first place. Statements like this:
using Apple's funded and co-developed multithreaded Blender

I'm guessing this was a really hard spin on this:

5 months ago, Apple announced they're donating to a development fund and providing expertise, and Blender announced that Mac is "a supported platform again". There's no mention of "co-development"; they're making their engineers available. It's not Apple-funded; they're putting money into a pool. 5 months ago Mac wasn't even considered a supported platform on this non-commercial project, and now it's being discussed as a reference benchmark?

Not worth the energy anyone is putting into discussing it.
 

jujoje

macrumors regular
May 17, 2009
247
288
It's not replaceable. Or maybe just use a rendering farm with tons of GPUs.

To be honest, that's kind of my feeling about rendering at that kind of scale; you want a machine that fits everything into memory and is fast to first pixel (which is where the Mac Studio fits in), then for final frames you send to a farm, either in the cloud or locally. At the end of the day artist time is more expensive than farm time, so you don't really want final frames on workstations (although for lighter scenes that would work for overnight renders).
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
5 months ago, Apple announced they're donating to a development fund and providing expertise, and Blender announced that Mac is "a supported platform again".
Blender didn't have the knowledge or the money to develop a Metal backend for its rendering engine.

There's no mention of "co-development"; they're making their engineers available. It's not Apple-funded; they're putting money into a pool.
Apple financially supports Blender by donating the man-hours of at least two of its engineers.

it's being discussed as a reference benchmark
Maybe not now, but it will be a good benchmark in the future because all GPU manufacturers are improving Blender performance on their hardware.
 

mi7chy

macrumors G4
Original poster
Oct 24, 2014
10,623
11,295
4x 3090 GPUs would be terrible value for money and pretty inefficient, as the scaling isn't linear; you'd probably be better off buying CPUs. It would also require a stupid amount of power to run.

CPUs are not only much slower but also don't scale well due to limited socket counts (at most four sockets). One 3090 is about double the performance of 2x Xeon 8260 CPUs, which cost about 4x or more. Also, 3090 GPUs are cheap versus paying a whole production team to wait around.

Blender 3090 GPU scaling (each factor is the speedup over half as many GPUs):
8x GPU: 13s (1.62x)
4x GPU: 21s (1.86x)
2x GPU: 39s (1.95x)
1x GPU: 1m16s
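As a quick back-of-the-envelope check of those factors and the overall scaling efficiency, using only the render times listed above (the efficiency framing is my own, not from the video):

```python
# Per-doubling speedup and overall parallel efficiency from the quoted times.
times = {1: 76, 2: 39, 4: 21, 8: 13}  # GPUs -> render seconds (1m16s = 76s)

for gpus in (2, 4, 8):
    print(f"{gpus}x: {times[gpus // 2] / times[gpus]:.2f}x over {gpus // 2}x")

speedup = times[1] / times[8]
print(f"8x vs 1x: {speedup:.1f}x total, ~{speedup / 8:.0%} scaling efficiency")
```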

 

Analog Kid

macrumors G3
Mar 4, 2003
9,360
12,603
Blender didn't have the knowledge or the money to develop a Metal backend for its rendering engine.


Apple financially supports Blender by donating the man-hours of at least two of its engineers.


Maybe not now, but it will be a good benchmark in the future because all GPU manufacturers are improving Blender performance on their hardware.

Right. All of this just points out why this thread is nothing more than a distraction. There are plenty of head-to-head comparisons that are useful now, not in the future. There's no reason to "extrapolate" ahead two days based on a bad reference benchmark. Why go down the rabbit hole of an application that doesn't have the knowledge or money to build an actual Mac application, unless the goal is to distract from actual representative benchmarks?
 

JimmyjamesEU

Suspended
Jun 28, 2018
397
426
Right. All of this just points out why this thread is nothing more than a distraction. There are plenty of head-to-head comparisons that are useful now, not in the future. There's no reason to "extrapolate" ahead two days based on a bad reference benchmark. Why go down the rabbit hole of an application that doesn't have the knowledge or money to build an actual Mac application, unless the goal is to distract from actual representative benchmarks?
Indeed.
 
  • Like
Reactions: Analog Kid

Homy

macrumors 68030
Jan 14, 2006
2,507
2,459
Sweden
So now we're in a situation where it's no longer good enough to beat the best GPU, it has to beat an infinite number of them. Seems reasonable.

Exactly this! No matter how well Apple Silicon performs and how power efficient it is, there will always be a bigger fish to compare it to just to prove their point. Soon they will be comparing a Mac Pro with 2-4 M1 Ultras to GPU farms, Fujitsu's Fugaku with 7,630,848 cores, or a fusion reactor.

Funny that the people who complain about how expensive the Mac Studio is are the same people who suddenly seem to have an unlimited amount of money/electricity to spend on 8 RTX 3090 cards and Xeon 8260s, each for about $2,250.

Blender 3090 cost scaling, GPU/CPU cost only (time saved vs. 1x GPU):

8x GPU: 63s faster, $18,000 ($22,500 with dual Xeon 8260 in the video)
4x GPU: 55s faster, $9,000 ($13,500 with dual Xeon 8260 in the video)
2x GPU: 37s faster, $4,500 ($9,000 with dual Xeon 8260 in the video)
1x GPU: 1m16s, $2,250 ($6,750 with dual Xeon 8260 in the video)

A dual Xeon 8260 setup (48 cores / 96 threads) has a multi-core score of 32,734 in Geekbench 5; the 20-core M1 Ultra scores 24,315.
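Or, putting the same numbers the other way around, here's a rough cost-per-improvement sketch using only the GPU prices and render times quoted above (nothing verified on my end; no chassis, CPUs or power included, purely illustrative):

```python
# Cost per second saved on this one scene, from the quoted GPU prices and times.
configs = {1: (76, 2250), 2: (39, 4500), 4: (21, 9000), 8: (13, 18000)}  # GPUs -> (s, USD)

base_t, base_cost = configs[1]
for gpus in (2, 4, 8):
    t, cost = configs[gpus]
    saved = base_t - t
    print(f"{gpus}x 3090: {saved}s faster than 1x for ${cost - base_cost} more "
          f"(~${(cost - base_cost) / saved:.0f} per second saved)")
```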
 

theotherphil

macrumors 6502a
Sep 21, 2012
899
1,234
Exactly this! No matter how well Apple Silicon performs and how power efficient it is, there will always be a bigger fish to compare it to just to prove their point. Soon they will be comparing a Mac Pro with 2-4 M1 Ultras to GPU farms, Fujitsu's Fugaku with 7,630,848 cores, or a fusion reactor.

Funny that the people who complain about how expensive the Mac Studio is are the same people who suddenly seem to have an unlimited amount of money/electricity to spend on 8 RTX 3090 cards and Xeon 8260s, each for about $2,250.

Blender 3090 cost scaling, GPU/CPU cost only (time saved vs. 1x GPU):

8x GPU: 63s faster, $18,000 ($22,500 with dual Xeon 8260 in the video)
4x GPU: 55s faster, $9,000 ($13,500 with dual Xeon 8260 in the video)
2x GPU: 37s faster, $4,500 ($9,000 with dual Xeon 8260 in the video)
1x GPU: 1m16s, $2,250 ($6,750 with dual Xeon 8260 in the video)

A dual Xeon 8260 setup (48 cores / 96 threads) has a multi-core score of 32,734 in Geekbench 5; the 20-core M1 Ultra scores 24,315.

Yeah, just two years ago the same people were saying a phone processor couldn't handle the workload of desktop applications. And here we are, having to compare a phone processor against eight RTX 3090s or top-end, server-grade CPUs... hilarious.
 

jujoje

macrumors regular
May 17, 2009
247
288
CPUs are not only much slower but also don't scale well due to limited socket counts (at most four sockets). One 3090 is about double the performance of 2x Xeon 8260 CPUs, which cost about 4x or more. Also, 3090 GPUs are cheap versus paying a whole production team to wait around.

Blender 3090 GPU scaling (each factor is the speedup over half as many GPUs):
8x GPU: 13s (1.62x)
4x GPU: 21s (1.86x)
2x GPU: 39s (1.95x)
1x GPU: 1m16s


Honestly, never trust an Nvidia presentation; you can call Apple's performance benchmarks cherry-picked, but Nvidia's would make Pravda proud.

Still, going by the bit of the presentation you linked to, he is rendering the classroom test scene, which these days is pretty unrepresentative of the kind of scene you'd want to render with that level of GPU power (pretty basic geometry, shaders and lights; it would probably fit in the memory of a graphing calculator). I mean, it's kinda cool you can render it that fast if you spend a lot more money, I guess? Feels a bit like an artificial use case to win at numbers more than anything else.
 

mi7chy

macrumors G4
Original poster
Oct 24, 2014
10,623
11,295
Honestly, never trust an Nvidia presentation; you can call Apple's performance benchmarks cherry-picked, but Nvidia's would make Pravda proud.

Still, going by the bit of the presentation you linked to, he is rendering the classroom test scene, which these days is pretty unrepresentative of the kind of scene you'd want to render with that level of GPU power (pretty basic geometry, shaders and lights; it would probably fit in the memory of a graphing calculator). I mean, it's kinda cool you can render it that fast if you spend a lot more money, I guess? Feels a bit like an artificial use case to win at numbers more than anything else.

It's a system integrator demo, not from Nvidia. It's meant to be short; otherwise no YouTube viewer is going to sit through a 15-minute render. The audience it's marketed to already knows what one GPU is capable of, so they just want to see it scale up to 8x. It's not for first-time users.
 
  • Haha
Reactions: MayaUser

jujoje

macrumors regular
May 17, 2009
247
288
It's a system integrator demo, not from Nvidia. It's meant to be short; otherwise no YouTube viewer is going to sit through a 15-minute render. The audience it's marketed to already knows what one GPU is capable of, so they just want to see it scale up to 8x. It's not for first-time users.

I just saw the green, looked at the benchmarks from where you bookmarked it, saw iffy-looking numbers and assumed it was an Nvidia demo :p

Doesn't really change the point that this test is pretty much worthless for stressing those GPUs in any way. Perhaps the audience it is marketed to is people who like benchmarking lightweight scenes with basic GI on their $18,000 machines?

Out of idle curiosity, the machine that Bizon offers with a 32-core Threadripper, a 3090 24GB, 128GB RAM and a 1TB SSD comes to $9,636. That $5,799 Mac Studio is looking like pretty good value now.

Perhaps we just have very different ideas of what represents a professional workload here; for me that would be something with heavy volumes, millions of polygons and complex shaders. For you, an empty classroom.

But less facetiously, the standard Blender benchmarks aren't particularly representative. You're free to argue the toss, but something like ALab, Moana or the Disney VDB dataset would be far more indicative of production assets and potentially give far more useful results.
 

sirio76

macrumors 6502a
Mar 28, 2013
578
416
Don't waste your time on this topic... leave this user alone; eventually, like any other T..., it will get bored and stop posting nonsense.
 

iBug2

macrumors 601
Jun 12, 2005
4,540
863
I'm still waiting for comparisons between Mac Studio and this
[image attachment]
 

Sopel

macrumors member
Nov 30, 2021
41
85
Out of idle curiosity, the machine that Bizon offers with a 32-core Threadripper, a 3090 24GB, 128GB RAM and a 1TB SSD comes to $9,636. That $5,799 Mac Studio is looking like pretty good value now.

I offer one with the same spec for $19,335. In light of that, the Mac Studio is insane!!!
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
The fun thing about the Redshift benchmark is that the only reason the 3090 is much, much faster than the 3080 is that the scene is larger than the 3080's 10GB of VRAM, which causes a lot of swapping, even with multiple GPUs. And since NVLink has very low bandwidth (especially compared to UltraFusion), you can't just split the scene across the GPUs (but you could with UltraFusion, if that were necessary, which it isn't).

Plus, the scene is also too large for either Nvidia card to use OptiX / RT cores.

This means that if you wanted to reaaaaaally push the envelope and make a scene with, say, ~100GB of assets, a single M1 Ultra would probably beat 8x 3090s.
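To put rough numbers on the bandwidth point (these are approximate public peak-spec figures and my own framing; real renderers stream data rather than copying everything once, so this is only indicative):

```python
# How long a single pass over ~100GB of scene data would take at each link's
# roughly quoted peak bandwidth; real-world throughput would be lower.
ASSET_GB = 100
links_gb_per_s = {
    "PCIe 4.0 x16 (host to one 3090)": 32,
    "NVLink bridge between two 3090s": 112,
    "M1 Ultra unified memory": 800,
    "RTX 3090 on-board GDDR6X": 936,
}

for name, bw in links_gb_per_s.items():
    print(f"{name:34s} ~{ASSET_GB / bw:5.2f}s per full pass")
```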

The Mac Pro is probably going to be an absolutely incredible machine at the high end, for Pixar/Disney-quality work. Nvidia cards are still better for hobbyists, though.
 

MauiPa

macrumors 68040
Apr 18, 2018
3,438
5,084
Blender also runs on Linux and is said to be even faster, so what does it have to do with Windows? And multi-GPU support is built into Blender.
I don't use Blender, but I did read that they just finally came out with a properly Apple Silicon-optimized version using Metal. Is that the version you are using? The other versions may run, but can't be said to be really optimized. A bit like checking how OpenCL (deprecated for years) runs. Who cares?
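If anyone wants to double-check which backend they're actually benchmarking, something like this should work in Blender 3.1 or later (where the Cycles Metal backend shipped); `check_metal.py` is just a placeholder name and this is a sketch, not gospel:

```python
# Sketch: enable the Metal compute device for Cycles and render on the GPU.
# Assumes Blender 3.1+ on Apple Silicon; run with: blender -b -P check_metal.py
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = 'METAL'  # errors out if this build has no Metal backend
prefs.get_devices()                  # refresh the device list

for dev in prefs.devices:
    dev.use = True
    print(f"{dev.name} ({dev.type}) enabled")

bpy.context.scene.cycles.device = 'GPU'
```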
 

mi7chy

macrumors G4
Original poster
Oct 24, 2014
10,623
11,295
M1 Ultra (20-core CPU, 64-core GPU, 128GB):

CPU on par with a 12600K
GPU half of a 70W laptop 3060

(screenshot from The Tech Chap)
 

AmazingTechGeek

macrumors 6502a
Mar 6, 2015
685
304
Los Angeles
Yes, sadly true; not many 3D apps are well optimized, and games are kinda nonexistent, with most not optimized for M1 either.
The M1, PS5, and Xbox Series all have architectures that represent a paradigm shift in software design. The M1 has big potential to cater to PC gaming, but the APIs and current engines have yet to be optimized for the chip series, unlike the PS5/Xbox, where those tools are derived from the PS4/Xbox One SDKs/GDKs.

It will take time, but it is viable to bring a significant portion of PC users over and give macOS a new market in AAA-quality gaming.

I work in the gaming industry, and this is a question that is starting to be asked in my studio. It's not something we will be exploring, but we understand the potential of the M1 Max/Ultra.
 