
bombardier10

macrumors member
Nov 20, 2020
62
45
Given that the 13900k has something like 300W peak power you are getting to the point where a home user is going to want a dedicated electrical circuit just for the PC, which is ridiculous.
You can easily plug in and run a 2000 W lawn mower on a regular outlet. If you have problems, change your electrician :D
 
  • Like
Reactions: aytan

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
Between process improvements, newer core designs, & added hardware ray-tracing, I feel the M3-series of SoCs will yield a much more performant iGPU, especially in the M3 Extreme (or whatever it's called) configuration...

This covers displays/viewport performance, but a boost would still be needed for compute/render performance, enter the ASi GPGPUs...!

Single or Duo cards, each variant exceeding the compute/render performance of the M3 Extreme GPU subsystem, up to two Single or Duo cards in a Mac Pro, with an InfinityFabric-style interconnect between the two cards...

M3 Extreme GPU subsystem equal to latest Nvidia X090 card for compute/render, with up to 1TB ECC LPDDR5X RAM...

Single ASi GPGPU also equal to Nvidia X090, 1TB ECC LPDDR5X RAM...

Duo ASi GPGPU equal to two Nvidia X090, 2TB ECC LPDDR5X RAM total...

Two Duo cards equal to four Nvidia X090 cards, with a total of 4TB ECC LPDDR5X RAM, all running as one giant compute/render engine via the InfinityFabric-style interconnect...

Single ASi GPGPU has 64-core Neural Engine, Duo has 128-core Neural Engine, and two Duos gives 256-core Neural Engine...
 

mi7chy

macrumors G4
Oct 24, 2014
10,622
11,294
The sweet spot for 4090 performance per watt is around a 70% power limit for gaming: about a 9.3% performance loss for a 33.3% reduction in power, from 450W to 300W. The same test needs to be run for rendering, but the guesstimate is that if a 4090 at 100% power takes 7 seconds to render the Classroom scene, then at 300W it takes about 7.7 seconds, still almost 3x faster than the M2 Ultra, which takes 22 seconds at a 295W max power consumption.

[Attached: 4090 power-scaling chart and two benchmark screenshots]
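
A quick back-of-the-envelope check of that estimate in Python (all figures are the rough numbers from this post, not new measurements):

```python
# Back-of-the-envelope check of the estimate above. All inputs are the
# rough figures quoted in this post, not new measurements.

full_power_time = 7.0    # seconds: 4090 at 450 W renders Classroom in ~7 s
perf_loss = 0.093        # ~9.3% performance loss at the 300 W (70%) limit
m2_ultra_time = 22.0     # seconds: M2 Ultra on the same scene

limited_time = full_power_time / (1 - perf_loss)
print(f"4090 @ 300 W: ~{limited_time:.1f} s per frame")             # ~7.7 s
print(f"vs M2 Ultra: ~{m2_ultra_time / limited_time:.1f}x faster")  # ~2.9x
```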
 

aytan

macrumors regular
Dec 20, 2022
161
110
The sweet spot for 4090 performance per watt is around a 70% power limit for gaming: about a 9.3% performance loss for a 33.3% reduction in power, from 450W to 300W. The same test needs to be run for rendering, but the guesstimate is that if a 4090 at 100% power takes 7 seconds to render the Classroom scene, then at 300W it takes about 7.7 seconds, still almost 3x faster than the M2 Ultra, which takes 22 seconds at a 295W max power consumption.
Thanks for the information; looks like if you want to work in 3D, the 4090 has the most value. Not a surprise for most of us.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
The sweet spot for 4090 performance per watt is around a 70% power limit for gaming: about a 9.3% performance loss for a 33.3% reduction in power, from 450W to 300W. The same test needs to be run for rendering, but the guesstimate is that if a 4090 at 100% power takes 7 seconds to render the Classroom scene, then at 300W it takes about 7.7 seconds, still almost 3x faster than the M2 Ultra, which takes 22 seconds at a 295W max power consumption.

You are confusing full system power with GPU power. At its peak, the M2 Ultra GPU will probably consume less than 120 watts. In Blender, I suspect the actual power consumption will be closer to 80 watts. It needs to be measured for a definitive statement. The same goes for Nvidia, btw. I doubt it will draw the full 300W when rendering a Blender scene.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
You are confusing full system power with GPU power. At its peak, the M2 Ultra GPU will probably consume less than 120 watts. In Blender, I suspect the actual power consumption will be closer to 80 watts. It needs to be measured for a definitive statement. The same goes for Nvidia, btw. I doubt it will draw the full 300W when rendering a Blender scene.
I think comparing the max power of the whole Mac Studio system vs just the dGPU part is a great way to compare Apple vs Nvidia products.

It really highlights how inefficiently Nvidia does things to achieve their raw performance.

It just needs to be emphasized that you're comparing a whole computer vs just a component.
 
  • Like
Reactions: sirio76 and aytan

jmho

macrumors 6502a
Jun 11, 2021
502
996
The fun part of that graph is that Apple is doubling their performance per generation for the same class of computer, while nVidia is only getting 30% despite the fact that the 4090 is a much bigger and far more expensive card than the 3090 was.

And Apple Silicon still hasn't played its RT core card yet which is going to provide an even more massive generational boost.

I can't wait to see what happens in the next couple of generations.
 

aytan

macrumors regular
Dec 20, 2022
161
110
Let's say a whole PC with a 4090 consumes 500W and an M2 Ultra 150W. With this equation, what is the output?
Can we say, at least for the sake of power consumption/time consumption, that an M2 Ultra is the same as a 4090 PC, or what? I cannot understand why we push to compare these two different animals.
Just to make it clear: I just want to understand this subject.
Why is it so important for owners who have either of them, or both?
What is the final output? I am really curious about it.
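
For what it's worth, one concrete way to frame the question is energy per render: watts times seconds. A minimal sketch using the rough numbers floated in this thread (the 500W/150W system figures are the hypotheticals from the post above; the render times are the earlier Classroom estimates):

```python
# Energy per render = average power (W) x render time (s), in joules.
# All figures are the rough numbers from this thread, not measurements:
# 500 W / 150 W are the hypothetical system draws from the post above,
# 7.7 s / 22 s are the earlier Classroom render estimates.

pc_power, pc_time = 500.0, 7.7      # 4090 PC
mac_power, mac_time = 150.0, 22.0   # M2 Ultra

print(f"4090 PC:  {pc_power * pc_time:.0f} J per frame")    # ~3850 J
print(f"M2 Ultra: {mac_power * mac_time:.0f} J per frame")  # ~3300 J
# By this rough math the energy per frame is surprisingly close,
# even though the 4090 finishes each frame much sooner.
```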
 

bcortens

macrumors 65816
Aug 16, 2007
1,324
1,796
Canada
You can easily plug in and run a 2000 W lawn mower on a regular outlet. If you have problems, change your electrician :D
Sure, 2000 W is fine for brief periods, but it's not recommended to load a 15A breaker beyond 80% of its rating for persistent loads. It won't trip the breaker, but it's outside the guideline. Newer houses are being built with more 20A wiring, but I don't know how widespread that is yet.
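
For reference, the arithmetic behind that guideline, assuming a 120V North American circuit:

```python
# Household circuit capacity with the 80% continuous-load guideline,
# assuming a 120 V North American circuit.

voltage = 120.0  # volts
for breaker_amps in (15.0, 20.0):
    peak_watts = voltage * breaker_amps   # breaker rating
    continuous_watts = 0.8 * peak_watts   # 80% continuous guideline
    print(f"{breaker_amps:.0f}A circuit: {peak_watts:.0f} W peak, "
          f"{continuous_watts:.0f} W continuous")
# 15A: 1800 W peak, 1440 W continuous -> a 2000 W mower (~16.7 A) is
# tolerable briefly but over the limit for sustained loads.
# 20A: 2400 W peak, 1920 W continuous.
```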
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
Can we say, at least for the sake of power consumption/time consumption, that an M2 Ultra is the same as a 4090 PC, or what? I cannot understand why we push to compare these two different animals.
The camp rooting for 4090 will say power be damned. Only absolute power counts.

The camp rooting for AS will say performance per watt is important.

So I guess we’ll just have to pick our ”poison.”

It’s kind of pointless arguing who has the bigger gun tho. Use whatever works best for you.
 
  • Like
Reactions: sirio76 and aytan

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
You are confusing full system power with GPU power. At its peak, the M2 Ultra GPU will probably consume less than 120 watts. In Blender, I suspect the actual power consumption will be closer to 80 watts. It needs to be measured for a definitive statement.
Tally Ho Tech measured up to 120 W for the GPU and up to 130 W for CPU+GPU on the M1 Ultra.

The fun part of that graph is that Apple is doubling their performance per generation for the same class of computer, while nVidia is only getting 30% despite the fact that the 4090 is a much bigger and far more expensive card than the 3090 was.

And Apple Silicon still hasn't played its RT core card yet which is going to provide an even more massive generational boost.

I can't wait to see what happens in the next couple of generations.
I'm not sure those Blender benchmarks reflect the true capability of the high-end GPUs. More complex scenes show the nVidia 4090 getting over a 30% improvement.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Let's say a whole PC with a 4090 consumes 500W and an M2 Ultra 150W. With this equation, what is the output?
Can we say, at least for the sake of power consumption/time consumption, that an M2 Ultra is the same as a 4090 PC, or what? I cannot understand why we push to compare these two different animals.
Just to make it clear: I just want to understand this subject.
Why is it so important for owners who have either of them, or both?
What is the final output? I am really curious about it.

You are asking the right questions. Unfortunately, you are in the minority ;)
 
  • Like
Reactions: aytan and diamond.g

mi7chy

macrumors G4
Oct 24, 2014
10,622
11,294
Tally Ho Tech measured up to 120 W for the GPU and up to 130 W for CPU+GPU on the M1 Ultra.

The M1 Ultra has scaling issues that prevented it from using full power, so it shouldn't be compared with the M2 Ultra, which seems to have fixed the scaling; at a similar 5nm node it should utilize a higher percentage of max power. Too bad one can't just upgrade the M1 Ultra GPU but has to throw out the whole box, and there's no option to scale to more than one GPU when on a deadline.

 

aytan

macrumors regular
Dec 20, 2022
161
110
As far as I understand, more DCC apps and CPU makers are going to develop toward this kind of CPU+GPU SoC, while at the same time people try to enable web-based rendering solutions. We can observe this behavior with Octane; look what they did over the last 2 years. Even Blender is trying to enable Eevee Next for all kinds of SoCs and GPUs as quickly as possible. Maybe in the future Cycles, Redshift, Arnold, etc. will be outdated, or we could soon reach a point of fast and efficient web-based rendering solutions.
All of this could be related to the upcoming AR/VR transformation: every machine will have to adapt to this new environment, with power consumption as low and sizes as small as possible. I don't know; maybe someone has insights on the subject.
Sure, there will still be a lot of people who have to use high-end GPUs or have enormous compute demands. This will take a decade at least, I guess.
 

aytan

macrumors regular
Dec 20, 2022
161
110
The M1 Ultra has scaling issues that prevented it from using full power, so it shouldn't be compared with the M2 Ultra, which seems to have fixed the scaling; at a similar 5nm node it should utilize a higher percentage of max power. Too bad one can't just upgrade the M1 Ultra GPU but has to throw out the whole box, and there's no option to scale to more than one GPU when on a deadline.

I agree with you, M2 Ultra owners are happy now :).
I'm going to use the M1 Ultra for a while; at some point I will buy an M2/3/4/5 Ultra, and the M1/2/3/4 Ultra will become a great video editing machine at the post production house, serving for long years. It is a cycle that will be shaped by upcoming jobs.
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
I'd really love to see people start daisy-chaining M2 Ultra Studios as a sort of makeshift render-farm.

Running 8x 4090s is a pretty crazy undertaking when thinking about power and heat, but 8x Studios is pretty doable.

You'd also likely get much better scaling with 8x Studios since there would be no bus / memory contention to deal with.

I wonder how many Studios you'd need in order to beat 8x 4090s. 20? 30? :D
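
For a rough answer, using the Classroom render times quoted earlier in the thread (pure ratio math, ignoring any scaling overhead):

```python
# Rough Studio count to match 8x 4090s, using the Classroom times quoted
# earlier in the thread (estimates, not benchmarks), and assuming perfect
# scaling on both sides.

gpu_time, studio_time = 7.7, 22.0          # seconds per frame
speedup_per_4090 = studio_time / gpu_time  # ~2.86x
print(f"~{8 * speedup_per_4090:.0f} Studios to match 8x 4090s")  # ~23
```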
 

aytan

macrumors regular
Dec 20, 2022
161
110
I'd really love to see people start daisy-chaining M2 Ultra Studios as a sort of makeshift render-farm.

Running 8x 4090s is a pretty crazy undertaking when thinking about power and heat, but 8x Studios is pretty doable.

You'd also likely get much better scaling with 8x Studios since there would be no bus / memory contention to deal with.

I wonder how many Studios you'd need in order to beat 8x 4090s. 20? 30? :D
If there is a little hope for daisy chaining, I'm sure I will go for it with a couple of Maxes and Ultras. Wish this could happen in my lifetime, and not just in a far, far away galaxy...
 

mi7chy

macrumors G4
Oct 24, 2014
10,622
11,294
I'd really love to see people start daisy-chaining M2 Ultra Studios as a sort of makeshift render-farm.

Running 8x 4090s is a pretty crazy undertaking when thinking about power and heat, but 8x Studios is pretty doable.

You'd also likely get much better scaling with 8x Studios since there would be no bus / memory contention to deal with.

I wonder how many Studios you'd need in order to beat 8x 4090s. 20? 30? :D

8 x 300W = 2400W spread across two circuit breakers, so 1200W each, and people claim the 4090 at 300W is almost silent, so with good ventilation and spacing it sounds doable. And pretty sure PCIe 4.0 x16 is quite a bit faster than 10Gb Ethernet, but since you volunteered, it'll be interesting to see the final M2 Ultra render farm count. Apple might even sponsor you if you do a YouTube video on it.
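
Checking that power budget against the 80% continuous-load guideline mentioned upthread (assuming 120V/20A circuits; GPU board power only):

```python
# Power budget for the 8-GPU farm above, checked against the 80%
# continuous-load guideline mentioned upthread. GPU board power only;
# CPUs and PSU losses are ignored, so real draw would be higher.

total_watts = 8 * 300.0          # 2400 W
per_circuit = total_watts / 2    # split across two breakers: 1200 W
limit_20a = 120.0 * 20.0 * 0.8   # 1920 W continuous on a 120 V / 20 A circuit
print(f"{per_circuit:.0f} W per circuit vs {limit_20a:.0f} W continuous limit")
```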
 

aytan

macrumors regular
Dec 20, 2022
161
110
I'd really love to see people start daisy-chaining M2 Ultra Studios as a sort of makeshift render-farm.

Running 8x 4090s is a pretty crazy undertaking when thinking about power and heat, but 8x Studios is pretty doable.

You'd also likely get much better scaling with 8x Studios since there would be no bus / memory contention to deal with.

I wonder how many Studios you'd need in order to beat 8x 4090s. 20? 30? :D
Are you suggesting connecting those Studios by Ethernet, or proper daisy chaining over TB ports, with the second, third, etc. machines used only for their GPUs, slaved to the first? I was trying to point out the second, imaginary solution.
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
8 x 300W = 2400W spread across two circuit breakers, so 1200W each, and people claim the 4090 at 300W is almost silent, so with good ventilation and spacing it sounds doable. And pretty sure PCIe 4.0 x16 is quite a bit faster than 10Gb Ethernet, but since you volunteered, it'll be interesting to see the final M2 Ultra render farm count. Apple might even sponsor you if you do a YouTube video on it.
What would you need to be sending over 10Gb ethernet after the initial load?

Let's say you want to render a 10 frame animation on 10 Mac Studios. You have your scene on a shared drive, and each Mac Studio reads the scene into memory (sure, over 10Gb ethernet). Then Mac Studio 1 renders frame 1, Mac Studio 2 renders frame 2, and so on. Congratulations, you've now rendered 10 frames in the time it takes 1 Mac Studio to render 1 frame. Perfect 10x scaling; 100 Studios would give you 100x scaling, etc.

(Obviously this is a very simple load balancing solution and you'd probably want a better one in practice)

Compare that to the PC, where even if you've managed to load your scene into the VRAM of every single card, your single CPU is going to be constantly sending commands back and forth over PCIe to each GPU, whereas with the Macs you've got 10 CPUs, 1 CPU per GPU, so you're never going to get CPU-bottlenecked.
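
A minimal sketch of that static load balancing, with a hypothetical round-robin assignment (real setups would drive the renderer's own network or CLI interface):

```python
# Minimal sketch of the static load balancing described above:
# machine i renders frames i, i+N, i+2N, ... Only the assignment is
# shown; actually rendering would go through the renderer's own
# network/CLI interface.

def assign_frames(num_frames: int, num_machines: int) -> dict[int, list[int]]:
    """Round-robin frames across machines; returns machine -> frame list."""
    jobs: dict[int, list[int]] = {m: [] for m in range(num_machines)}
    for frame in range(num_frames):
        jobs[frame % num_machines].append(frame)
    return jobs

# 10 frames across 10 Studios: one frame each, i.e. the perfect 10x case
# (assuming every frame takes roughly the same time to render).
print(assign_frames(10, 10))
```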
 
  • Haha
Reactions: aytan

jmho

macrumors 6502a
Jun 11, 2021
502
996
Are you suggesting connecting those Studios by Ethernet, or proper daisy chaining over TB ports, with the second, third, etc. machines used only for their GPUs, slaved to the first? I was trying to point out the second, imaginary solution.
The nicest solution would be TB with a first party interface to manage load balancing for you.

The least nice solution would be to not even bother connecting them and just load balance your scene manually - for example if you have 2 studios and want to render a 1000x500 image just render the left 500x500 pixels on Studio 1, and the right 500x500 pixels on Studio 2.
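
A sketch of that manual split, generalized to N machines (the function name and the region format are made up for illustration):

```python
# Sketch of the manual split: carve the image into vertical strips, one
# per machine. split_strips() and the (x0, x1, y0, y1) region format are
# made up for illustration; each machine would render only its region.

def split_strips(width: int, height: int, machines: int):
    """Return one (x0, x1, y0, y1) render region per machine."""
    strip = width // machines
    return [(m * strip,
             (m + 1) * strip if m < machines - 1 else width,  # last strip takes any remainder
             0, height)
            for m in range(machines)]

print(split_strips(1000, 500, 2))
# [(0, 500, 0, 500), (500, 1000, 0, 500)]: left half on Studio 1,
# right half on Studio 2; stitch the strips together afterwards.
```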
 
  • Like
Reactions: aytan