
Matty_TypeR

macrumors 6502a
Oct 1, 2016
641
555
UK
There is not one M2 Ultra in the top 100 Wild Life scores. The top score, No. 1, is a 4090 at 272,537; No. 100 is 194,708, and there are no M2 Ultras in the Wild Life Hall of Fame.

Any links to these scores from the M2?
 

thunng8

macrumors 65816
Feb 8, 2006
1,032
417
There is not one M2 Ultra in the top 100 Wild Life scores. The top score, No. 1, is a 4090 at 272,537; No. 100 is 194,708, and there are no M2 Ultras in the Wild Life Hall of Fame.

Any links to these scores from the M2?
No idea; I wasn't making any claims about Wildlife Extreme. Someone previously said the M2 Ultra was equivalent to the 3090 in 3DMark at much lower power usage, and I guess it would have to be a comparison in 3DMark Wild Life or Wild Life Extreme.

Edit: it was @jeanlain

In any case, those top marks are wildly overclocked results, i.e. massive liquid-cooled rigs built for bragging rights.
 

Matty_TypeR

macrumors 6502a
Oct 1, 2016
641
555
UK
I can't find any M2 Ultra scores in Wild Life, and nothing in Wild Life Extreme. The only thing in Wild Life Extreme is an M1 Max with a score of 10,017. So even if the M2 Ultra did 5x that, say 50,000 (which I doubt), the 4090 with a 13900K averages 92,500 in the same test, and overclocked it can hit 107,594.


I just downloaded the iOS version of Wild Life Extreme and looked at the iOS Hall of Fame; I can't find any M2 listed. I'll keep an eye on that, as I don't think the M2 Ultra is going to be anywhere near a 4090.
 

thunng8

macrumors 65816
Feb 8, 2006
1,032
417
I can't find any M2 Ultra scores in Wild Life, and nothing in Wild Life Extreme. The only thing in Wild Life Extreme is an M1 Max with a score of 10,017. So even if the M2 Ultra did 5x that, say 50,000 (which I doubt), the 4090 with a 13900K averages 92,500 in the same test, and overclocked it can hit 107,594.


I just downloaded the iOS version of Wild Life Extreme and looked at the iOS Hall of Fame; I can't find any M2 listed. I'll keep an eye on that, as I don't think the M2 Ultra is going to be anywhere near a 4090.
No one has claimed the M2 Ultra would beat or come close to the 4090 in 3DMark. The claim was that it would be approximately a 3090, which seems accurate.

The M2 Ultra gets approx 46k.


Not sure where you get an average of 92k for the 4090 from. A non-overclocked 4090 should be around 85k.
 

Matty_TypeR

macrumors 6502a
Oct 1, 2016
641
555
UK
OK, so let's say 46k vs 85k: that's still the best part of 2x faster than the M2 Ultra, so nowhere near equal in performance. And I think the 92k was an OEM 4090, a bit faster than a reference 4090. And that M2 Ultra result is with 72 GPU cores, so at 144 GPU cores it might equal a 4090.

When the M3 comes out it might match it, but it will have to deal with the 5090, which forecasts say is over 50% faster than the 4090, so we shall see :) And this is desktop GPUs; Nvidia workstation GPUs could even spank the desktop versions. SLI is not dead yet.
 

smckenzie

macrumors member
May 7, 2022
97
106
The M2 Ultra is still miles off for 3D rendering. A fellow Redshift user just got one (60-core GPU), and it renders the Redshift benchmark in over 5 minutes:

Redshift 3.5.16 (macOS)
CPU: 24 threads, 0.00 GHz, 64.00 GB
GPU(s): [Apple M2 Ultra 48 GB 0.080ms]
Blocksize: 128
Time: 00h:05m:23s

Changing the block size improves times a bit:

Redshift 3.5.16 (macOS)
CPU: 24 threads, 0.00 GHz, 64.00 GB
GPU(s): [Apple M2 Ultra 48 GB 0.082ms]
Blocksize: 512
Time: 00h:04m:45s

My 7,1 with 2x 6800 Duo will do the same test in 2:03. If I throw a 6900XT in there as well, that will drop to 1:30.

4090s are doing this test in just under a minute, even less with dual setups.

The verdict seems to be it's great for look-dev stuff but still disappointing on final render times.
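As a quick sanity check on those render times, here is a small Python sketch that converts the figures quoted above to seconds and compares them. The times are the ones posted in this thread, not fresh measurements.

```python
# Compare the Redshift benchmark times quoted above.

def to_seconds(t: str) -> int:
    """Convert an 'mm:ss' or 'hh:mm:ss' string to seconds."""
    secs = 0
    for part in t.split(":"):
        secs = secs * 60 + int(part)
    return secs

# Times as reported in the thread.
times = {
    "M2 Ultra (blocksize 128)": "5:23",
    "M2 Ultra (blocksize 512)": "4:45",
    "2x 6800 Duo": "2:03",
    "2x 6800 Duo + 6900XT": "1:30",
}

# Use the faster M2 Ultra run as the baseline.
baseline = to_seconds(times["M2 Ultra (blocksize 512)"])
for name, t in times.items():
    s = to_seconds(t)
    print(f"{name}: {s}s ({baseline / s:.2f}x vs M2 Ultra @ 512)")
```

On these figures the two 6800 Duos come out roughly 2.3x faster than the M2 Ultra's best run, and adding the 6900XT pushes that past 3x.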
 

jeanlain

macrumors 68020
Mar 14, 2009
2,454
948
So the M2 Ultra is running 3DMark in a Windows VM? And it's on par with a 7900XT and 3090? I'd like to see those results if you have a link. The CPU score alone would be interesting.
No, it's running the iPad app, which renders at the same quality and resolution as the PC version.

The score is 46185, which is a bit better than the average of the 7900XT and 3090 scores.

The 4090 is much faster. I don't think we should compare the M2 Ultra to that card just because it is the best Apple offers; Apple didn't make the comparison this time.
I find the 4090 scores suspicious. OK, it's fast, but is it really twice as fast as the 3090?
 

goMac

Contributor
Apr 15, 2004
7,663
1,694
I find the 4090 scores suspicious. OK, it's fast, but is it really twice as fast as the 3090?

It is. The 4090 had a process update over the 3090, so they bumped the power and made it more efficient.

That also means that cards like the 4080 and 4070 are also in 3090 territory while being more mid-level cards. So 3090 performance is current-gen mid-level performance on the PC side.
 

thunng8

macrumors 65816
Feb 8, 2006
1,032
417
It is. The 4090 had a process update over the 3090, so they bumped the power and made it more efficient.

That also means that cards like the 4080 and 4070 are also in 3090 territory while being more mid-level cards. So 3090 performance is current-gen mid-level performance on the PC side.
The 4090 is a 450W part, while the M2 Ultra's GPU portion is approx 70-80W.

For 3D raster performance it cannot match the 4090, but it is more than 50% of the speed at 1/6 the power.

The 4070 Ti is a 280W part and has the same performance as the M2 Ultra in this benchmark.

Another data point is GFXBench:


The M2 Ultra matches the RTX 4080 and is approx 65% of the 4090.
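The efficiency claim in this post can be checked with quick arithmetic. A sketch, using only the rounded figures quoted in this thread (85k for a stock 4090, 46k for the M2 Ultra, 450W / ~75W / 280W power figures), not independent measurements:

```python
# Perf-per-watt comparison from the thread's own numbers.
cards = {
    "RTX 4090":    {"score": 85_000, "watts": 450},
    "M2 Ultra":    {"score": 46_000, "watts": 75},   # GPU portion only, per the post
    "RTX 4070 Ti": {"score": 46_000, "watts": 280},
}

for name, c in cards.items():
    print(f"{name}: {c['score'] / c['watts']:.0f} points per watt")

# Relative speed and power, M2 Ultra vs 4090.
rel_speed = cards["M2 Ultra"]["score"] / cards["RTX 4090"]["score"]  # ~0.54
rel_power = cards["M2 Ultra"]["watts"] / cards["RTX 4090"]["watts"]  # ~0.17, i.e. ~1/6
print(f"M2 Ultra: {rel_speed:.0%} of the speed at {rel_power:.0%} of the power")
```

On these figures the "more than 50% of the speed for 1/6 the power" claim holds, which works out to roughly 3x the points per watt of the 4090.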
 

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,793
The 4090 is a 450W part, while the M2 Ultra's GPU portion is approx 70-80W.

For 3D raster performance it cannot match the 4090, but it is more than 50% of the speed at 1/6 the power.

The 4070 Ti is a 280W part and has the same performance as the M2 Ultra in this benchmark.

Another data point is GFXBench:


The M2 Ultra matches the RTX 4080 and is approx 65% of the 4090.

It doesn't match the ancient AMD 6900XT.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,454
948
It doesn't match the ancient AMD 6900XT.
The 6900XT is slower in the 4K offscreen test, if you consider that the vast majority of results indicate about 270-280 fps, and that the outliers are probably overclocked cards or bogus results.
That 336 fps result, which you are probably referring to, would make the 6900XT faster than its successor (the 7900XT), which isn't faster than the M2 Ultra either.
 

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,793
The 6900XT is slower in the 4K offscreen test, if you consider that the vast majority of results indicate about 270-280 fps, and that the outliers are probably overclocked cards or bogus results.
That 336 fps result, which you are probably referring to, would make the 6900XT faster than its successor (the 7900XT), which isn't faster than the M2 Ultra either.

Let's actually look at reality. It fails to do better on Apple's own Metal score. Again. Let me repeat that: it's APPLE'S OWN METAL, and the M2 Ultra fails to do better at it.

One more time for good measure: the 6900XT, an old card, does better at Apple's own Metal standard (which they endlessly tout) than the top-end M2 Ultra.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,454
948
We're not talking about the same thing. You were replying to a post about GFXBench, which itself followed a post about 3DMark. We were talking about rasterization, not GPGPU work.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,454
948
Redshift 3.5.16 (macOS)
CPU: 24 threads, 0.00 GHz, 64.00 GB
GPU(s): [Apple M2 Ultra 48 GB 0.080ms]
Blocksize: 128
Time: 00h:05m:23s

changing the block size improves times a bit

Redshift 3.5.16 (macOS)
CPU: 24 threads, 0.00 GHz, 64.00 GB
GPU(s): [Apple M2 Ultra 48 GB 0.082ms]
Blocksize: 512
Time: 00h:04m:45s

My 7,1 with 2x 6800 Duo will do the same test in 2:03.
Two 6800 Duos, that's 4 GPUs, right? They would represent 60 TFLOPS of single-precision compute power.
I'm not surprised they perform better than the M2 Ultra (< 30 TFLOPS).

Looking at the Blender results on opendata.blender.org, the M2 Ultra (76 cores) performs as well as the 7900XT (median score) and quite a bit better than the 6900XT. Maybe Blender doesn't work well with AMD cards...
It's also about twice as fast as the M1 Ultra (64 cores).
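The compute comparison in this post can be sketched in a few lines. This is back-of-envelope only: the ~15 TFLOPS per-GPU figure is inferred from the 60 TFLOPS total quoted above for four GPUs, and 27.2 TFLOPS is an often-cited figure for the 76-core M2 Ultra, not something established in this thread.

```python
# Rough FP32 compute comparison: two Radeon 6800 Duos vs one M2 Ultra.
num_gpus = 4                      # two Duo cards, two GPUs each
per_gpu_tflops = 60 / num_gpus    # implied by the quoted 60 TFLOPS total
m2_ultra_tflops = 27.2            # assumed figure for the 76-core M2 Ultra

ratio = (num_gpus * per_gpu_tflops) / m2_ultra_tflops
print(f"{ratio:.1f}x more raw FP32 compute than the M2 Ultra")
```

On those assumptions the four-GPU setup has roughly 2.2x the raw FP32 throughput, so the faster Redshift times are unsurprising even before API or driver differences are considered.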
 

prefuse07

Suspended
Jan 27, 2020
895
1,073
San Francisco, CA
Two 6800 Duos, that's 4 GPUs, right? They would represent 60 TFLOPS of single-precision compute power.
I'm not surprised they perform better than the M2 Ultra (< 30 TFLOPS).

Looking at the Blender results on opendata.blender.org, the M2 Ultra (76 cores) performs as well as the 7900XT (median score) and quite a bit better than the 6900XT. Maybe Blender doesn't work well with AMD cards...
It's also about twice as fast as the M1 Ultra (64 cores).

Except they don't function as multiple GPUs when running benchmarks. Feel free to look that up yourself.

Further, here is a direct LINK to my RX-6800XT, which is also scoring high, since I know you did not read this thread from the beginning.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,454
948
I will repeat, since you lack reading comprehension:
I indeed have no idea what you're trying to prove, but I don't think it relates to my reading comprehension.
Your first link has no relevance to our discussion about Redshift, as it shows how the card performs under Boot Camp.
The second video shows two Radeon 6800 Duos performing much better than a single one under Redshift. You think Redshift can take advantage of two separate cards but cannot use the two GPUs on a single card?

Also look at the chart below. Is a single Radeon Pro W6800X GPU supposed to be faster than a Radeon Pro W6900X GPU?


[Attached chart: pro-w6800x-quadruple-redshift.png]
 

prefuse07

Suspended
Jan 27, 2020
895
1,073
San Francisco, CA
I indeed have no idea what you're trying to prove, but I don't think it relates to my reading comprehension.
Your first link has no relevance to our discussion about Redshift, as it shows how the card performs under Boot Camp.
The second video shows two Radeon 6800 Duos performing much better than a single one under Redshift. You think Redshift can take advantage of two separate cards but cannot use the two GPUs on a single card?

Thanks for totally missing the point; have fun at the Apple Kool-Aid table. I'm not going to waste my time.

Also look at the chart below. Is a single Radeon Pro W6800X GPU supposed to be faster than a Radeon Pro W6900X GPU?


[Attached chart: pro-w6800x-quadruple-redshift.png]

Wow, you really are confused! Please quote exactly when and where I mentioned that a single W6800X is faster than a W6900X. I am waiting for the direct quote showing me saying something so stupid... Come on, let's see it 🤦‍♂️


And since you won't provide any such quote (because I never said it), stop derailing this thread and let's keep it on topic.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,454
948
Wow, you really are confused! Please quote exactly when and where I mentioned that a single W6800X is faster than a W6900X. I am waiting for the direct quote showing me saying something so stupid... Come on, let's see it 🤦‍♂️
The thing you said was that the duo cards "don't function as multiple GPUs when running benchmarks".
What were you trying to say? That a W6800X Duo uses a single GPU during benchmarks, or that both GPUs are seen as a single one by the system? If you meant the latter, how is it relevant to the original post comparing the M2 Ultra to two W6800X Duos? That's still comparing a ~100W GPU (the M2 Ultra) to two monsters that consume many times the power and cost more than a Mac Studio. Who cares whether a card functions as multiple GPUs or not?
If you meant the former, the results I showed prove you wrong, because a single W6800X GPU cannot be faster than a W6900X GPU. It means both GPUs of the Duo are used in Redshift.
 