
Andropov

macrumors 6502a
May 3, 2012
746
990
Spain
But Nvidia will often have an advantage if the shaders themselves are large/complex, as they have more registers and more compute throughput.
As a side note, Apple puts a lot of emphasis in its talks on a couple of things that mitigate these problems: render passes that use tile memory, and function specialization for pipeline variants.

For IMR GPUs, I guess the way to go if a render pass needs a resource produced in a previous step, and you don't want to go through system memory or VRAM, is to merge the two passes into one. That puts more pressure on registers. But on TBDR GPUs you can run two separate render passes, where the first pass writes to tile memory and the second pass reads from there. Register pressure is minimized because you don't need to keep registers from the previous pass around.
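
In Metal terms, the two phases can share one render command encoder, and the intermediate attachment can be declared memoryless so it only ever lives in tile memory. A minimal sketch of setting that up; the format, attachment index, and sizes here are placeholder assumptions:

```swift
import Metal

// Hypothetical setup: an intermediate render target that exists only in tile
// memory on a TBDR GPU. It is never allocated in (or written back to) system
// memory; the second phase reads it via a [[color(1)]] attachment read.
func makeTilePass(device: MTLDevice, width: Int, height: Int) -> MTLRenderPassDescriptor {
    let texDesc = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .rgba16Float, width: width, height: height, mipmapped: false)
    texDesc.usage = .renderTarget
    texDesc.storageMode = .memoryless  // tile memory only, no RAM backing
    let intermediate = device.makeTexture(descriptor: texDesc)!

    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[1].texture = intermediate
    pass.colorAttachments[1].loadAction = .dontCare   // nothing to fetch from memory
    pass.colorAttachments[1].storeAction = .dontCare  // nothing to store back either
    return pass
}
```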

And the same thing goes for pipeline variants. Branching instructions need registers to track branch directions and loop counters, but when you use a pipeline variant the compiler knows at compile time whether (and when) each branch is taken, so it can optimize those registers and flow-control instructions away.
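
For reference, this is roughly what that looks like with Metal function constants; a minimal sketch, where the shader entry points and the `use_shadows` constant are hypothetical names:

```swift
import Metal

// Hypothetical pipeline specialization: the value of `use_shadows` is baked in
// at pipeline compile time, so the backend can delete the untaken branch along
// with the registers and flow control it would have needed.
func makeSpecializedPipeline(device: MTLDevice, library: MTLLibrary,
                             useShadows: Bool) throws -> MTLRenderPipelineState {
    var value = useShadows
    let constants = MTLFunctionConstantValues()
    constants.setConstantValue(&value, type: .bool, withName: "use_shadows")

    let desc = MTLRenderPipelineDescriptor()
    desc.vertexFunction = try library.makeFunction(name: "vertex_main", constantValues: constants)
    desc.fragmentFunction = try library.makeFunction(name: "fragment_main", constantValues: constants)
    desc.colorAttachments[0].pixelFormat = .bgra8Unorm
    return try device.makeRenderPipelineState(descriptor: desc)
}
```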
 

ader42

macrumors 6502
Jun 30, 2012
436
390
I must say I’ve been loving using ZBrush on my 16” MBP; given how well it works through Rosetta, I’m intrigued to see how good a native version will be when we get it.
 
  • Like
Reactions: jujoje

vinegarshots

macrumors 6502a
Sep 24, 2018
983
1,349
I take your Nvidia's piss poor drivers and raise you Nvidia's piss poor denoising solution:


More seriously, Intel de-noiser now native :)

What kind of nonsense cherry-picked test is that? Totally unrepresentative of real-world performance.

Here's a test I just did myself:

[Attachment: NR.jpg]


The OptiX render and noise-reduction pass was 7 seconds faster than the Intel noise-reduction pass (Intel was 30 seconds, and OptiX was 23).
 

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
What kind of nonsense cherry-picked test is that? Totally unrepresentative of real-world performance.

Here's a test I just did myself:

[Attachment 2005383]

The OptiX render and noise-reduction pass was 7 seconds faster than the Intel noise-reduction pass (Intel was 30 seconds, and OptiX was 23).
...which is also cherry-picked :). A good test would use 10-20 or more scenes that pose different challenges in terms of rendering and de-noising. A good test would also include some metric for the quality of the de-noising. Speed is just one parameter. A subjective "it looks good" is not good enough.
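
For what it's worth, a simple objective metric is PSNR of the denoised frame against a high-sample-count reference render; a minimal sketch, assuming both images are flat linear float buffers of equal size:

```swift
import Foundation

// PSNR in dB: higher means the denoised image is closer to the reference.
// A serious comparison would add perceptual metrics like SSIM on top.
func psnr(reference: [Float], denoised: [Float], maxValue: Float = 1.0) -> Float {
    precondition(reference.count == denoised.count && !reference.isEmpty)
    var mse: Float = 0
    for i in reference.indices {
        let d = reference[i] - denoised[i]
        mse += d * d
    }
    mse /= Float(reference.count)
    guard mse > 0 else { return .infinity }  // images are identical
    return 10 * log10(maxValue * maxValue / mse)
}
```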
 
  • Like
Reactions: Lone Deranger

jmho

macrumors 6502a
Jun 11, 2021
502
996
I believe the point of that image is that nVidia's de-noiser struggles to tell the difference between noise and surface texture in very specific circumstances, so to call it cherry-picked: yeah, that's the point. They're showing that the Intel denoiser can manage special cases that trip up the nVidia one.
 

jujoje

macrumors regular
May 17, 2009
247
288
What kind of nonsense cherry-picked test is that? Totally unrepresentative of real-world performance.

The Intel scene was obviously cherry-picked, because of course they're going to pick the best situation for their denoiser :)

I believe the point of that image is that nVidia's de-noiser struggles to tell the difference between noise and surface texture in very specific circumstances, so to call it cherry-picked: yeah, that's the point. They're showing that the Intel denoiser can manage special cases that trip up the nVidia one.

Anecdotally, my experience echoes this. I did some testing a while back and Intel was definitely giving better results than nVidia, which had trouble with high-frequency and surface details: Intel preserved information better and was closer to the undenoised render. nVidia's was fast, but inaccurate and prone to smearing things. Things might be different now, as this was about a year ago, and perhaps the scenes I was testing favoured Intel, but it was what I had to hand.
 
  • Haha
Reactions: mi7chy

Slartibart

macrumors 68040
Aug 19, 2020
3,145
2,819
Well, here are some impressions of and comments on the M1 Pro and Max from a guy who works as a senior software engineer at Walt Disney Animation Studios on Disney's Hyperion Renderer. It's basically a CPU-rendering comparison using his Takua renderer.

In a nutshell: he expected incredible performance, and was still surprised to find even those high expectations met and surpassed.

Enjoy the images! 🤓
 
Last edited:

jujoje

macrumors regular
May 17, 2009
247
288
As a bit of an aside, it’s worth checking out the other posts on that guy's blog; some really interesting posts there, from technical stuff like SIMD on ARM and x86-64 (tl;dr) to more artistic things like his entries for the RenderMan art challenges. Some really cool stuff :)
 
  • Haha
Reactions: mi7chy

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
Blender will get the Intel denoiser for all GPUs soon. 🤞
  • Intel OpenImageDenoise recently gained GPU support, which should also work on NVIDIA, AMD and Apple devices. Stefan started prototyping an implementation in Cycles. Quality is identical to CPU denoising with improved performance, so this looks very promising.
 

Quixotic3

macrumors member
Mar 25, 2009
62
38
I am waiting to see whether the new Mac Pro will come with M2 Pro/Max/Ultra/Plaid variants and whether or not hardware ray tracing is enabled before I purchase. Do you think that any future hopes of AAA gaming on a Mac depend on this feature?
 

l0stl0rd

macrumors 6502
Jul 25, 2009
483
420
I am waiting to see whether the new Mac Pro will come with M2 Pro/Max/Ultra/Plaid variants and whether or not hardware ray tracing is enabled before I purchase. Do you think that any future hopes of AAA gaming on a Mac depend on this feature?
Similar, as the M1 GPUs are disappointing.

The 32-core barely touches a 5700 XT.

This sums up my feelings pretty much.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,522
19,679
Do you think that any future hopes of AAA gaming on a Mac depends on this feature?

Ray tracing? Hardly. How would hardware ray tracing make these GPUs better for gaming?

Similar, as the M1 GPUs are disappointing.

The 32-core barely touches a 5700 XT.

Sounds about right. The 32-core M1 GPU has the same ~10 TFLOPS and similar effective RAM bandwidth as a 5700 XT. Just in a much smaller form factor and at a fraction of the power consumption.
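
For reference, the rough arithmetic behind that figure, assuming Apple's usual 128 FP32 ALUs per GPU core and the reported ~1.27 GHz clock: 32 cores × 128 ALUs × 2 FLOPs per FMA × 1.27 GHz ≈ 10.4 TFLOPS.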

P.S. And of course, the decisive advantage of Apple GPUs is that they have access to much more RAM and can therefore work on much larger datasets than consumer GPUs.
 
Last edited:
  • Like
Reactions: l0stl0rd

l0stl0rd

macrumors 6502
Jul 25, 2009
483
420
What about the 64 core? Does it touch the 5700 XT, or surpass it and go after the 6700 XT-6800 XT?
I don’t have a 64 core but I have access to a 32 core and a 5700 XT.
Pretty sure the 64 core surpasses the 5700 XT; by how much, or what it compares to, I have no idea.
However, considering the price of the 64 core, I probably would build a PC or Hackintosh.

Nearly 5.6k euros for the 64 core sure is a lot. You could probably build a PC with a 3090 for quite a bit less.

Let's see what they do with the Mac Pro, if they show it at WWDC.
 
Last edited:

MayaUser

macrumors 68040
Nov 22, 2021
3,178
7,204
I don’t have a 64 core but I have access to a 32 core and a 5700 XT.
Pretty sure the 64 core surpasses the 5700 XT; by how much, or what it compares to, I have no idea.
However, considering the price of the 64 core, I probably would build a PC or Hackintosh.

Nearly 5.6k euros for the 64 core sure is a lot. You could probably build a PC with a 3090 for quite a bit less.

Let's see what they do with the Mac Pro, if they show it at WWDC.
Based on your idea, the Mac Pro will also be more expensive, and you could probably just build a PC instead.
 

jujoje

macrumors regular
May 17, 2009
247
288
I am waiting to see whether the new Mac Pro will come with M2 Pro/Max/Ultra/Plaid variants and whether or not hardware ray tracing is enabled before I purchase. Do you think that any future hopes of AAA gaming on a Mac depend on this feature?

I am there for the plaid variant :p

I don’t have a 64 core but I have access to a 32 core and a 5700 XT.
Pretty sure the 64 core surpasses the 5700 XT; by how much, or what it compares to, I have no idea.
However, considering the price of the 64 core, I probably would build a PC or Hackintosh.

Nearly 5.6k euros for the 64 core sure is a lot. You could probably build a PC with a 3090 for quite a bit less.

Let's see what they do with the Mac Pro, if they show it at WWDC.

I generally feel that if you're getting a Mac Pro for GPU rendering you're going to be disappointed. I imagine that the Mac Pro is going to be pretty reasonable in terms of price/performance on the CPU front, but on the GPU front... probably not so much.

I just hope Apple actually detail their GPU benchmarks this time, otherwise we're probably in for months of spurious speculation, trolling, dubious benchmarks, and general hype and hysteria on the forums. Then again, that is how MacRumors rolls.

Yeah, probably, unless it has the same starting price as the base Studio Ultra, but I doubt it.

I dunno, I wouldn't be surprised if the lowest configuration was a few hundred more than the Ultra with similar specs (the cost of the shiny Pro case and I/O), but then went rapidly up from there.
 

vel0city

macrumors 6502
Original poster
Dec 23, 2017
347
510
Came across this breakdown of the MacBook Pro ad graphics. Pretty cool :) Apple stuff starts around 15:11, but the whole presentation is worth watching; some pretty great work :)


Thanks for this, brilliant stuff!
 

JanErik75

macrumors member
Aug 3, 2014
36
43
Norway
WWDC 2022 announcement. The M2 family is coming and it doesn’t have dedicated hardware cores for ray tracing etc. Glad to see Apple still not taking 3D artists seriously 🤡
 
Last edited:

l0stl0rd

macrumors 6502
Jul 25, 2009
483
420
WWDC 2022 announcement. The M2 family is coming and it doesn’t have dedicated hardware cores for ray tracing etc. Glad to see Apple still not taking 3D artists seriously 🤡
Yes, but they are taking gaming seriously, strangely enough.

It was kind of weird to see Resident Evil and No Man's Sky in an Apple keynote.
 

bcortens

macrumors 65816
Aug 16, 2007
1,324
1,796
Canada
Yes, but they are taking gaming seriously, strangely enough.

It was kind of weird to see Resident Evil and No Man's Sky in an Apple keynote.
It’s basically the same GPU core as the A15, so I’m not surprised there are no ray acceleration cores. I wouldn’t expect those until the A16 or later…

Edit: and it looks like we are getting the Apple equivalent to DLSS and whatever AMD is calling their version today…
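
Presumably that's MetalFX. A minimal sketch of its spatial upscaler, assuming the device, command buffer, and a pair of 720p/1440p textures (the output with .renderTarget usage and private storage) already exist:

```swift
import Metal
import MetalFX

// Hypothetical MetalFX spatial upscale from 1280x720 to 2560x1440.
func upscale(device: MTLDevice, commandBuffer: MTLCommandBuffer,
             lowRes: MTLTexture, highRes: MTLTexture) {
    let desc = MTLFXSpatialScalerDescriptor()
    desc.inputWidth = 1280
    desc.inputHeight = 720
    desc.outputWidth = 2560
    desc.outputHeight = 1440
    desc.colorTextureFormat = lowRes.pixelFormat
    desc.outputTextureFormat = highRes.pixelFormat
    desc.colorProcessingMode = .perceptual  // input is already tone mapped

    guard let scaler = desc.makeSpatialScaler(device: device) else { return }
    scaler.colorTexture = lowRes
    scaler.outputTexture = highRes
    scaler.encode(commandBuffer: commandBuffer)  // queue the upscale on the GPU
}
```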
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
The vanilla M2 was never going to be a 3D artist machine. Obviously I'd have loved to see RT cores, but it would also be kinda weird for them to debut on a MacBook Air.

There is a WWDC session on Thursday though called "Maximise your Metal ray tracing performance", so we'll see just how "hardware friendly" that is.

I'll only start freaking out when we see the Mac Pro and it doesn't have RT cores.
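
For context, Metal already exposes a ray tracing API that runs in software (compute) on current Apple GPUs, and hardware RT cores would accelerate these same calls. A minimal sketch of building a primitive acceleration structure, assuming a packed float3 vertex buffer and triangle count already exist:

```swift
import Metal

// Hypothetical acceleration structure build over a triangle mesh. With
// hardware RT this same code path would simply run faster.
func buildAccelerationStructure(device: MTLDevice, queue: MTLCommandQueue,
                                vertexBuffer: MTLBuffer,
                                triangleCount: Int) -> MTLAccelerationStructure? {
    let geometry = MTLAccelerationStructureTriangleGeometryDescriptor()
    geometry.vertexBuffer = vertexBuffer
    geometry.vertexStride = MemoryLayout<SIMD3<Float>>.stride
    geometry.triangleCount = triangleCount

    let desc = MTLPrimitiveAccelerationStructureDescriptor()
    desc.geometryDescriptors = [geometry]

    let sizes = device.accelerationStructureSizes(descriptor: desc)
    guard let accel = device.makeAccelerationStructure(size: sizes.accelerationStructureSize),
          let scratch = device.makeBuffer(length: sizes.buildScratchBufferSize,
                                          options: .storageModePrivate),
          let cmd = queue.makeCommandBuffer(),
          let enc = cmd.makeAccelerationStructureCommandEncoder() else { return nil }

    enc.build(accelerationStructure: accel, descriptor: desc,
              scratchBuffer: scratch, scratchBufferOffset: 0)
    enc.endEncoding()
    cmd.commit()
    return accel  // ready for intersection queries once the build completes
}
```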
 

l0stl0rd

macrumors 6502
Jul 25, 2009
483
420
The vanilla M2 was never going to be a 3D artist machine. Obviously I'd have loved to see RT cores, but it would also be kinda weird for them to debut on a MacBook Air.

There is a WWDC session on Thursday though called "Maximise your Metal ray tracing performance", so we'll see just how "hardware friendly" that is.

I'll only start freaking out when we see the Mac Pro and it doesn't have RT cores.
Yes, I noticed that one and will have a look at it too, I guess.

As for RT cores, well, the M2 got ProRes, which the M1 does not have, so perhaps the M2 Pro or Max will get RT cores as a new thing.
 