
jujoje

macrumors regular
May 17, 2009
247
288
This is interesting. Thanks for sharing. With the M1 Ultra being good at CPU and not super great at GPU rendering, maybe looking into Arnold rendering is a good idea? 🤷‍♂️

Lentil looks pretty cool!

So far I've been pretty impressed with CPU rendering on the M1 Max; the Ultra should be pretty much twice as fast, which would be impressive.

I still lean towards CPU rather than GPU renderers; while there are certain cases where GPU wins massively (shiny things), I still feel there's a lot to be said for the flexibility you get from CPU renderers. It also depends a lot on whether you need that flexibility, though (80% of the time you don't).
 
  • Like
Reactions: sirio76

leman

macrumors Core
Oct 14, 2008
19,523
19,679
Lentil looks pretty cool!

So far I've been pretty impressed with CPU rendering on the M1 Max; the Ultra should be pretty much twice as fast, which would be impressive.

I still lean towards CPU rather than GPU renderers; while there are certain cases where GPU wins massively (shiny things), I still feel there's a lot to be said for the flexibility you get from CPU renderers. It also depends a lot on whether you need that flexibility, though (80% of the time you don't).

I think the flexibility of GPU renderers will increase going forward. Modern RT GPU pipelines are fully programmable, and at least in the case of Apple Silicon you don't have to worry about GPU NUMA. But it will take time for the software ecosystem to catch up.
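
To make "fully programmable" a bit more concrete, here is a minimal host-side sketch of building a ray-tracing acceleration structure with Metal's API. The function name and its inputs (vertexBuffer, triangleCount) are hypothetical placeholders, not code from any shipping renderer:

```swift
import Metal

// Sketch: build a primitive acceleration structure for one triangle mesh.
// Assumes `vertexBuffer` holds tightly packed SIMD3<Float> positions.
func buildAccelerationStructure(device: MTLDevice,
                                queue: MTLCommandQueue,
                                vertexBuffer: MTLBuffer,
                                triangleCount: Int) -> MTLAccelerationStructure? {
    // Describe the geometry the GPU should build a BVH over.
    let geometry = MTLAccelerationStructureTriangleGeometryDescriptor()
    geometry.vertexBuffer = vertexBuffer
    geometry.vertexStride = MemoryLayout<SIMD3<Float>>.stride
    geometry.triangleCount = triangleCount

    let descriptor = MTLPrimitiveAccelerationStructureDescriptor()
    descriptor.geometryDescriptors = [geometry]

    // The device reports how much memory the build needs.
    let sizes = device.accelerationStructureSizes(descriptor: descriptor)
    guard
        let accel = device.makeAccelerationStructure(size: sizes.accelerationStructureSize),
        let scratch = device.makeBuffer(length: sizes.buildScratchBufferSize,
                                        options: .storageModePrivate),
        let commands = queue.makeCommandBuffer(),
        let encoder = commands.makeAccelerationStructureCommandEncoder()
    else { return nil }

    // Encode the build on the GPU timeline.
    encoder.build(accelerationStructure: accel,
                  descriptor: descriptor,
                  scratchBuffer: scratch,
                  scratchBufferOffset: 0)
    encoder.endEncoding()
    commands.commit()
    commands.waitUntilCompleted()
    return accel
}
```

Once built, ordinary compute (or even fragment) shader code can intersect rays against the structure, which is where the flexibility over a fixed-function pipeline comes from.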
 

jeanlain

macrumors 68020
Mar 14, 2009
2,463
958
Did Apple really state that it was as fast as or faster than the 3090, or rather that at a relative performance of 200, the M1 Ultra uses about 115 watts and the 3090 about 330 watts? The comparative claims are probably more about energy efficiency than performance.
The M1 Ultra shows better performance than the 3090 on the graph Apple showed.
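
Taking the chart figures at face value (an assumption - the numbers are read off Apple's graph), matched performance at those power draws works out to

\[
\frac{330\ \mathrm{W}}{115\ \mathrm{W}} \approx 2.9\times
\]

the performance per watt in the M1 Ultra's favour, which fits the efficiency reading.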
 

ader42

macrumors 6502
Jun 30, 2012
436
390
Ain't no one got time for that. But yeah, I think most benchmark scenes are under 30 min; I guess it'd be useful to see sustained performance and to ensure everything spins up.
I know what you mean, sign of the times I guess. I remember when I used to make 3D animations in the '90s and each frame took an hour to render, so a second of rendered animation a day on my home computer was good going lol - and it wasn't that complex or high-res in today's terms - it was PAL res, so only 768 x 576 pixels.

I used After Effects for compositing; this was when it required a hardware dongle (before Adobe bought them out).

I remember my boss buying a $3,000 gfx card for his PC 3DS Max (2.5?) work too - back in 1999 - I can't remember what it was though.
 
  • Like
Reactions: stevemiller

stevemiller

macrumors 68020
Oct 27, 2008
2,057
1,607
I know what you mean, sign of the times I guess. I remember when I used to make 3D animations in the '90s and each frame took an hour to render, so a second of rendered animation a day on my home computer was good going lol - and it wasn't that complex or high-res in today's terms - it was PAL res, so only 768 x 576 pixels.

I used After Effects for compositing; this was when it required a hardware dongle (before Adobe bought them out).

I remember my boss buying a $3,000 gfx card for his PC 3DS Max (2.5?) work too - back in 1999 - I can't remember what it was though.
I remember the hardware dongle for my 3D Studio for DOS educational copy in the '90s! It might still be sitting in a box somewhere haha.
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
Is Nvidia's Path Tracing relevant or a marketing gimmick?
Path tracing isn't new at all. Doing it in real time on RTX hardware also isn't that new - someone made Quake 2 use path tracing back in 2019 - but I guess path tracing might be the future in 5-10 years' time. Nobody is going to use it today though, because nobody wants to release a game that only runs at 20 fps on a 3090 and has no fallback.

Personally I think Unreal Engine 5's Nanite is far more exciting, and virtualised geometry is going to be the true "future" of real-time graphics. RT cores will still play a huge role in computing lighting though.

The main problem with Nanite is that it is incredibly API-dependent, because it's blending hardware and software rendering, which is why it's probably going to be a while before it gets ported to Metal.

It'll be interesting to see if we get some kind of standardised virtualised geometry implementation in the future, but given Apple's relations with both nVidia and Epic, Apple probably isn't going to be leading the charge there.
 
  • Like
Reactions: Xiao_Xi

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,666
OBX
Path tracing isn't new at all. Doing it in real time on RTX hardware also isn't that new - someone made Quake 2 use path tracing back in 2019 - but I guess path tracing might be the future in 5-10 years' time. Nobody is going to use it today though, because nobody wants to release a game that only runs at 20 fps on a 3090 and has no fallback.

Personally I think Unreal Engine 5's Nanite is far more exciting, and virtualised geometry is going to be the true "future" of real-time graphics. RT cores will still play a huge role in computing lighting though.

The main problem with Nanite is that it is incredibly API-dependent, because it's blending hardware and software rendering, which is why it's probably going to be a while before it gets ported to Metal.

It'll be interesting to see if we get some kind of standardised virtualised geometry implementation in the future, but given Apple's relations with both nVidia and Epic, Apple probably isn't going to be leading the charge there.
Minecraft RTX (Bedrock edition) is fully path traced, though it has the weird limitation of requiring RT to be turned on per world.
 

leman

macrumors Core
Oct 14, 2008
19,523
19,679
The main problem with Nanite is that it is incredibly API-dependent, because it's blending hardware and software rendering, which is why it's probably going to be a while before it gets ported to Metal.

I would have thought that improvementing virtualized geometry would be easier on Apple Silicon - unified memory, large caches, state-of-the-art GPU-driven rendering and compute dispatch, etc. But yeah, if Epic does implement Nanite for the Mac, they would need to start from scratch. Then again, I can imagine that many things will be much easier to do. Mesh streaming? Just mmap the file, make a Metal buffer from the resulting pointer, and let the OS figure out the rest…
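
As a rough illustration of that last idea, here is a minimal sketch of the mmap-plus-no-copy-buffer approach. The function and file path are hypothetical, and a real Nanite-style streamer would need far more on top:

```swift
import Foundation
import Metal

// Sketch: map a (hypothetical) mesh file and wrap it in a Metal buffer
// without copying. Intended for read-only geometry data.
func mapMeshFile(device: MTLDevice, path: String) -> MTLBuffer? {
    let fd = open(path, O_RDONLY)
    guard fd >= 0 else { return nil }
    defer { close(fd) }

    var info = stat()
    guard fstat(fd, &info) == 0 else { return nil }

    // makeBuffer(bytesNoCopy:) wants page-aligned memory and length;
    // mmap returns page-aligned addresses, so just round the length up.
    let pageSize = Int(getpagesize())
    let length = (Int(info.st_size) + pageSize - 1) / pageSize * pageSize

    guard let ptr = mmap(nil, length, PROT_READ, MAP_PRIVATE, fd, 0),
          ptr != MAP_FAILED else { return nil }

    // Unified memory: the GPU reads the same pages the kernel faults in
    // on demand, so nothing is copied up front.
    return device.makeBuffer(bytesNoCopy: ptr,
                             length: length,
                             options: .storageModeShared,
                             deallocator: { pointer, size in munmap(pointer, size) })
}
```

Because the memory is unified, the GPU simply touches whatever pages the kernel pages in - which is exactly the "let the OS figure out the rest" part.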
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
Path tracing isn't new at all. Doing it in real time on RTX hardware also isn't that new - someone made Quake 2 use path tracing back in 2019 - but I guess path tracing might be the future in 5-10 years' time.
Does this mean Apple has a chance to overtake Nvidia and get there first? Could this be the leverage Apple needs to improve gaming and 3D software on macOS?
 

Boil

macrumors 68040
Oct 23, 2018
3,478
3,174
Stargate Command
I used After Effects for compositing; this was when it required a hardware dongle (before Adobe bought them out).
I remember the hardware dongle for my 3D Studio for DOS educational copy in the '90s! It might still be sitting in a box somewhere haha.

I bought into EIAS (Electric Image Animation System) in the mid-1990s (also an educational copy), along with a Power Computing PowerTower Pro 225...

Then life happened, wife & kids & all that; eventually I lost my dongle while moving into a new house, and the wife was not going to let me buy a new license for what was (to her) a part-time hobby...

I would have thought that improvementing virtualized geometry would be easier on Apple Silicon - unified memory, large caches, state-of-the-art GPU-driven rendering and compute dispatch, etc. But yeah, if Epic does implement Nanite for the Mac, they would need to start from scratch. Then again, I can imagine that many things will be much easier to do. Mesh streaming? Just mmap the file, make a Metal buffer from the resulting pointer, and let the OS figure out the rest…

Is that like improving & implementing...? Improvementing...?!? ;^p
 

sunny5

macrumors 68000
Jun 11, 2021
1,838
1,706
Does this mean Apple has a chance to overtake Nvidia and get there first? Could this be the leverage Apple needs to improve gaming and 3D software on macOS?
Almost all 3D software uses only Nvidia GPUs or CUDA. Changing that would be very difficult since all the software is heavily optimized for Nvidia. This is why AMD can't even compete in the 3D market.
 
  • Like
Reactions: iPadified

BootLoxes

macrumors 6502a
Apr 15, 2019
749
897
So has anyone tried UE5 yet, since it's officially out? I heard Lumen was fixed in Preview 2 but I can't find any sources for Macs.
 

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
Almost all 3D software uses only Nvidia GPUs or CUDA. Changing that would be very difficult since all the software is heavily optimized for Nvidia. This is why AMD can't even compete in the 3D market.
Vendor lock-in: always poor for the end user. If Apple ever got as big as NVIDIA in the market, I would dislike it as much as I dislike NVIDIA's current dominance. Did someone say Microsoft in the '90s and '00s?
 

Fallinangel

macrumors regular
Dec 21, 2005
200
20
So has anyone tried UE5 yet, since it's officially out? I heard Lumen was fixed in Preview 2 but I can't find any sources for Macs.
You can look at Atom X Labs' video from yesterday on YouTube, where they take Unreal Engine 5 for a spin on the M1 Max with 64GB memory. The performance is okay; given that UE barely ever worked on my Intel MacBook Pro anyway, I'd say there's some progress. Nothing seems really optimized yet, though.
 

l0stl0rd

macrumors 6502
Jul 25, 2009
483
420
Have you guys seen this?


If this turns out to be accurate then it's huge! It is right there at number 1 on my Apple wishlist*. :eek:

*(actually, right below a shiny 30" Apple Wacom Cintiq killer)
Oh, it might be if true, but do I want to wait another 18 months for the Studio with the M2 Ultra?

Also, he could just be guessing / wishful thinking.

That is like saying it will beat the RTX 4090.
 

Lone Deranger

macrumors 68000
Apr 23, 2006
1,900
2,145
Tokyo, Japan
Oh, it might be if true, but do I want to wait another 18 months for the Studio with the M2 Ultra?

Also, he could just be guessing / wishful thinking.

That is like saying it will beat the RTX 4090.

Personally I think it's less a matter of if and more a matter of when.

It's an inevitable move if they want to compete with nVidia, as the benchmarks and real-world tests between the M1 and RTXs have shown us.
 

leman

macrumors Core
Oct 14, 2008
19,523
19,679
Personally I think it's less a matter of if and more a matter of when.

It's an inevitable move if they want to compete with nVidia, as the benchmarks and real-world tests between the M1 and RTXs have shown us.

Not to mention that Apple is the last major GPU maker without hardware RT support. Even Intel has it now.
 

StudioMacs

macrumors 65816
Apr 7, 2022
1,133
2,270
Appreciate this discussion so far. Lots of good resources and links. I particularly enjoyed reading the Stu Markowitz review of the Mac Studio and Studio Display one of you posted earlier (sorry, forgot who posted it).

I’ve been working in photography and video for a bit, but before that I worked with Lightwave on a Power Macintosh 8500 (with a Voodoo2 graphics card) to extrude 2D logos and render simple 480i 3D animations for commercials and other useless things.

I’m looking forward to getting back into 3D on a Mac again, so I just wanted to pop in and say this discussion has been very informative in getting me back up to speed…
 