
leman

macrumors Core
Oct 14, 2008
19,521
19,675
You generally won't use the Metal Shading Language directly for ray tracing (unless you want to do it all manually). The easiest way is to use the MetalPerformanceShaders framework with its own acceleration structures:


There are also some lower-level functions specifically for adding acceleration structures and intersection functions to a commandEncoder (but they're probably just using the MPS shaders, I'd guess).

MPS was the early high-level RT API but Apple has introduced a full-fledged programmable RT API that lets you combine rasterization and RT or implement your custom behavior on the basis of RT primitives. From what I understand, the new API supersedes MPS RT and advanced applications should use the new RT API going forward.
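To give a rough idea of what that looks like from the shader side, here is a minimal sketch of a compute kernel using the new API (untested and simplified: a toy camera, and the acceleration structure is assumed to be built host-side and simply bound to the kernel):

Code:
#include <metal_stdlib>
using namespace metal;
using namespace raytracing;

// Minimal sketch: one primary ray per pixel against a primitive
// acceleration structure that was built and bound on the CPU side.
kernel void traceKernel(primitive_acceleration_structure accel [[buffer(0)]],
                        texture2d<float, access::write> output [[texture(0)]],
                        uint2 tid [[thread_position_in_grid]])
{
    if (tid.x >= output.get_width() || tid.y >= output.get_height())
        return;

    // Toy pinhole camera, just for illustration.
    float2 uv = (float2(tid) + 0.5f) / float2(output.get_width(), output.get_height());

    ray r;
    r.origin       = float3(0.0f, 1.0f, -3.0f);
    r.direction    = normalize(float3(uv * 2.0f - 1.0f, 1.0f));
    r.min_distance = 0.001f;
    r.max_distance = INFINITY;

    // The intersector is used much like a sampler: hand it a ray and an
    // acceleration structure, get the nearest hit back.
    intersector<triangle_data> isect;
    auto hit = isect.intersect(r, accel);

    float3 color = float3(0.0f);
    if (hit.type == intersection_type::triangle) {
        // Shade however you like; barycentrics are used as a stand-in here.
        float2 bc = hit.triangle_barycentric_coord;
        color = float3(bc, 1.0f - bc.x - bc.y);
    }
    output.write(float4(color, 1.0f), tid);
}

Everything around the intersect() call (ray generation, shading, any custom intersection functions) is ordinary compute code, which is what makes it straightforward to mix with rasterization or to build your own behavior on top of the RT primitives.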
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Says the same guy who referred to Apple's own marketing slides pre-release of the M1s as if those slides provided conclusive evidence of their performance... which has now been thoroughly debunked.
“Debunked” if you weren’t grounded in reality in the first place. I think it was obvious to anyone with realistic expectations that the Max would perform similarly only in best-case scenarios. That’s the way marketing works, unfortunately.

I’m sure that in the tests Nvidia ran for their marketing, their GPUs perform similarly to their claims. But that’s the best-case scenario.
 
  • Like
Reactions: sirio76

jeanlain

macrumors 68020
Mar 14, 2009
2,459
953
Apple's marketing slides were actually fairly accurate.
I find claims regarding Apple GPU performance dubious or misleading.
Apple's "industry standard" benchmarks were not disclosed, so the methodology is even more obscure than nVidia's. I suspect Apple used the most favourable benchmark app, i.e., GFXBench.

And what about the claim that the M1 Max is 3X faster than the RTX 3080 on battery power? Has anyone replicated that result?

Not sure why you think nVidia does marketing BS and Apple doesn't. nVidia picked Octane and Redshift, which have Metal versions.
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
MPS was the early high-level RT API but Apple has introduced a full-fledged programmable RT API that lets you combine rasterization and RT or implement your custom behavior on the basis of RT primitives. From what I understand, the new API supersedes MPS RT and advanced applications should use the new RT API going forward.

I'd be very surprised if MPS wasn't built on top of the new RT API.

I mean at the core both of them are just generating compute kernel code for you. It just depends how much control you need. Obviously if you want to do hybrid rendering you'll need the new API features, but if you just want to make a simple path tracer MPS should be enough.

I don't know 100% though, MPS could be on its way out but I haven't seen anything about it.
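For reference, the MPS route is entirely buffer-driven: your own kernel writes rays into a buffer, MPSRayIntersector fills an intersection buffer, and another of your kernels does the shading. Roughly along these lines on the shader side (just a sketch; the structs are meant to mirror the MPS origin/min-distance/direction/max-distance ray type and the distance/primitive-index/coordinates intersection type, and the camera is made up for illustration):

Code:
#include <metal_stdlib>
using namespace metal;

// Meant to match MPSRayOriginMinDistanceDirectionMaxDistance
// (the intersector is configured with that ray data type on the CPU side).
struct Ray {
    packed_float3 origin;
    float         minDistance;
    packed_float3 direction;
    float         maxDistance;
};

// Meant to match MPSIntersectionDistancePrimitiveIndexCoordinates.
struct Intersection {
    float  distance;        // negative when the ray missed
    uint   primitiveIndex;
    float2 coordinates;     // barycentric coordinates of the hit
};

// Fill the ray buffer that MPSRayIntersector will consume.
kernel void generateRays(device Ray *rays          [[buffer(0)]],
                         constant uint2 &imageSize [[buffer(1)]],
                         uint2 tid [[thread_position_in_grid]])
{
    if (tid.x >= imageSize.x || tid.y >= imageSize.y) return;

    float2 uv = (float2(tid) + 0.5f) / float2(imageSize) * 2.0f - 1.0f;

    device Ray &r = rays[tid.y * imageSize.x + tid.x];
    r.origin      = float3(0.0f, 1.0f, -3.0f);
    r.direction   = normalize(float3(uv, 1.0f));
    r.minDistance = 0.0f;
    r.maxDistance = INFINITY;
}

// After the intersector has run, shade from the intersection buffer.
kernel void shadeHits(device const Intersection *hits        [[buffer(0)]],
                      constant uint2 &imageSize              [[buffer(1)]],
                      texture2d<float, access::write> output [[texture(0)]],
                      uint2 tid [[thread_position_in_grid]])
{
    if (tid.x >= imageSize.x || tid.y >= imageSize.y) return;

    Intersection hit = hits[tid.y * imageSize.x + tid.x];
    float3 color = (hit.distance >= 0.0f)
        ? float3(hit.coordinates, 1.0f)   // crude: visualise barycentrics
        : float3(0.0f);                   // miss
    output.write(float4(color, 1.0f), tid);
}

On the CPU side you would build an MPSTriangleAccelerationStructure and call encodeIntersection between those two dispatches; roughly speaking, the newer API folds all of that buffer plumbing into a single intersect() call inside the kernel.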
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
I find claims regarding Apple GPU performance dubious or misleading.
Apple's "industry standard" benchmarks were not disclosed, so the methodology is even more obscure than nVidia's. I suspect Apple used the most favourable benchmark app, i.e., GFXBench.

Oh, I completely agree that Apple's GPU performance curves were entirely silly. Still, their claims do hold up for tasks like photo/video editing and synthetic gaming benchmarks and have been verified multiple times, so I was surprised that someone would describe them as “debunked”.

Regarding compute and RT performance, well, those things were known long before M1 Max was even released. You can’t really expect Apple to compete on trivialish workloads where Nvidia has much more computational resources and dedicated RT hardware. Again, no surprises here.

I'd be very surprised if MPS wasn't built on top of the new RT API.

I mean at the core both of them are just generating compute kernel code for you. It just depends how much control you need. Obviously if you want to do hybrid rendering you'll need the new API features, but if you just want to make a simple path tracer MPS should be enough.

I don't know 100% though, MPS could be on its way out but I haven't seen anything about it.

MPS predates the RT API. It was Apple's first attempt to offer RT on the GPU and is less flexible than the new RT API. And sure, it just generates shader code. Just like OptiX or DX12 RT :)
 

jeanlain

macrumors 68020
Mar 14, 2009
2,459
953
Oh, I completely agree that Apple's GPU performance curves were entirely silly. Still, their claims do hold up for tasks like photo/video editing and synthetic gaming benchmarks and have been verified multiple times, so I was surprised that someone would describe them as “debunked”.
How could one verify Apple's claims without knowing the methodology they used?
Apple can only blame themselves if people think they have been debunked by nVidia.
They could at least have said "using Apple Silicon native gaming benchmark tools" (which means GFXBench, 3DMark Wild Life and/or Basemark GPU), but even this was too much to ask apparently.
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
MPS predates the RT API. It was Apple's first attempt to offer RT on the GPU and is less flexible than the new RT API. And sure, it just generates shader code. Just like OptiX or DX12 RT :)
OptiX and DX12 RT don't generate shader code, at least not directly. They talk to the driver, which knows what to do for the particular hardware; that could mean generating pure shader code, or it could involve the RT / RA cores on Nvidia / AMD cards (and whatever Intel are doing with their cards).

Obviously this is incredibly pedantic, because if/when Apple gets ray tracing cores, their new API will allow them to swap out the shader code generation for a proper RT hardware interface. I guess this is the big reason for the API being there today, even though without RT hardware it's not going to be any faster than the "old" MPS code. I mean, nobody is actually going to do hybrid ray tracing on Mac at the moment (unless they're sitting on an NDA and know what's coming down the pipeline).

Sure, MPS predates the RT API but that doesn't mean much. MPS is exactly what it says on the tin, a bunch of compute shaders that you're free to use to save you time - all wrapped up in a framework so they can change the underlying implementation without breaking anything.

No matter which path devs have chosen for ray tracing, when hardware support comes it should hopefully all "just work".
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
OptiX and DX12 RT don't generate shader code, at least not directly. They talk to the driver, which knows what to do for the particular hardware; that could mean generating pure shader code, or it could involve the RT / RA cores on Nvidia / AMD cards (and whatever Intel are doing with their cards).

I do not know how Nvidia does it, but on AMD hardware at least, RT functionality is invoked using special GPU instructions. I would imagine that Nvidia does something similar. After all, RT functionality is tightly interwoven with the general-purpose shader cores: the RT hardware might do the hierarchy traversal, but all the other processing (ray generation, shading, custom intersection functions, etc.) runs as regular shader code. The API design reflects this fairly well.

Overall, RT probably works similarly to texturing: you have some special dedicated instructions that invoke the respective coprocessor (be it RT hardware or texturing hardware), and the thread is suspended until the result of the operation is ready.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
I do not know how Nvidia does it, but on AMD hardware at least, RT functionality is invoked using special GPU instructions. I would imagine that Nvidia does something similar. After all, RT functionality is tightly interwoven with the general-purpose shader cores: the RT hardware might do the hierarchy traversal, but all the other processing (ray generation, shading, custom intersection functions, etc.) runs as regular shader code. The API design reflects this fairly well.

Overall, RT probably works similarly to texturing: you have some special dedicated instructions that invoke the respective coprocessor (be it RT hardware or texturing hardware), and the thread is suspended until the result of the operation is ready.
I am not sure that is 100% accurate. If I am not mistaken, Nvidia hardware is able to process shader code while ray testing is occurring (AMD's cannot), which is why Nvidia's design is faster than AMD's at accelerating RT (even Turing is faster).
 
  • Like
Reactions: jmho

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
I am not sure that is 100% accurate. If I am not mistaken, Nvidia hardware is able to process shader code while ray testing is occurring (AMD's cannot), which is why Nvidia's design is faster than AMD's at accelerating RT (even Turing is faster).
That is not to say that threads don’t stall on Nvidia's design, but if they do, it doesn’t seem to bother the hardware as much as it does AMD's.
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
I am not sure that is 100% accurate. If I am not mistaken, Nvidia hardware is able to process shader code while ray testing is occurring (AMD's cannot), which is why Nvidia's design is faster than AMD's at accelerating RT (even Turing is faster).
This is presumably why Nvidia needed a completely new API in OptiX, instead of just adding ray traversal functions directly to CUDA.

Plus OptiX looks like the simplest of all RT APIs, which is very nice (for Nvidia), and probably explains why OptiX is so well supported.

The downside is that Metal's RT API does work a lot like, as Leman says, a texture sampler: you have a compute kernel, pass in a bunch of ray tracing objects, call the intersect function on an intersector, and wait for the result. This probably means that if Apple does get RT cores, they'll block like AMD's instead of being like Nvidia's, which operate asynchronously and then trigger shading code when they complete.
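For illustration, the "sampler state" lives on the intersector object itself. A shadow-ray style query, for example, would look roughly like this (just a sketch; the helper name and ray setup are made up, and the acceleration structure is assumed to be bound as in a normal intersection kernel):

Code:
#include <metal_stdlib>
using namespace metal;
using namespace raytracing;

// Hypothetical helper (name made up): returns true if any geometry lies
// between a hit point and a light source. accept_any_intersection() puts the
// intersector into "any hit" mode, the usual setting for shadow rays.
bool shadowed(float3 hitPos, float3 toLight, float lightDistance,
              primitive_acceleration_structure accel)
{
    ray shadowRay;
    shadowRay.origin       = hitPos + toLight * 1e-3f;   // nudge off the surface
    shadowRay.direction    = toLight;
    shadowRay.min_distance = 0.0f;
    shadowRay.max_distance = lightDistance;

    intersector<triangle_data> shadowProbe;
    shadowProbe.accept_any_intersection(true);   // sampler-like state on the intersector

    auto hit = shadowProbe.intersect(shadowRay, accel);
    return hit.type != intersection_type::none;
}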
 

Digital_Sousaphone

macrumors member
Jun 10, 2019
64
63
What are you dragging this marketing BS here for? Anyone can draw slides with any numbers they want. This is utterly meaningless without detailed methodology and benchmark breakdown.

Apple's marketing slides were actually fairly accurate. Do you refer to anything specific?

Oh, I completely agree that Apple's GPU performance curves were entirely silly. Still, their claims do hold up for tasks like photo/video editing and synthetic gaming benchmarks and have been verified multiple times, so I was surprised that someone would describe them as “debunked”.
The RTX slide was absolutely the most talked-about slide on this forum. It created a lot of chatter and "what ifs" about how that chip would scale into the higher-tiered chips. Where do you think the OMG LETS GAME ON MACS threads sprouted from? You've got quite the short memory. None of that slide's info came to fruition. The highest-tiered M1 chip is no better than a 1050 Ti in real-world tests... You and your fellow fanboys' theories about how it would scale, as well as the claimed performance in the slide, were thoroughly debunked with their releases.
 
Last edited:
  • Haha
Reactions: robco74

leman

macrumors Core
Oct 14, 2008
19,521
19,675
The RTX slide was absolutely the most talked-about slide on this forum. It created a lot of chatter and "what ifs" about how that chip would scale into the higher-tiered chips. Where do you think the OMG LETS GAME ON MACS threads sprouted from? You've got quite the short memory. None of that came to fruition... you and your fellow fanboys' theories about how it would scale were thoroughly debunked with their releases.

Gaming performance of the M1 Pro and M1 Max is exactly as predicted over a year ago. The M1 Max is between the mobile 3070 and 3080 in GFXBench and 3DMark. Not quite sure what you think was “debunked” and how. The problem with gaming on Mac is the lack of high-quality games, that’s it.
 

bcortens

macrumors 65816
Aug 16, 2007
1,324
1,796
Canada
Gaming performance of the M1 Pro and M1 Max is exactly as predicted over a year ago. The M1 Max is between the mobile 3070 and 3080 in GFXBench and 3DMark. Not quite sure what you think was “debunked” and how. The problem with gaming on Mac is the lack of high-quality games, that’s it.
High-quality, well-optimized games: Baldur's Gate 3 appears to make much better use of the hardware (performance and visual fidelity attained) than does something like the Tomb Raider franchise.
 

LinkRS

macrumors 6502
Oct 16, 2014
402
331
Texas, USA
Gaming performance of the M1 Pro and M1 Max is exactly as predicted over a year ago. The M1 Max is between the mobile 3070 and 3080 in GFXBench and 3DMark. Not quite sure what you think was “debunked” and how. The problem with gaming on Mac is the lack of high-quality games, that’s it.
This seems a bit off-topic for this post, but I'll bite. I have an almost 3-week-old 16" MacBook Pro with the M1 Max 32-core GPU. It replaced a 2019 16" MacBook Pro with the 8GB AMD 5500 GPU. My MBP is not my primary gaming machine (I have a desktop with an RTX 3080 for that), but I do play on it. My primary Mac game is Diablo 3, and I also run Unreal Engine for hobbyist dev work. Diablo 3 is undoubtedly an OpenGL-based game, and it runs horribly on the 2019 system. The framerate is OK (between 40 and 70 fps at "default" settings), but the biggest issue (and why I say horribly) is texture corruption. I chalk this up to the deprecated OpenGL support in macOS. Apple has not had very good OpenGL drivers for a while now, as they have been pushing Metal (which they should). I was pleasantly surprised by my M1 Max system, as Diablo 3 (under Rosetta, as Diablo 3 has not been released as a universal binary) runs at the same framerate. But here is the kicker: the texture corruption is gone! I assume that either the OpenGL drivers for the M1 Max GPU are better than the AMD drivers, or somehow OpenGL is being translated to Metal under Rosetta 2? Regardless, it is a better experience. Unreal Engine 4.27 also runs just about as well on the M1 Max as it did on the 2019, and like Diablo 3 it too runs under Rosetta right now. I do not actually know whether UE is using OpenGL or Metal on macOS.

My main point is that in my real-world usage, my M1 Max GPU runs about the same as my AMD 5500 did. This is of course my workload. If I were basing my expectations off artificial benchmarks like 3DMark and GFXBench (for example, my GeekBench comparison https://browser.geekbench.com/v5/compute/compare/1007762?baseline=3912042), I should expect the M1 Max to be almost 50% faster, and I simply don't see it.

Don't get me wrong, I am very happy with my M1 Max so far, but I do not think it is ready for any sort of 3D rendering. For that, I'll stick with my 3080-equipped desktop :). Once Blender is out of alpha/beta, I may give that a try, but I still expect my desktop to be an order of magnitude faster.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
High quality well optimized games, Bauldurs gate 3 appears to make much better use of the hardware (performance and visual fidelity attained) than does something like the the Tomb Raider franchise.

Pretty much this. I mean, let’s be honest, gaming on Mac is in a fairly pitiful state, but if we want to talk about the actual capabilities of this new hardware (as opposed to how poorly it is utilized by selected software), the comparison point must be the best case, not the worst case. If a modern benchmark shows that the M1 Max can perform on par with an RTX 3070, well, that’s much more useful information than the fact that some old game performs poorly on M1 Max.

I assume that either the OpenGL drivers for the M1 Max GPU are better than the AMD drivers, or somehow OpenGL is being translated to Metal under Rosetta 2? Regardless, it is a better experience. Unreal Engine 4.27 also runs just about as well on the M1 Max as it did on the 2019, and like Diablo 3 it too runs under Rosetta right now. I do not actually know whether UE is using OpenGL or Metal on macOS.

Apple Silicon systems do not have any OpenGL drivers; they implement OpenGL as a compatibility library on top of Metal (plus some undocumented extensions for implementing things that Metal does not officially expose). It is also my experience that OpenGL games run much better on ARM Macs than they ever did with native OpenGL drivers. For example, 7 Days to Die (not a demanding game by any means, but fairly amateurishly coded) suffered from micro-stuttering on all Intel Mac hardware I ever tried it with. On ARM Macs it runs smooth as butter.

Don't get me wrong, I am very happy with my M1 Max so far, but I do not think it is ready for any sort of 3D rendering. For that, I'll stick with my 3080-equipped desktop :). Once Blender is out of alpha/beta, I may give that a try, but I still expect my desktop to be an order of magnitude faster.

Apple G13 has no chance of competing in RT workloads against Nvidia GPUs that have hardware RT acceleration; it is simply not realistic, at least not in simple scenes. Once we get to much more complex scenes where Apple can play to its memory and cache size advantage, maybe. I still haven't seen any tests of Blender 3.1 alpha with work that people actually do; it's mostly standard Blender benchmarks at low resolutions that make very little sense to me (but then again I'm not an artist).
 
  • Like
Reactions: JMacHack

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
The RTX slide was absolutely the most talked-about slide on this forum. It created a lot of chatter and "what ifs" about how that chip would scale into the higher-tiered chips. Where do you think the OMG LETS GAME ON MACS threads sprouted from? You've got quite the short memory. None of that slide's info came to fruition. The highest-tiered M1 chip is no better than a 1050 Ti in real-world tests... You and your fellow fanboys' theories about how it would scale, as well as the claimed performance in the slide, were thoroughly debunked with their releases.
The M1 was competing with the 1050 Ti, not the Pro or Max (lol, highest tier).

And anybody outside your strawman knew that the Max would perform where it was predicted to in Metal-optimized situations. The “OMG GAME ON MAC” discussion has revolved around games being ported to Metal so that performance can be achieved (though I shouldn’t expect you to do much reading).

Yes, we know that the M1 Max doesn’t perform like a 3080 in every case. It performs, as expected, in workloads that leverage its inherent advantages.

Next time, read actual discussions before you shriek about fanbois.
 

sirio76

macrumors 6502a
Mar 28, 2013
578
416
I still haven't seen any tests of Blender 3.1 alpha with work that people actually do; it's mostly standard Blender benchmarks at low resolutions that make very little sense to me (but then again I'm not an artist).
That's the problem with this topic: almost nobody here seems to do serious 3D stuff in real life, just a bunch of people more interested in benchmarks or hypothetical features of future software than in using what works now. That tells me that many here are mostly hobbyists.
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
Blender on Mac is basically exclusively for hobbyists at the moment.

You'd have to be insane to buy an M1 Max to do paid professional 3D work. Even as a lowly hobbyist, I don't use my M1 Max for 3D content creation. I just peek at how Blender is progressing, maybe run some benchmarks, and then go back to using my desktop PC because it's way faster and way less buggy.

The fact that people who have never done 3D before can play with Blender on their MacBook Pro is very cool though.
 

Boil

macrumors 68040
Oct 23, 2018
3,477
3,173
Stargate Command
Blender on Mac is basically exclusively for hobbyists at the moment.

You'd have to be insane to buy an M1 Max to do paid professional 3D work. Even as a lowly hobbyist, I don't use my M1 Max for 3D content creation. I just peek at how Blender is progressing, maybe run some benchmarks, and then go back to using my desktop PC because it's way faster and way less buggy.

The fact that people who have never done 3D before can play with Blender on their MacBook Pro is very cool though.

I wish I could use my PC for 3D, but the Nvidia GTX 650 Ti (2GB VRAM) in there is really slowing things down...!

I had a 5700XT 50AE in there for a while, but it ran SO hot, especially the VRAM, so I moved it down the road; then I was waiting to get a 6800XT, but the GPU market went crazy (thanks miners, thanks corps wanting more profits)...

So now I await a Mac mini like the one in my sig, and then I will hobby around in Blender...! ;^p
 

sirio76

macrumors 6502a
Mar 28, 2013
578
416
Blender on Mac is basically exclusively for hobbyists at the moment.

You'd have to be insane to buy an M1 Max to do paid professional 3D work. Even as a lowly hobbyist, I don't use my M1 Max for 3D content creation. I just peek at how Blender is progressing, maybe run some benchmarks, and then go back to using my desktop PC because it's way faster and way less buggy.

The fact that people who have never done 3D before can play with Blender on their MacBook Pro is very cool though.
You seem to imply that without super-fast GPU renderer options, a finalized Blender, or powerful desktop-like performance, M1 machines are worthless; that is simply a false and very limited vision of the 3D world. Luckily there are professional options that work right now, natively, that people use to make money, and no, they are not insane. M1 Max viewport speed in DCC apps is probably the best I've seen on a laptop, and CPU/render speed is on par with or superior to competing products (yes, people do use laptops for serious 3D work when they need a portable solution).
The DCC apps I've seen aren't buggy at all in my experience, even when running under Rosetta 2. You probably think that all software is as buggy as Blender; again, this is a limited vision, and you should not make the error of treating your experience/needs/preferences as those of the whole 3D community.
BTW I'm curious to see some of your 3D work, just to be sure it can't be done on an M1 machine ;)
 
  • Like
Reactions: ader42 and tRYSIS3

jmho

macrumors 6502a
Jun 11, 2021
502
996
You seem to imply that without super-fast GPU renderer options, a finalized Blender, or powerful desktop-like performance, M1 machines are worthless; that is simply a false and very limited vision of the 3D world. Luckily there are professional options that work right now, natively, that people use to make money, and no, they are not insane. M1 Max viewport speed in DCC apps is probably the best I've seen on a laptop, and CPU/render speed is on par with or superior to competing products (yes, people do use laptops for serious 3D work when they need a portable solution).
The DCC apps I've seen aren't buggy at all in my experience, even when running under Rosetta 2. You probably think that all software is as buggy as Blender; again, this is a limited vision, and you should not make the error of treating your experience/needs/preferences as those of the whole 3D community.
BTW I'm curious to see some of your 3D work, just to be sure it can't be done on an M1 machine ;)

This is a weird post because it's just aggressive for no real reason.

I'm not a professional artist, but I am a (former) professional game engine programmer with 10+ years of experience in the games industry and as such know a large number of professional artists. Almost none of them use Blender (because while some professionals do use it, and it's definitely pro-level software, most Blender users are hobbyists), and literally none of the professional artists I know personally use Macs.

I'm not going to post any of my 3D work because I don't want to dox myself, also it's not important to me to prove my artistic skill to you - it's literally my hobby and I've admitted as much. You're right though that as someone who mostly just does character sculpting and game assets I could absolutely get by on an M1 Max.

My desktop PC was cheaper than my M1 Max, and is faster and less buggy, so why would I want to use my M1 Max for 3D?

Feel free to post your 3D work though.
 
  • Like
Reactions: mi7chy

jmho

macrumors 6502a
Jun 11, 2021
502
996
Also if you were a professional artist you would know that just posting your own professional work in a thread because some random person challenged you is a terrible idea because a) you don't own that work, and b) nobody wants to drag their employer into a silly internet argument.
 
  • Like
Reactions: sputnikBA

vel0city

macrumors 6502
Original poster
Dec 23, 2017
347
510
The thing I've learned about 3D and creative workflows in general is that everyone's requirements and expectations are completely different. So whilst I find my M1 Max/16"/64GB absolutely amazing for my own particular 3D workflow (C4D, ZBrush, Redshift, working on projects for the entertainment industry), I can understand that someone else might prefer another machine for their own workflow. It's not only the 3D work itself that dictates the workflow; it's also delivery requirements, asset creation, deadlines, and working with a professional pipeline. There are loads of factors influencing whether a particular machine is suitable for you.

I'm lucky to be able to use the M1 Max and enjoy a completely seamless, silent and blazing fast workflow for my own needs. It's a wonderful machine for Adobe CC and C4D/Redshift. I have it hooked up to a Pro Display XDR and its performance astounds me on a daily basis. C4D R24 runs beautifully for my work and Redshift's IPR is almost instant. I couldn't be happier with this setup. But I totally get that for someone else they might struggle to make it work for them.
 

sirio76

macrumors 6502a
Mar 28, 2013
578
416
This is a weird post because it's just aggressive for no real reason.

I'm not a professional artist, but I am a (former) professional game engine programmer with 10+ years of experience in the games industry and as such know a large number of professional artists. Almost none of them use Blender (because while some professionals do use it, and it's definitely pro-level software, most Blender users are hobbyists), and literally none of the professional artists I know personally use Macs.

I'm not going to post any of my 3D work because I don't want to dox myself, also it's not important to me to prove my artistic skill to you - it's literally my hobby and I've admitted as much. You're right though that as someone who mostly just does character sculpting and game assets I could absolutely get by on an M1 Max.

My desktop PC was cheaper than my M1 Max, and is faster and less buggy, so why would I want to use my M1 Max for 3D?

Feel free to post your 3D work though.
Sorry if my post seemed aggressive to you; that was not my intention. The reason I asked to see some of your work is that many people here make bold claims about what is good and what is bad for 3D without actually working on 3D beyond a hobby. Don't get me wrong, it's perfectly fine to be a hobbyist (I think most 3D artists, including me, started like that), but claiming some machine isn't good for some task without actually having used it for real work is just silly.
Comparing your desktop to a laptop is pointless; the MBP is a laptop, and you should compare price/performance against similar systems. Also, the price is not that important: many 3D artists I know can easily get $300-500 a day, which means in about a week you can afford an MBP.
Also, claiming that you don't personally know any professional artists who use Macs doesn't mean much. I personally know many professionals who do use Macs; others use Windows, others Linux. Again, you should not take your personal experience as proof that the whole 3D world is not using Macs; personal experience may vary.
There's nothing wrong with posting work here or elsewhere when needed; again, you are making the mistake of thinking that everybody works for some employer and therefore cannot show their stuff. Well, many people are self-employed and can show their professional and personal work on their websites, social media, etc.; actually, that's how many of them find new clients. From time to time I've posted some of my work here, so if you are interested just look in my post history; the latest thing I've posted is a small video on page 12 of this topic, recorded on an M1 Pro with a 16-core GPU (the machine is not mine, and I'm not personally interested in one since I do not need a portable system).
 