
diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
Not true at all. Before Apple Silicon, Apple never had a top-of-the-line CUDA GPU in their Macs either. And things were fine. What you are complaining about has nothing to do with Apple Silicon.
I do recall folks complaining when Apple stopped using Nvidia GPUs.
 
  • Like
Reactions: ThunderSkunk

jmho

macrumors 6502a
Jun 11, 2021
502
996
Not true at all. Before Apple Silicon, Apple never had a top-of-the-line CUDA GPU in their Macs either. And things were fine. What you are complaining about has nothing to do with Apple Silicon.
Before the AS transition: nobody is developing cutting-edge 3D stuff on Mac because Macs don't have cutting-edge GPUs.
After the AS transition: nobody is developing cutting-edge 3D stuff on Mac because Macs don't have cutting-edge GPUs.

Whichever platform has the fastest GPUs will always have the best software.

I think it would be a very good thing if Apple made a powerful GPU.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
Not true at all. Before Apple Silicon, Apple never had a top-of-the-line CUDA GPU in their Macs either. And things were fine. What you are complaining about has nothing to do with Apple Silicon.

They were hardly fine. Nobody was doing any GPU work on Macs at all.

In fact, things are much better now with Apple Silicon where we actually get support for popular software tools and efforts to port things to Metal. But lack of high-end GPUs is definitely discouraging.
 

sirio76

macrumors 6502a
Mar 28, 2013
578
416
The thing about 3D is that it's such an incredibly huge topic that almost nobody can be an expert in everything, so we all have to be a little bit humble.
I agree that this world is very vast and no one can claim to know everything. That being said, it's not a matter of being humble :) People come here to get information about the state of hardware and software on the Mac. If all they get is a bunch of people who barely know how to work in a 3D application, they may come away with a very wrong idea of what it's like to work on this platform.
 

iBug2

macrumors 601
Jun 12, 2005
4,540
863
They were hardly fine. Nobody was doing any GPU work on Macs at all.

In fact, things are much better now with Apple Silicon where we actually get support for popular software tools and efforts to port things to Metal. But lack of high-end GPUs is definitely discouraging.
By "things were fine" I just meant business as usual, not that Apple was doing great on the 3D front. This has been an issue with Apple forever, and after the AS transition Macs still don't have the best GPUs, but they have much better GPUs than before. So the transition couldn't have been a bad move on the GPU front.
 

iBug2

macrumors 601
Jun 12, 2005
4,540
863
Before the AS transition: nobody is developing cutting-edge 3D stuff on Mac because Macs don't have cutting-edge GPUs.
After the AS transition: nobody is developing cutting-edge 3D stuff on Mac because Macs don't have cutting-edge GPUs.

Whichever platform has the fastest GPUs will always have the best software.

I think it would be a very good thing if Apple made a powerful GPU.
Apple does make really powerful GPUs. What you want is for Apple to make the most powerful GPU, which I don't think is necessary to get developer support for 3D. If that were the case, then all those developers would only be developing for RTX 4090 boxes, which is obviously false.
 
  • Like
Reactions: sirio76

ader42

macrumors 6502
Jun 30, 2012
436
390
Modeling-wise, especially in ZBrush, the Ultra is a beast: very fast texturing and painting workflow, way faster at UV unwrapping than anything I used before, remeshing is solid and impressive in some cases, and sculpting is very smooth up to 35,000,000 polys for a single object (I didn't calculate polys across multiple subtools, so I have no idea there). Very, very fast displacement-map calculations, and so on for every aspect.

Yeah, recently on my M1 Max I've been using scenes with over 50 million polygons and around 50 subtools with no issues. Even when I have an 8K document size, rendering is nice and quick too (but rendering is less than 10% of my overall 3D work time; the rest is modelling, texturing, etc.). ZBrush is so well optimised I don't understand how it does it. I can only imagine how good it will become now that it's under the Maxon umbrella.
 
  • Like
Reactions: aytan

jmho

macrumors 6502a
Jun 11, 2021
502
996
Apple does make really powerful GPUs. What you want is for Apple to make the most powerful GPU, which I don't think is necessary to get developer support for 3D. If that were the case, then all those developers would only be developing for RTX 4090 boxes, which is obviously false.
They're not developing FOR 4090 boxes, but I guarantee they're developing ON 4090 boxes.

I think you're confusing "making a Mac port" (which lots of people are doing) with "actively developing on Mac" (which literally nobody is doing).
 

sunny5

macrumors 68000
Jun 11, 2021
1,838
1,706
Apple does make really powerful GPUs. What you want is for Apple to make the most powerful GPU, which I don't think is necessary to get developer support for 3D. If that were the case, then all those developers would only be developing for RTX 4090 boxes, which is obviously false.
M1 (8-core): 2.6 TFLOPS
GTX 1650: 2.984 TFLOPS
M2 (10-core): 3.6 TFLOPS
GTX 1060 3GB: 3.9 TFLOPS
M1 Pro (16-core): 5.2 TFLOPS
RTX 3050 Ti: 5.3 TFLOPS
GTX 1660 Ti: 5.4 TFLOPS
GTX 1070: 6.4 TFLOPS
RTX 2080: 10.07 TFLOPS
M1 Max (32-core): 10.4 TFLOPS
RTX 3060: 12.7 TFLOPS
M1 Ultra (64-core): 20.8 TFLOPS
RTX 3070 Ti: 21.75 TFLOPS
RTX 3080 Ti: 34.1 TFLOPS

Not really. The M1-series GPUs are slower than the RTX 30 series. What makes you think they are fast? Besides, Nvidia has CUDA, which is heavily optimized for 3D work. Apple has nothing comparable.
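For context on where those peak numbers come from: theoretical FP32 throughput is just ALU count × clock × 2 FLOPs per FMA. A quick sketch (the core counts and clocks below are approximate public figures, so treat the results as estimates, not measurements):

```python
def fp32_tflops(alus, clock_ghz, flops_per_cycle=2):
    """Theoretical peak: ALUs x clock x FLOPs/cycle (1 FMA = 2 FLOPs)."""
    return alus * clock_ghz * flops_per_cycle / 1000.0

# M1 Max: 32 GPU cores x 128 ALUs/core at roughly 1.27 GHz (approximate)
m1_max = fp32_tflops(32 * 128, 1.27)
# RTX 3060: 3584 CUDA cores at roughly 1.78 GHz boost (approximate)
rtx_3060 = fp32_tflops(3584, 1.78)

print(f"M1 Max   ~{m1_max:.1f} TFLOPS")   # lands near the quoted 10.4
print(f"RTX 3060 ~{rtx_3060:.1f} TFLOPS") # lands near the quoted 12.7
```

Worth remembering that peak TFLOPS is a ceiling, not a benchmark; real workloads depend on memory bandwidth, drivers, and API overhead.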
 

avkills

macrumors 65816
Jun 14, 2002
1,226
1,074
(whisper) it’s called “Metal”

And since when is CUDA optimized for 3D? It has literally nothing to do with 3D. Show me one game written in CUDA 😂
Perhaps you have heard of this crappy game engine called Unreal, that happens to utilize CUDA libraries for certain things that need to be calculated. :rolleyes:

This comment alone tells me you have no idea how games are developed. And to make matters worse for you, we are not talking about gaming either, we are talking about visual effects for film/tv and scientific simulations.
 

sunny5

macrumors 68000
Jun 11, 2021
1,838
1,706
(whisper) it’s called “Metal”

And since when is CUDA optimized for 3D? It has literally nothing to do with 3D. Show me one game written in CUDA 😂
I guess you don't even use 3D-related software after all. 3D applications are heavily optimized for CUDA, and without it you get extremely slow performance. That's why Nvidia dominates the discrete GPU market with an 80–90% share. The 3D market is dominated by Nvidia GPUs, and that's why the Mac is slow for 3D work.

Metal? METAL? Do you even realize that Metal is nowhere near competing with what's on PC? It's just a joke to them. You can't even compare Metal to CUDA. Not even close and not even comparable.
 
Last edited:

sunny5

macrumors 68000
Jun 11, 2021
1,838
1,706
Perhaps you have heard of this crappy game engine called Unreal, that happens to utilize CUDA libraries for certain things that need to be calculated. :rolleyes:

This comment alone tells me you have no idea how games are developed. And to make matters worse for you, we are not talking about gaming either, we are talking about visual effects for film/tv and scientific simulations.
He has no idea what he is talking about. It's the kind of narrow knowledge only Mac users could have. The Mac has never been known for 3D anyway, since Apple ditched Nvidia GPUs a while ago. Hell, without CUDA I can't even think about doing 3D work. In the 3D world, CUDA is irreplaceable. AMD sucks anyway.
 
Last edited:
  • Haha
Reactions: Romain_H

sunny5

macrumors 68000
Jun 11, 2021
1,838
1,706
Well, there is obviously too much dependency on CUDA in 3D apps, gonna need the Department of Justice and the Federal Trade Commission to step in and bring a stop to this monopoly...!
[Screenshots: GPU market-share charts, 2022-12-21]


Nvidia dominates the discrete GPU market with almost 90% share as of today (excluding iGPUs). AMD and Intel can't even compete with Nvidia, and therefore Nvidia is unstoppable. CUDA is just one example. Any alternative solutions? None so far.
 

exoticSpice

Suspended
Jan 9, 2022
1,242
1,952
It is very foolish to forgo Nvidia and AMD. The top GPUs are on PC, and since "next-gen" games are 3D/UE5-based, AMD also matters because the PS5 and Xbox Series X are on RDNA2. Devs make 2D/3D games for those consoles.

Also, the cheapest Mac with an M1 Max is $2,000 and has 10.4 TFLOPS, while the PS5 and Xbox Series X have 10.3 and 12.1 respectively.
So the M1 Max is about as powerful as a $500 PS5 in terms of GPU performance. And RDNA2 at least has hardware RT; sure, it's not as powerful as an RTX 4090, but it helps with ray-traced reflections.

RE: Village plays much better on a PS5/XB SX than on an M1 Max.

TL;DR: Apple's GPUs are weak but expensive. Apple can't do 3D.

Also, I don't have much hope for the M2 Max: it will have 14.4 TFLOPS, only 2.3 TFLOPS more than a $500 Xbox Series X, and weaker than an RTX 3060 Ti.
The M2 Pro/Max need hardware ray tracing, otherwise Apple will waste another year.

Future looks bleak.
 

jujoje

macrumors regular
May 17, 2009
247
288
Yeah, a benchmark scene for LightWave might be tough if you do not have at least 2019. 2018 is about when NewTek actually started really fixing things on the Mac side.

I was going to grab the latest trial version and give it a shot; it's more that I haven't used LightWave since 2010-ish and have no idea what I'm doing in LW. As long as there's a benchmark scene where I can just press render or something, it's all good :D

Hell, without CUDA I can't even think about doing 3D work.

What do you use that requires CUDA? You’re just repeating the word like a mantra with no evidence as to why. It must suck to have such hard vendor lock-in you can't countenance any other hardware.
 

innerproduct

macrumors regular
Jun 21, 2021
222
353
Let's end this confusion now.
CUDA is a compute API from Nvidia. It is also what they call their compute cores. It is used in all GPU offline renderers, since it was the first general-purpose GPU API.
Apple created OpenCL (Open Computing Language) to compete and gave the spec away to the community via Khronos. This API was cross-platform but slightly harder to code for. Nvidia supported it but made sure that using CUDA got you better performance.
For realtime "game" rendering, GPUs are programmed with other APIs. On PC it has mostly been DirectX. The cross-platform API is called OpenGL and has been around since the 90s. It was easy to use and all-around great, but it has been showing its age when it comes to harnessing modern GPUs. DX and OGL have not traditionally done compute, but could be "misused" to do it anyway (by rendering to offscreen buffers). Both DX and OGL in their newer iterations support compute via compute shaders.
Then we have Metal, which is Apple's "DirectX". It supports compute as well.
So, on the Mac you traditionally used OpenGL for games and OpenCL for compute, or CUDA, since it was supported until Mojave.
These days games are very complex and use both graphics and compute, so the differentiation is not as clear-cut anymore.
Oh, and we also have OptiX, which is Nvidia's easy-to-use API for ray tracing and the only way (except Vulkan) to utilize RT cores. These days all renderers are migrating from CUDA to OptiX.
Finally, finally, we have Vulkan. The "new" OpenGL, so to speak.
For reference, I have been a graphics programmer since the 90s within a scientific-visualization context. My latest work had to use Vulkan to render on the cloud. It is a horribly complex API that takes time to learn. Metal is very easy to use in comparison but lacks features.
All in all, hope this helps the further discussions here.
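To make the "compute API" idea above concrete: whatever the binding and launch machinery looks like in CUDA, OpenCL, or Metal, the program you hand the GPU is a kernel mapped over buffers, one logical thread per element. A serial Python sketch of the canonical SAXPY kernel (illustrative only; no real GPU API is used, and `dispatch` stands in for a grid launch):

```python
def saxpy_kernel(gid, a, x, y, out):
    # 'gid' plays the role of the thread/global ID each compute API supplies.
    out[gid] = a * x[gid] + y[gid]

def dispatch(n, kernel, *buffers):
    # A "grid launch": run the kernel once per element. On a GPU these
    # iterations execute in parallel across thousands of ALUs.
    for gid in range(n):
        kernel(gid, *buffers)

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
out = [0.0] * 3
dispatch(3, saxpy_kernel, 2.0, x, y, out)
print(out)  # [12.0, 24.0, 36.0]
```

The APIs in the overview differ mainly in how verbose that buffer binding and dispatch step is, not in the kernel arithmetic itself.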
 

jeanlain

macrumors 68020
Mar 14, 2009
2,462
955
Future looks bleak.
Indeed. It is becoming clear that the Apple Silicon transition was never really about top performance, but about power efficiency.

Apple has nothing to compete with the best GPUs from AMD and Nvidia. They just showed results that no one else can reproduce except with dubious apps like GFXBench.
 
  • Like
Reactions: aytan

jeanlain

macrumors 68020
Mar 14, 2009
2,462
955
Perhaps you have heard of this crappy game engine called Unreal, that happens to utilize CUDA libraries for certain things that need to be calculated.
…are these things related to 3D? AFAIK, CUDA doesn't do the rasterisation.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,678
This comment alone tells me you have no idea how games are developed. And to make matters worse for you, we are not talking about gaming either, we are talking about visual effects for film/tv and scientific simulations.

Oh no, someone on the internet told me I have no idea of GPU programming. I guess I'll run to my mommy and cry 😂

Perhaps you have heard of this crappy game engine called Unreal, that happens to utilize CUDA libraries for certain things that need to be calculated. :rolleyes:

I can absolutely imagine that some games use CUDA as a specific backend for physics or other compute work. This is the first time I've heard that UE uses CUDA, but it very well may be true; I don't work with UE. Regardless, what you are doing is manipulation. I was replying to the poster who claimed that CUDA is great for "3D". CUDA is a compute API. It has nothing to do with 3D. It does not expose the 3D rasterisation functionality of the GPU. That's what gaming APIs are for.

Metal? METAL? Do you even realize that Metal is nowhere near competing with what's on PC? It's just a joke to them. You can't even compare Metal to CUDA. Not even close and not even comparable.

Have you actually worked with any of these platforms? You are just blurting out things without any technical understanding. What exactly is "not even close and not even comparable"? Which features does Metal lack in comparison?

GPU performance, market share and popularity are entirely different topics. Sure, Nvidia enjoys almost unopposed hegemony in this space, thanks to their aggressive marketing strategy and ecosystem investments. There was a rapid increase in demand for GPGPU, and Nvidia seized the opportunity, locking in many customers. Some folks at Apple were likely very pissed as their efforts at establishing open compute (via OpenCL) were sabotaged through incompetence, lack of foresight and, of course, a healthy dose of manipulation from Nvidia. Anyway, it is what it is. That doesn't mean it has to stay this way.

metal is very easy to use in comparison but lacks features.

Good overview, except this bit. Metal lacked features back in 2017. Now it's almost the other way around, with Vulkan releasing new functionality to keep up. E.g. the recently released VK_EXT_descriptor_buffer is pretty much a direct copy of Metal's Argument Buffers, which have been there for several years.
 

sirio76

macrumors 6502a
Mar 28, 2013
578
416
I guess you don't even use 3D-related software after all. 3D applications are heavily optimized for CUDA, and without it you get extremely slow performance. That's why Nvidia dominates the discrete GPU market with an 80–90% share. The 3D market is dominated by Nvidia GPUs, and that's why the Mac is slow for 3D work.

Metal? METAL? Do you even realize that Metal is nowhere near competing with what's on PC? It's just a joke to them. You can't even compare Metal to CUDA. Not even close and not even comparable.
Can you show us your 3D work please?
 

jujoje

macrumors regular
May 17, 2009
247
288
Good overview, except this bit. Metal lacked features back in 2017. Now it's almost the other way around. We have Vulkan releasing new functionality to keep up. E.g. the recently released VK_EXT_descriptor_buffer is pretty much a direct copy Metal's Argument Buffers which have been there for several years.
Out of curiosity, is there something Metal is significantly lacking these days? From what I can recall, most of the critical gaps have been addressed (but this is totally not an area I'm particularly familiar with).

As a side note, really curious as to how the Blender metal viewport is going to play out, in terms of parity and performance with the Vulkan viewport.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
For reference, I have been a graphics programmer since the 90s within a scientific-visualization context. My latest work had to use Vulkan to render on the cloud. It is a horribly complex API that takes time to learn. Metal is very easy to use in comparison but lacks features.
Vulkan is so verbose that it makes application development difficult, which is why most applications are still using OpenGL.
[Image: code-length comparison of the same program in different graphics APIs]

The Vulkan program still needs almost another full page. Around 15:15 in "Introduction to WebGPU - CIS 565 GPU Programming Fall 2022".

For example, Blender is porting most compositor operations to the GPU, but those that require complex mathematical operations are harder to implement. Since Metal can do compute shaders, it is possible that Blender's compositor will have better support for Apple GPUs than for any PC GPU.

Fog Glow Glare:​

Unfortunately, we couldn’t approximate the Fog Glow option using a cheaper method. So the initial release will be without it. I know this is the one option you have been asking for, but as previously explained, we didn’t initially have high hopes for it due to how expensive it is to compute. So it requires a performant FFT implementation that we don’t have at the moment, but we are already thinking how we might approach it.
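The Fog Glow limitation described above is fundamentally about convolution cost: applying a large glare kernel directly to an N×N image is O(N²K²), while doing it through the FFT drops to roughly O(N² log N), which is why the developers say they need a performant FFT first. A minimal NumPy sketch of the FFT route (the image and kernel sizes here are arbitrary illustrations, not Blender's):

```python
import numpy as np

def fft_convolve(image, kernel):
    # Zero-pad both operands to the full linear-convolution size so the
    # circular FFT convolution doesn't wrap around the image edges.
    h = image.shape[0] + kernel.shape[0] - 1
    w = image.shape[1] + kernel.shape[1] - 1
    spectrum = np.fft.rfft2(image, s=(h, w)) * np.fft.rfft2(kernel, s=(h, w))
    return np.fft.irfft2(spectrum, s=(h, w))

img = np.zeros((64, 64))
img[32, 32] = 1.0                           # a single bright pixel
k = np.outer(np.hanning(9), np.hanning(9))  # a small stand-in glow kernel
glow = fft_convolve(img, k)                 # bright pixel spread into a glow
```

Convolving a lone bright pixel reproduces the kernel around it, which is exactly the "glow around highlights" effect; the FFT cost is independent of the kernel size once the transforms are done.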

I imagine something similar will happen with other 3D programs until they adopt Vulkan, or WebGPU if it eventually becomes an alternative to Vulkan.

Which 3D software has adopted Vulkan and which Metal?
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
As a side note, really curious as to how the Blender metal viewport is going to play out, in terms of parity and performance with the Vulkan viewport.
It looks like it hasn't started yet. They still rely on OpenGL.
 