
Boidem

Suspended
Nov 16, 2022
306
245
That's what goes for a dedicated graphics card in the PC world. I know it's horrendous. But when people say they want a modular Mac Pro, they mean something with space for this.
That is madness. I thought the Radeon X1900XT in my old Mac Pro was big! These newfangled graphics cards are enormous by comparison. I hope they last better than that X1900XT; that was the only negative of that machine.
 

RedTheReader

macrumors 6502a
Original poster
Nov 18, 2019
532
1,312
It seems people still haven't caught on after my first comment hinting at this, so I'll say it outright: the card's not nearly that ridiculous. That's a mini-ITX motherboard, a third of which is cropped out of the image and the rest of which is deliberately angled away to make it look even smaller in comparison to the card.

It's big. It's power hungry. It's not ideal. It's also not nearly as extreme as those wearing Apple Silicon-tinted glasses would have you believe. The real issue is that it's expensive, but judging by the M1 Ultra's pricing, that's sure not to be mentioned either.
 
Last edited:

Bodhitree

macrumors 68020
Apr 5, 2021
2,085
2,216
Netherlands
I find it interesting that all the components of a PC are getting smaller (and, in the case of Apple Silicon, more integrated) except the graphics card, which is getting bigger and more power hungry. Looking at the fans and heatsinks on that 4090 you have to wonder at the heat it is dissipating. That alone is a considerable challenge for a modular Mac Pro.
 
  • Like
Reactions: maflynn

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
Looking at the fans and heatsinks on that 4090 you have to wonder at the heat it is dissipating.
It all depends on what Apple uses. I don't see Apple Silicon having a GPU that rivals an RTX 4090; the current M1 Ultra appears to be slower than an RTX 3090, and look how much faster the 4090 is than the 3090.

Apple M1 Ultra gets flattened by GeForce RTX 3090 in synthetic and gaming benchmark comparison
 
  • Like
Reactions: Bodhitree

Bodhitree

macrumors 68020
Apr 5, 2021
2,085
2,216
Netherlands
Indeed, the desktop 3090 is quite a bit faster, and the 4090 is faster again. With a modular Mac Pro I think you have to ask the question: do you want to rival these faster Nvidia cards, or do you want to invest in the software and hardware extensibility support that allows you to insert them into the machine?

The base M1 and M2 are somewhere between the PlayStation 4 and PlayStation 5 in power, which is perfectly fine for many things. They don’t hold up if you want to build a top-end 3D workstation, which is a possible application for the Mac Pro, and you have to consider the investment required if you want to mount a challenge in that market. How you handle software drivers is another major checkpoint.

For gaming I think the M1 and M2 are just about OK; with Metal 3 you will see some interesting ports appearing. Late, and not as good in visual fidelity, but I think the Mac market will be OK for casual play.
 
Dec 4, 2022
709
1,301
It all depends on what Apple uses. I don't see Apple Silicon having a GPU that rivals an RTX 4090; the current M1 Ultra appears to be slower than an RTX 3090, and look how much faster the 4090 is than the 3090.

Apple M1 Ultra gets flattened by GeForce RTX 3090 in synthetic and gaming benchmark comparison

The raw benchmarks don't mean much for some apps. After Effects on my M1 Max runs better than my 5 GHz RTX PC because unified memory means data can move around quicker and the app can grab more video memory. While consuming 5x less energy!
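To make that concrete, here's a minimal sketch of what skipping the copy looks like (the frame size, storage options, and variable names are purely illustrative):

```swift
import Metal

// Minimal sketch of the unified-memory point above; sizes are illustrative.
guard let device = MTLCreateSystemDefaultDevice() else { fatalError("no GPU") }

// One 8-bit 4K RGBA frame, produced on the CPU.
var frame = [UInt8](repeating: 0, count: 3840 * 2160 * 4)

// On Apple Silicon, .storageModeShared puts the buffer in the single unified
// pool: the CPU and GPU touch the same bytes, with zero copies in between.
let shared = device.makeBuffer(bytes: &frame,
                               length: frame.count,
                               options: .storageModeShared)

// On a discrete GPU the same data would live in VRAM (.storageModePrivate),
// and every CPU-side update would need an explicit blit across PCIe --
// exactly the transfer the unified design avoids.
```

And because GPU "video memory" here is just system memory, an app can address far more of it than a typical discrete card physically carries.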
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
It all depends on what Apple uses. I don't see Apple Silicon having a GPU that rivals an RTX 4090; the current M1 Ultra appears to be slower than an RTX 3090, and look how much faster the 4090 is than the 3090.

Apple M1 Ultra gets flattened by GeForce RTX 3090 in synthetic and gaming benchmark comparison
This is actually The Verge's work, not Notebookcheck's (the article you linked is one of those lazy "here's what some other site said" pieces), and it's terrible. This is not surprising. One does not read The Verge for technically competent benchmarking. It's not who they are, it's not what they do.

SOTR (Shadow of the Tomb Raider) was released years before Apple Silicon Macs. The reason Apple highlighted it during the M1 launch was to reassure customers that Rosetta was so good, it could run very demanding software reasonably well. If you wanted to use it to find out what Apple Silicon is capable of, though, you'd need a native port with an optimization pass to account for the GPU's different architecture.

Despite this, lazy review sites like The Verge have latched on to SOTR as a standard benchmark of Apple Silicon performance. Apple used it, so it must be good right? But outside of the narrow context Apple used it for, it just isn't a good way to look at what Apple Silicon can do.

Similarly, on macOS, OpenCL shares the same status as OpenGL: they're on life support. Both were deprecated before Apple launched Apple Silicon Macs, and Apple isn't putting much work into making these APIs fast on Apple Silicon. They want you to move on to Metal's compute and graphics APIs.
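For anyone who hasn't looked at it, this is roughly the shape of Metal's compute path, i.e. what replaces an OpenCL kernel launch (a minimal Swift sketch; the kernel name "square" and the sizes are hypothetical):

```swift
import Metal

// Minimal Metal compute dispatch. Assumes the app's default library contains
// a kernel function named "square" (hypothetical, for illustration only).
let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let fn = device.makeDefaultLibrary()!.makeFunction(name: "square")!
let pipeline = try! device.makeComputePipelineState(function: fn)

let input: [Float] = (0..<1024).map(Float.init)
let buf = device.makeBuffer(bytes: input,
                            length: input.count * MemoryLayout<Float>.stride,
                            options: .storageModeShared)!

let cmd = queue.makeCommandBuffer()!
let enc = cmd.makeComputeCommandEncoder()!
enc.setComputePipelineState(pipeline)
enc.setBuffer(buf, offset: 0, index: 0)
// dispatchThreads lets Metal handle non-uniform threadgroup edges;
// it's supported on all Apple-family GPUs.
enc.dispatchThreads(MTLSize(width: input.count, height: 1, depth: 1),
                    threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()
```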

That leaves us with GB5 Metal as the only reasonable, relevant benchmark. But there's no RTX 3090 score (there can't be), so there's just nothing to talk about here. (It's not clear whether you're meant to be able to directly compare Metal and OpenCL GB5 scores.)

Finally, I don't understand why you reflexively assume that Apple can't build something in the 4090's class. M1 Ultra is two M1 Max, and M1 Max is fundamentally a notebook chip. A very powerful one, but its thermal budget is well under 100W, and therefore M1 Ultra's is under 200W. Those numbers include CPUs (and everything else on the SoC) too, and CPUs eat about 80W out of that M1 Ultra power budget. You're contrasting these with a 450W GPU. Of course it's gonna win. Why don't you wait and see what Apple builds for a machine larger and more powerful than the Studio?
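To put rough numbers on that (these are the estimates from this post, not measured specs):

```latex
\begin{aligned}
P_{\text{M1 Ultra SoC}} &< 200~\text{W} \\
P_{\text{M1 Ultra GPU}} &\approx 200 - 80~(\text{CPUs etc.}) = 120~\text{W} \\
P_{\text{RTX 4090}} / P_{\text{M1 Ultra GPU}} &\approx 450 / 120 \approx 3.75
\end{aligned}
```

A GPU allotted roughly a quarter of the power losing a raw benchmark tells you very little about what Apple could do with a 4090-class power budget.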
 
  • Like
Reactions: jdb8167

Zest28

macrumors 68030
Jul 11, 2022
2,581
3,933
But you are going to lose all that efficiency from the M1 Max chip. We have seen with the M1 Ultra that once you make it modular, you lose a lot of performance, as it is not integrated into one chip anymore.
 

Boidem

Suspended
Nov 16, 2022
306
245
I don't know about all the graphics cards stuff, but surely the direction computer manufacturers should be going in is to use less energy, as global resources dwindle and wars erupt over control of them. One of the best things about my new iMac is that it uses far, far less energy than my previous computers. That it can do everything my previous machines could, and much more, is brilliant. So surely, if Apple continues down this path, they could create GPUs etc. that use less power than other models and brands, yet with the same performance.

I have to admit I don't really understand what all this 'benchmarks' stuff means in terms of actually using a computer, beyond maybe more power = less time taken to do a task. I was very impressed with how fast my iMac dealt with some audio I was cleaning up recently, so obviously even a non-nerd like me can benefit from moar pwr. But in 'the real world', how does this impact ALL computer users? Do we all really need it? Shouldn't energy conservation be the priority?
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
I don't know about all the graphics cards stuff, but surely the direction computer manufacturers should be going in is to use less energy…
People want more performance, so CPU and GPU makers have largely increased power consumption. You can't get blood out of a turnip: people want cutting-edge performance with more and more features, and that takes power.
 

Boidem

Suspended
Nov 16, 2022
306
245
People want more performance, so CPU and GPU makers have largely increased power consumption. You can't get blood out of a turnip: people want cutting-edge performance with more and more features, and that takes power.
Sure, but perhaps that's because other manufacturers haven't bothered looking at increasing efficiency. I don't know. But I do know my M1 iMac uses a LOT less power than previous machines I've had. My old Mac Pro acted as a heater when it got going properly. Great in winter; I always had warm feet. In summer, not so good. So perhaps manufacturers should be looking at producing the same output using less power. The current strategy is lazy and wasteful.
 
  • Like
Reactions: Bodhitree

Bodhitree

macrumors 68020
Apr 5, 2021
2,085
2,216
Netherlands
It is lazy thinking by consumers to always assume faster = better, and lazy of manufacturers to always accede to that demand. But don't underestimate how much cinematic games are driving that process: they are the main application that gets most PC users asking for more power, and even then only because they are pushing for 4K gaming at 120 fps or better. I remember the days when games ran at 30 fps and 1024x768, and people still had fun with them.

Most Macs won't be heavily used as gaming machines; the games just aren't there. So it's a question of what else you would like to do that is going to use up all that power.
 

wonderings

macrumors 6502a
Nov 19, 2021
957
947
Apple trying to do modular, but their own way, only means the consumer is going to pay a whole heck of a lot more than they should. The old tower designs of the G5 and G4 were fantastic and still allowed you to use (albeit with limits) normal components made for the Mac: RAM, hard drives, and GPUs (if compatible, or if they could be flashed to work with Mac OS) were pretty much standard. Apple keeps trying these non-modular machines that use their proprietary hardware. I wish Apple went back to basics with the pro machines. Give me a stylish yet simple tower that lets us plug in what we need with ease and, on top of that, uses standard hard drives and RAM. This would really help with their facade of being a green company, as machines would last a lot longer before being junked.
 

Boidem

Suspended
Nov 16, 2022
306
245
Seems rather extravagant. Does that mean a Windows machine is like an IUD (a real pain to use, and not very effective…)?
 

RedTheReader

macrumors 6502a
Original poster
Nov 18, 2019
532
1,312
But outside of the narrow context Apple used it for, it just isn't a good way to look at what Apple Silicon can do.
I'm glad someone pointed this out! I was going to do the same myself, but then I realized that — or rather, remembered that — there's a reason people do this for gaming benchmarks and beyond: there isn't that much that actually can take advantage of what Apple Silicon can do.

Similarly, on macOS, OpenCL shares the same status as OpenGL: they're on life support. Both were deprecated before Apple launched Apple Silicon Macs, and Apple isn't putting much work into making these APIs fast on Apple Silicon. They want you to move on to Metal's compute and graphics APIs.

That leaves us with GB5 Metal as the only reasonable, relevant benchmark. But there's no RTX 3090 score (there can't be), so there's just nothing to talk about here. (It's not clear whether you're meant to be able to directly compare Metal and OpenCL GB5 scores.)
Indeed, they'd been wanting us to move on to Metal for half a decade before Apple Silicon, and developers largely weren't doing it. GB5 Metal really is the only relevant benchmark, but it mirrors the story of the software: there isn't much that really takes advantage of Apple Silicon. But you can still use other software built on OpenGL or OpenCL or Rosetta if you like macOS… making those benchmarks relevant again.

It's one thing to say that gaming just can't be done on these systems and you should use other ones for it. It's another to say that buckets of common productivity software, needed by many of the people who'd be looking at these machines, just can't be used on them. It reminds me of when people were saying that Cinebench shouldn't be used to judge these systems. Well, guess what: Cinebench sure is relevant for people who use Cinema 4D… which includes plenty of people who'd like to use Macs.

There is no point in looking at what the systems can do if they aren't doing it. The fact that no one has been able to find a proper, fair way of comparing a 3090 or 4090 to these machines proves that.

Finally, I don't understand why you reflexively assume that Apple can't build something in the 4090's class.
Because it's the same story as all the software: it doesn't matter what the systems can do if there's no software to do that. And it doesn't matter what Apple can do if they won't do that:

https://forums.macrumors.com/thread...-but-m2-extreme-chip-likely-canceled.2374045/
Why don't you wait and see what Apple builds for a machine larger and more powerful than the Studio?
They're not doing it. That's why.

 

Zest28

macrumors 68030
Jul 11, 2022
2,581
3,933
Apple trying to do modular, but their own way, only means the consumer is going to pay a whole heck of a lot more than they should. The old tower designs of the G5 and G4 were fantastic and still allowed you to use (albeit with limits) normal components made for the Mac: RAM, hard drives, and GPUs (if compatible, or if they could be flashed to work with Mac OS) were pretty much standard. Apple keeps trying these non-modular machines that use their proprietary hardware. I wish Apple went back to basics with the pro machines. Give me a stylish yet simple tower that lets us plug in what we need with ease and, on top of that, uses standard hard drives and RAM. This would really help with their facade of being a green company, as machines would last a lot longer before being junked.

In the end, you can probably build a cheaper machine this way. If you only do video editing, you can skimp on CPU and GPU power and go all in on the video accelerators.

So this allows you to create your own specialised computer for what you want to do with the machine, and save money by excluding what you do not need.

The only thing I'm concerned about is how much efficiency you lose through those connectors, as it is not one chip anymore.

If Apple comes with more dedicated accelerators, a modular Mac Pro can be very interesting for certain people.
 

MayaUser

macrumors 68040
Nov 22, 2021
3,177
7,196
The raw benchmarks don't mean much for some apps. After Effects on my M1 Max runs better than my 5 GHz RTX PC because unified memory means data can move around quicker and the app can grab more video memory. While consuming 5x less energy!
Exactly. Those Nvidia cards triumph especially in benchmarks that simulate or play video games. In the business and 3D modelling apps I use, the difference between my Ultra systems and the 3090 was much smaller; the RTX 3090 was still ahead, but only by around 33% on the same projects. And after a month, the difference in energy use on the Mac systems wasn't just 33% lower, it was a lot lower, so for the moment I've ditched the 3090 PCs for another two Ultra systems. Time is money, yes, but energy use adds up quickly at multiple-system scale over a month, never mind over 12 months.
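To make the energy point concrete, here is the kind of back-of-envelope math involved (every figure is assumed purely for illustration: a 300 W difference in draw, 8 hours a day, 22 working days a month, 0.40 EUR/kWh):

```latex
\begin{aligned}
\Delta E_{\text{month}} &= 0.3~\text{kW} \times 8~\text{h} \times 22 \approx 52.8~\text{kWh} \\
\Delta \text{cost}_{\text{month}} &\approx 52.8 \times 0.40 \approx 21~\text{EUR per machine} \\
\Delta \text{cost}_{\text{year}} &\approx 21 \times 12 \approx 253~\text{EUR per machine}
\end{aligned}
```

Multiply that across several machines and the saving is real money.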
But for me, the latest rumours that the Mac Pro will not have an M Extreme with 2x the Ultra are a bummer. I don't see any reason to go Mac Pro when the same M2 Ultra will be in the Mac Studio as well.
 

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
Indeed, they'd been wanting us to move on to Metal for half a decade before Apple Silicon, and developers largely weren't doing it. GB5 Metal really is the only relevant benchmark, but it mirrors the story of the software: there isn't much that really takes advantage of Apple Silicon. But you can still use other software built on OpenGL or OpenCL or Rosetta if you like macOS… making those benchmarks relevant again.
This thread is about the future of Apple's Mac Pro platform. What's clear is that if you're one of Apple's decision makers, GL and CL are not relevant to that future at all. Apple is 100% committed to Metal. Any time you see Apple saying they think their GPU X is competitive with PC GPU Y, they're saying they think an Arm-native Metal app optimized for Apple GPUs should do as well as its equivalent optimized for GPU Y and DX12 (or whatever).

Predictions about the future of Mac Pro built on the performance of non-native software are just flawed. When you do that, you aren't looking at it the way Apple execs look at it at all. They're convinced that if they build something great, software will arrive, sooner or later.

It's one thing to say that gaming just can't be done on these systems and you should use other ones for it. It's another to say that buckets of common productivity software, needed by many of the people who'd be looking at these machines, just can't be used on them. It reminds me of when people were saying that Cinebench shouldn't be used to judge these systems. Well, guess what: Cinebench sure is relevant for people who use Cinema 4D… which includes plenty of people who'd like to use Macs.
It's not at all clear to me whether Cinebench is relevant to Cinema 4D users any more. It probably was in the past, but not that much anymore.

I'll start by noting that Cinebench R23 measures the performance of Intel Embree, an open source raytracing rendering library. But Cinema 4D is a lot more than Embree. To quote Maxon, "Cinema 4D is a professional 3D modeling, animation, simulation and rendering software solution." Someone doing 3D modeling and animation work would be far more concerned with Cinema 4D's interactive viewport performance, which definitely does not involve Embree.

But even when it comes to rendering, the picture isn't clear. You see, C4D can work with third party renderers. The built-in C4D renderer (presumably Embree) is positioned as an economy option (which is presumably why it's only Intel Embree). Maxon promotes using several third party renderers with C4D. They liked one of them, Redshift, so much that they bought the company in 2019. So now you can get your non-Embree renderer directly from Maxon.

In sum, real C4D users probably do not care about Embree performance to the exclusion of all else, and lots of them never run Embree raytracing at all. Weirdly enough, Cinebench is not a great benchmark for C4D users!

Because it's the same story as all the software: it doesn't matter what the systems can do if there's no software to do that. And it doesn't matter what Apple can do if they won't do that:

https://forums.macrumors.com/thread...-but-m2-extreme-chip-likely-canceled.2374045/

They're not doing it. That's why.
Why are we trusting every word Mark Gurman says as absolute truth? He's wrong about a lot of things; he can only be as good as his leaker sources, and those seem to have dried up on him in recent times.
 
  • Like
Reactions: jdb8167
Dec 4, 2022
709
1,301
Exactly. Those Nvidia cards triumph especially in benchmarks that simulate or play video games. In the business and 3D modelling apps I use, the difference between my Ultra systems and the 3090 was much smaller; the RTX 3090 was still ahead, but only by around 33% on the same projects. And after a month, the difference in energy use on the Mac systems wasn't just 33% lower, it was a lot lower, so for the moment I've ditched the 3090 PCs for another two Ultra systems. Time is money, yes, but energy use adds up quickly at multiple-system scale over a month, never mind over 12 months.
But for me, the latest rumours that the Mac Pro will not have an M Extreme with 2x the Ultra are a bummer. I don't see any reason to go Mac Pro when the same M2 Ultra will be in the Mac Studio as well.

Clock speed and other things will be better in the Mac Pro.
 

PauloSera

Suspended
Oct 12, 2022
908
1,393
Finally, I don't understand why you reflexively assume that Apple can't build something in the 4090's class. M1 Ultra is two M1 Max, and M1 Max is fundamentally a notebook chip. A very powerful one, but its thermal budget is well under 100W, and therefore M1 Ultra's is under 200W. Those numbers include CPUs (and everything else on the SoC) too, and CPUs eat about 80W out of that M1 Ultra power budget. You're contrasting these with a 450W GPU. Of course it's gonna win. Why don't you wait and see what Apple builds for a machine larger and more powerful than the Studio?
I think this is because (crazily) people assume that Apple isn't interested enough in the top 5% of Mac users to actually design such a chip. They prefer to reuse their efforts to scale the existing chip line as high as it will go, which isn't very far.

Why are we trusting every word Mark Gurman says as absolute truth? He's wrong about a lot of things; he can only be as good as his leaker sources, and those seem to have dried up on him in recent times.
Gurman doesn't know squat. This article being paraded as him reporting on a leak is really him giving an opinion that boils down to "I guess the fact that Apple hasn't released this product I've been theorizing about for 2.5 years means they just aren't going to". Really scientific and insightful on his part.
 

olavsu1

macrumors regular
Jan 3, 2022
170
85
It looks cartoonish, but I’ve got to remind everyone that it’s only a smidge thicker than the 3090 from 2020. I personally can’t wait to build a system with one. Of course, the price tag will make sure that doesn’t happen, but one can dream.
That card is an April Fools' joke.

 