
jeanlain

macrumors 68020
Mar 14, 2009
2,462
957
OpenCL isn’t emulated, but it is a deprecated API. Apparently it’s buggy and may not be all that well optimized beyond “it works … mostly”.
I think @leman found that OpenCL is emulated, as there is no OpenCL driver for the M1. Just like OpenGL.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,462
957
3.45x faster than the M1. As fast as an RTX 3070 laptop, or about 15-20% less than the RTX 3080 laptop or RX 6800M. That's crazy to achieve at only 60 watts. Apple truly created a monster!
Since GFXbench may be the Metal test that is the most favorable to Apple GPUs, I'm not sure how Apple could determine that the M1 Max was about as fast as the mobile 3080. I expect it to be at least 30% slower in other tests.

And why is the M1 Max not 4x faster than the M1? It seems that everything is at least 4x better. It even has 6x the memory bandwidth. Are results constrained by latency or something?
 

crazy dave

macrumors 65816
Sep 9, 2010
1,453
1,229
I think @leman found that OpenCL is emulated, as there is no OpenCL driver for the M1. Just like OpenGL.

Oh? Interesting. Thanks for the info. I had thought there was a driver but it was crap :). Hard to say how much of a hit you get from that; it's probably translating commands on the CPU, which by itself shouldn't be *too* bad of a hit for a GPU-bound task (which you hope it would be), but it also depends on how optimized and performant the translation is (i.e., what can be accelerated in hardware vs. software). So yeah, could be a big hit.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,679
Why just 70k? The M1 scores > 20k, so one should expect ~80k.

OpenCL scores for M1 are roughly 10-15% lower, so if this score is accurate we should get ~70k in Metal.
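For reference, here's the back-of-the-envelope math as a quick sketch (the ~60k OpenCL figure is my reading of the leaked score, and the 10-15% gap is the one quoted above; both are assumptions, not measurements):

```swift
// Quick estimate: if OpenCL scores run ~10-15% below Metal on Apple GPUs,
// a leaked OpenCL score of ~60k (assumed) implies a Metal score near 70k.
let leakedOpenCL = 60_000.0        // assumed leaked M1 Max OpenCL score
let gap = 0.10...0.15              // quoted OpenCL-vs-Metal shortfall

let metalLow  = leakedOpenCL / (1.0 - gap.lowerBound)   // ≈ 66.7k
let metalHigh = leakedOpenCL / (1.0 - gap.upperBound)   // ≈ 70.6k
print("Estimated Metal score: \(Int(metalLow))...\(Int(metalHigh))")
```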

Since GFXbench may be the Metal test that is the most favorable to Apple GPUs, I'm not sure how Apple could determine that the M1 Max was about as fast as the mobile 3080. I expect it to be at least 30% slower in other tests.

These GPUs will be masters when it comes to rasterization (= gaming) performance anyway; TBDR paired with that RAM bandwidth and massive caches is something else.

Apple probably used some pro creative workflow, like rendering or video editing. Apple GPUs of the current generation cannot measure up to Ampere in pure trivial number-crunching tasks; Ampere has a very clear ALU advantage. But Apple GPUs will excel at more complex tasks that require work synchronization between multiple GPU cores or more complex memory access patterns.

And why is the M1 Max not 4x faster than the M1? It seems that everything is at least 4x better. It even has 6x the memory bandwidth. Are results constrained by latency or something?

There are multiple factors that can limit GPU scalability. One problem with GFXbench is that it's simply not demanding enough. Look at the FPS you are getting there. It's a piece of cake for these GPUs, so you lose a lot of work on various setup and synchronization tasks.
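To put a rough number on that, here's a toy Amdahl's-law sketch (the 10% overhead figure is made up for illustration, not a measurement):

```swift
// Toy Amdahl's-law model: if a fixed fraction of each frame is serial
// setup/synchronization work that doesn't scale with GPU cores,
// quadrupling the cores yields less than a 4x speedup.
func speedup(parallelFraction p: Double, coreFactor n: Double) -> Double {
    1.0 / ((1.0 - p) + p / n)
}

// If 90% of the frame scales with cores and 10% is fixed overhead,
// 4x the cores gives only ~3.08x the throughput.
print(speedup(parallelFraction: 0.90, coreFactor: 4))  // ≈ 3.08
```

Even a modest 10% of non-scaling work per frame would cap a 4x-wider GPU at about 3.1x, which is in the ballpark of the 3.45x figure above.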
 
  • Like
Reactions: Stratus Fear

Fomalhaut

macrumors 68000
Oct 6, 2020
1,993
1,724
Might as well base it on marketing then, since Metal is a proprietary, single-platform API, so you can't compare performance across platforms.
But you can compare application workflows across platforms, as YouTube channels like MaxTech & Alex Ziskind do all the time.

The only benchmark that matters is your own workload, whatever it is. Be it fps in a game, video rendering time, the number of photos you can edit in an hour, or some application that uses GPU compute.

We will see these results in the coming weeks, and the truth shall be revealed.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,462
957
OpenCL scores for M1 are roughly 10-15% lower, so if this score is accurate we should get ~70k in Metal.
Assuming the OpenCL score is accurate. I find it a bit low. The M1 Max is not even 2x faster than the M1 in certain sub-tests.
 

Malus120

macrumors 6502a
Jun 28, 2002
696
1,456
Since GFXbench may be the Metal test that is the most favorable to Apple GPUs, I'm not sure how Apple could determine that the M1 Max was about as fast as the mobile 3080. I expect it to be at least 30% slower in other tests.

And why is the M1 Max not 4x faster than the M1? It seems that everything is at least 4x better. It even has 6x the memory bandwidth. Are results constrained by latency or something?

I think that while it's nice to have these leaks to drive discussion, the reality is they're just that: leaks. We don't know who did the benchmarking, the conditions under which the tests were run, whether the silicon was preproduction or "retail," whether they used the 24C or 32C M1 Max, or even how accurate (if at all) these leaks really are.

We ALSO don't know what software Apple used to conduct its own benchmarking, making comparison to these leaks, even if the leaks are 100% accurate, somewhat dubious at best.

What we do know is: Apple was confident enough in the M1 Max to not only compare it to the 3080m but to compare it against three different laptops featuring the 3080m, provide relatively detailed (for Apple) footnotes, and to compare it against multiple different TDP configurations of said 3080m.
This isn't to say Apple isn't putting its own silicon in the best possible light. I'm absolutely sure it is. But a less confident Apple would not have given that much detail. They would've just compared against one ambiguous "high end PC laptop" without giving any details, or provided numbers compared to a generic 3070m/3080m without detailing the specific laptops/TDP configurations they were comparing against.

Now, I'm not saying this means we should take Apple's claims at face value, or that the M1 Max will equal/come close to the 3080m in every, or even a majority of, software. Merely that I've watched Apple for... two decades (?) or so now, and I've almost never seen them provide this level of detail on the competing products they reference. They clearly feel confident, so I think the best (albeit perhaps unsatisfying) answer to your question is: we're just going to have to wait for detailed benchmarking from reputable outlets after the embargo is lifted.
 

smirking

macrumors 68040
Aug 31, 2003
3,942
4,009
Silicon Valley
OK, you all are way smarter and more educated at this on the memory bandwidth etc. BUT...
All this doesn't do me any good. Because at the end of the day why do most people buy a 3080/3090?

Well, I'm considering a Max so I can flip through real-time renders of my catalog of RAW photos with as little lag as possible. A split second actually means something to me because it makes it easier for me to identify which photos to keep and which to toss. That seemingly insignificant bump in speed means my brain doesn't have to work as hard.
 

Anonymous420

macrumors newbie
Oct 11, 2021
25
3
Compared to other workstation-class laptops, the new M1 MBPs are much better. Compared to cheap gaming laptops with poor portability, CPU, battery life, connectivity and displays, the new M1 MBPs are worse.
What a surprise: the GPU in the M1 Max is worse value (GPU-wise) compared to a cheap gaming laptop which has terrible everything else!
 
  • Like
Reactions: Rashy

Erasmus

macrumors 68030
Jun 22, 2006
2,756
300
Australia
Forget performance per watt, what about performance per dollar?
If you want performance, you buy a Windows desktop, at least until the Mac Pro.

If you want performance per dollar, you buy a Windows desktop.

If you want performance per watt, you get the Mac.

If you want portable performance, you buy the Mac, because it’s the only one that you can use on your lap, not plugged into the wall, at full speed.

It’s expensive. If people don’t see its value, buy something else.
 

smirking

macrumors 68040
Aug 31, 2003
3,942
4,009
Silicon Valley
Well, I had been dithering on what to get or whether I even wanted to buy an M1X yet, but reading this thread convinced me. I just put in an order for an M1 Max 32/32/2TB. I was going to go with just an M1 Pro, but it appears that the M1 Max has some additional advantages, and if this saves me some time, it'll be worth every penny.

I know we're quibbling about whether the improved benchmarks are life-changing or merely very impressive. Very impressive is already sufficient reason for me to make the leap. I test-drove a 13" M1 earlier and I already know what these new processors can do. I'm looking forward to getting this machine.

I hope I'm done buying laptops for at least a good 5 years. It's not that I'm frugal. I just hate the process of migrating all of my stuff and getting my new computer ready with a passion. I like moving into a place and getting comfortable and just staying there.

I'm gonna miss this butterfly keyboard. (Just had to get that in.) :p No seriously. I will.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,679
Oh? Interesting. Thanks for the info. I had thought there was a driver but it was crap :). Hard to say how much of a hit you get from that; it's probably translating commands on the CPU, which by itself shouldn't be *too* bad of a hit for a GPU-bound task (which you hope it would be), but it also depends on how optimized and performant the translation is (i.e., what can be accelerated in hardware vs. software). So yeah, could be a big hit.

The layer likely translates OpenCL kernels directly into Metal IR, and OpenCL API calls are trivially converted into Metal API calls. It should be fairly efficient overall. I have no idea how much overhead there is because of API mismatches…
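To make that concrete, here is a hypothetical sketch of the Metal side such a shim could target (my own simplification, not Apple's actual translation layer): something like clEnqueueNDRangeKernel maps fairly directly onto a Metal compute dispatch.

```swift
import Metal

// Hypothetical sketch of the Metal calls an OpenCL-to-Metal shim might
// emit for a kernel dispatch (illustrative only, not Apple's code).
func dispatchTranslatedKernel(device: MTLDevice,
                              pipeline: MTLComputePipelineState,
                              buffer: MTLBuffer,
                              globalWorkSize: Int) {
    guard let queue = device.makeCommandQueue(),
          let cmdBuf = queue.makeCommandBuffer(),
          let encoder = cmdBuf.makeComputeCommandEncoder() else { return }

    encoder.setComputePipelineState(pipeline)       // ~ the compiled OpenCL kernel
    encoder.setBuffer(buffer, offset: 0, index: 0)  // ~ clSetKernelArg

    // OpenCL's global work size becomes a thread grid in Metal.
    let groupWidth = pipeline.threadExecutionWidth
    let threadsPerGroup = MTLSize(width: groupWidth, height: 1, depth: 1)
    let grid = MTLSize(width: globalWorkSize, height: 1, depth: 1)
    encoder.dispatchThreads(grid, threadsPerThreadgroup: threadsPerGroup)

    encoder.endEncoding()
    cmdBuf.commit()                                 // ~ flush/enqueue
}
```

If the mapping really is this direct, the per-dispatch overhead should be small; most of any performance gap would come from kernel compilation quality or API semantics that don't line up one-to-one.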
 

leman

macrumors Core
Oct 14, 2008
19,521
19,679
All this doesn't do me any good. Because at the end of the day why do most people buy a 3080/3090? To PLAY games.

Nobody is buying the 3080/3090. In the Steam hardware survey those GPUs account for less than 1.5% of the user base. The most popular gaming GPUs actually in use currently are around the 1660 level, which should be on par with the 14-core M1 Pro. Or, to put this into perspective, the base 14" M1 Pro has better gaming capability than approximately half of the PCs currently used for gaming, desktops included.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
Nobody is buying the 3080/3090. In the Steam hardware survey those GPUs account for less than 1.5% of the user base. The most popular gaming GPUs actually in use currently are around the 1660 level, which should be on par with the 14-core M1 Pro. Or, to put this into perspective, the base 14" M1 Pro has better gaming capability than approximately half of the PCs currently used for gaming, desktops included.
Not to sound too difficult, but why would game developers waste resources (and money) on high-quality textures and shadows and things like ray tracing if "none" of their userbase can utilize it?
 

leman

macrumors Core
Oct 14, 2008
19,521
19,679
Not to sound too difficult, but why would game developers waste resources (and money) on high-quality textures and shadows and things like ray tracing if "none" of their userbase can utilize it?

Because what sells games is hype :) I mean, gamers get excited about the latest tech even if they themselves can't afford it. And of course, there is the high-end gaming market that encompasses millions of users. It's just that it's a small part of the overall userbase that plays games.

This is also why I see no point in Apple targeting „gamers“. Gamers (in the narrow sense) are a very particular subculture that is notoriously resistant to change, extremely tribalist, aggressive and not very rational. Gamers are a dangerous business.
 
  • Like
Reactions: Rashy

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
Because what sells games is hype :) I mean, gamers get excited about the latest tech even if they themselves can't afford it. And of course, there is the high-end gaming market that encompasses millions of users. It's just that it's a small part of the overall userbase that plays games.

This is also why I see no point in Apple targeting „gamers“. Gamers (in the narrow sense) are a very particular subculture that is notoriously resistant to change, extremely tribalist, aggressive and not very rational. Gamers are a dangerous business.
It seems to work well for Nvidia? lol (you can swap Nvidia for Sony or MS as well)

But I do understand your point (sadly I might add).
 

leman

macrumors Core
Oct 14, 2008
19,521
19,679
It seems to work well for Nvidia? lol (you can swap Nvidia for Sony or MS as well)

Nvidia has been doing it for a very long time, and even they have diversified into the supercomputer and machine learning markets… MS was smart back in the day and rolled their own proprietary API (something Apple is being criticized for today). For MS, it was the right choice.
 

Anonymous420

macrumors newbie
Oct 11, 2021
25
3
If you want performance, you buy a Windows desktop, at least until the Mac Pro.

If you want performance per dollar, you buy a Windows desktop.

If you want performance per watt, you get the Mac.

If you want portable performance, you buy the Mac, because it’s the only one that you can use on your lap, not plugged into the wall, at full speed.

It’s expensive. If people don’t see its value, buy something else.
Thanks :)
 

smirking

macrumors 68040
Aug 31, 2003
3,942
4,009
Silicon Valley
If you want performance per watt, you get the Mac.

If you want portable performance, you buy the Mac, because it’s the only one that you can use on your lap, not plugged into the wall, at full speed.

I have a 2018 i7 that's meeting my needs, but these two reasons are why I'm upending my usual habit of upgrading no more than once every 4 years. My Intel Mac is portable enough to bring anywhere, but I always need to be near a power source or I can't work on it for long. Its advertised 11-hour battery life is more like 2-4 for me depending on what I'm doing.
 