
LoopsOfFury

macrumors member
Original poster
In an effort to add some actual real-world performance data to the debate over whether the M1 Max is any good for gaming, here's what I just experienced with Eve Online (the premier spaceship MMO). Eve received its first native Mac client a few weeks ago (previous versions relied on WINE).

Unfortunately, it's basically impossible to benchmark a PvP-focused game in a properly controlled environment, and sitting alone in space is entirely unrepresentative of the game. My solution was to note the typical frame rates I saw while circling the busiest trade hub in the game. They were relatively stable, but could fluctuate by 10% or more in either direction, depending on the circumstances.
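If you want to sanity-check that kind of eyeballed number, here's a minimal Swift sketch of the idea (nothing Eve-specific, all names are mine): turn a handful of sampled frame times into a mean fps and a worst-case fluctuation band.

```swift
import Foundation

/// Summarize sampled per-frame times (in seconds) as mean fps plus the
/// worst-case deviation around it, i.e. a "stable, but +/-10%" band.
func summarize(frameTimes: [Double]) -> (meanFPS: Double, spreadPercent: Double)? {
    guard !frameTimes.isEmpty else { return nil }
    let fps = frameTimes.map { 1.0 / $0 }                     // per-sample fps
    let mean = fps.reduce(0, +) / Double(fps.count)
    let maxDeviation = fps.map { abs($0 - mean) }.max() ?? 0  // worst excursion
    return (mean, 100 * maxDeviation / mean)
}

// Example: frames around 10 ms with some jitter (~100 fps)
let samples = [0.0104, 0.0098, 0.0101, 0.0110, 0.0095]
if let s = summarize(frameTimes: samples) {
    print(String(format: "%.0f fps, +/- %.0f%%", s.meanFPS, s.spreadPercent))
}
```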

Everything was done on the same 4K @ 144Hz monitor at two resolutions: 3840 x 2160 and 2560 x 1440. I only spent a few minutes on the tests to avoid thermal throttling on the MacBook Air (I don't think it would be a problem for the Pro). The first number is with antialiasing set to high, the second is with it turned off.

M1 (8 core GPU) - MBA w/16GB RAM
4K: 16 / 22 fps (AA on / off)
2.5K: 28 / 35 fps (AA on / off)

M1 Max (32 core GPU) - 16” MBP w/32GB RAM
4K: 70 / 100 fps (AA on / off)
2.5K: 115 / 120 fps (AA on / off)

RTX 3080 Ti (desktop)
4K: 140 / 140 fps (AA on / off)
2.5K: 150 / 150 fps (AA on / off)

You can see antialiasing has a large impact on the M1 (especially at 4K). I don't know whether it's a coding, Metal, driver, hardware, or other kind of issue. Regardless, the 32-core M1 Max GPU seems to scale pretty well versus the 8-core M1 GPU, and (at least with AA turned off) it should compare decently with a mobile RTX 3080 in something like a Razer Blade.
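Doing the arithmetic on the AA-off numbers, the scaling works out to roughly:

```latex
\frac{100}{22} \approx 4.5\times \quad \text{(4K)}, \qquad
\frac{120}{35} \approx 3.4\times \quad \text{(2.5K)}
```

against a 32/8 = 4x core-count ratio: better than linear at 4K (the Max's extra memory bandwidth presumably helps) and below linear at 2.5K, where something else may already be limiting.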

Speaking of which, the 16” M1 Max is way quieter than the 15” Razer Blade Advanced (RTX 3080) I just sold a few weeks ago.
 

leman

macrumors Core
Thanks for these tests! Not bad at all. This again illustrates that these GPUs are beastly rasterizers.

You can see antialiasing has a large impact on the M1 (especially at 4K). I don't know whether it's a coding, Metal, driver, hardware, or other kind of issue.

This is interesting, since AA is generally very cheap on TBDR hardware. It will also depend on how AA is implemented in Eve. It's even more surprising that the NVIDIA GPU doesn't show any performance difference here at all.
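For reference, this is roughly what cheap MSAA looks like on the Metal side of a TBDR GPU: the multisample target can be memoryless, so the samples live entirely in on-chip tile memory and only the resolved image hits DRAM. A sketch of the idea only, not Eve's actual code (Eve renders through MoltenVK anyway):

```swift
import Metal

/// Sketch of a 4x MSAA render pass on an Apple (TBDR) GPU. The
/// multisample color target is .memoryless: samples stay in tile
/// memory and only the resolve texture is written out to DRAM.
func makeMSAAPass(device: MTLDevice, resolveTarget: MTLTexture) -> MTLRenderPassDescriptor {
    let desc = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: resolveTarget.pixelFormat,
        width: resolveTarget.width,
        height: resolveTarget.height,
        mipmapped: false)
    desc.textureType = .type2DMultisample
    desc.sampleCount = 4
    desc.usage = .renderTarget
    desc.storageMode = .memoryless   // Apple GPUs: tile memory only, no DRAM backing
    let msaaTarget = device.makeTexture(descriptor: desc)!

    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = msaaTarget
    pass.colorAttachments[0].resolveTexture = resolveTarget
    pass.colorAttachments[0].loadAction = .clear
    pass.colorAttachments[0].storeAction = .multisampleResolve   // resolve as each tile is flushed
    return pass
}
```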
 

jeanlain

macrumors 68020
Not bad indeed, for a game coded against Vulkan (via MoltenVK), which may not have specific optimisations for Apple GPUs (I'm not sure how one could achieve that with Vulkan anyway).
 

crazy dave

macrumors 65816
I know it's the case for MSAA. Is FXAA or SMAA cheaper on a TBDR GPU than on an IMR GPU?

If I remember right, yes. But something like DLSS would be more expensive (obviously DLSS has a scope beyond reducing jaggies).

Edit: I stand corrected about FXAA and SMAA
 

leman

macrumors Core
I know it's the case for MSAA. Is FXAA or SMAA cheaper on a TBDR GPU than on an IMR GPU?

Those are all image-space AA methods that require you to apply filters and adaptive blur to the rendered scene. TBDR doesn't have anything to help with this. I suppose you could use tile shading to run AA on the tile in on-chip memory, but you would probably get artifacts at the tile boundaries...
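Host-side, such an image-space AA pass is just one extra full-screen draw over the finished frame. A hedged Swift sketch of that shape; the shader names here ("fullscreenTriangleVertex", "fxaaFragment") are hypothetical placeholders, and the actual filter would live in the fragment shader:

```swift
import Metal

/// Sketch of an image-space AA pass (FXAA/SMAA-style): one full-screen
/// triangle whose fragment shader filters the already-rendered scene.
/// The shader function names are hypothetical placeholders.
func encodeImageSpaceAA(commandBuffer: MTLCommandBuffer,
                        device: MTLDevice,
                        library: MTLLibrary,
                        sceneColor: MTLTexture,
                        output: MTLTexture) throws {
    let pipelineDesc = MTLRenderPipelineDescriptor()
    pipelineDesc.vertexFunction = library.makeFunction(name: "fullscreenTriangleVertex")
    pipelineDesc.fragmentFunction = library.makeFunction(name: "fxaaFragment")
    pipelineDesc.colorAttachments[0].pixelFormat = output.pixelFormat
    let pipeline = try device.makeRenderPipelineState(descriptor: pipelineDesc)

    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = output
    pass.colorAttachments[0].loadAction = .dontCare   // every pixel is overwritten
    pass.colorAttachments[0].storeAction = .store

    let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: pass)!
    encoder.setRenderPipelineState(pipeline)
    encoder.setFragmentTexture(sceneColor, index: 0)  // the finished frame to filter
    encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 3)
    encoder.endEncoding()
}
```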
 

jeanlain

macrumors 68020
My point is, do we know what type of AA Eve Online uses? I've looked at screenshots, and the settings just say "Anti Aliasing".

Also, why is the impact of AA much greater at 4K than at 2.5K? I would have thought the game was GPU-bound in both situations.
 

Serban55

Suspended
So the bottom line is that the 32-core GPU is close to the desktop 3080?!
Or more than half of the desktop 3080's performance?!
 

Pressure

macrumors 603
So the bottom line is that the 32-core GPU is close to the desktop 3080?!
Or more than half of the desktop 3080's performance?!
We only know what GPU was used, not the rest of the computer.

The RTX 3080 Ti could be bottlenecked by the processor at 2.5K resolution, since the performance drop going to 4K is quite small.
 

leman

macrumors Core
So the bottom line is that the 32-core GPU is close to the desktop 3080?!
Or more than half of the desktop 3080's performance?!

In this particular test, yes. And it's not a 3080, it's a 3080 Ti (which is basically a slightly slower RTX 3090). Although I suspect the GPU is bottlenecked by the CPU (as the performance at 2.5K and 4K is almost identical).

Edit: @Pressure was faster :)
 

urtules

macrumors 6502
Thanks, very interesting test. I thought it's better to turn off AA when running at native resolution anyway. AA hides pixel staircases, but pixels are very small on a Retina display, so you won't see the staircase effect.
 

jeanlain

macrumors 68020
In this particular test, yes. And it's not a 3080, it's a 3080 Ti (which is basically a slightly slower RTX 3090). Although I suspect the GPU is bottlenecked by the CPU (as the performance at 2.5K and 4K is almost identical).
The fact that AA has no impact at all on performance when running the game on the PC strongly suggests that the GPU is not saturated.
So we cannot conclude anything WRT the M1 Max GPU versus the RTX GPU.
This game is simply not demanding enough to test an RTX 3080 Ti, even at 4K.
 

Serban55

Suspended
We only know what GPU was used, not the rest of the computer.

The RTX 3080 Ti could be bottlenecked by the processor at 2.5K resolution, since the performance drop going to 4K is quite small.
The CPU is an i9-11900K, so it's impressive.
 

Serban55

Suspended
In this particular test, yes. And it's not a 3080, it's a 3080 Ti (which is basically a slightly slower RTX 3090). Although I suspect the GPU is bottlenecked by the CPU (as the performance at 2.5K and 4K is almost identical).

Edit: @Pressure was faster :)
The CPU is an i9-11900K. And again, this is the third game, after Baldur's Gate 3 and WoW Shadowlands, that places this at least on the same level as a mobile NVIDIA 3070. That's impressive to me.
Not to mention how Maya runs now compared with my former 16", but that's something different. So even in games this performs very well. I wonder if all of this is capped at 55-60W from the GPU itself.
 

Serban55

Suspended
The fact that AA has no impact at all on performance when running the game on the PC strongly suggests that the GPU is not saturated.
So we cannot conclude anything WRT the M1 Max GPU versus the RTX GPU.
This game is simply not demanding enough to test an RTX 3080 Ti, even at 4K.
If this is not demanding enough, then why is it not running at the full 144 fps, and why does it max out at 140 fps at 4K?
In Eve it's 140 fps at 4K, while a genuinely undemanding game like CS:GO reaches over 210 fps at 4K no matter what (without a cap).
 

jeanlain

macrumors 68020
If this is not demanding enough, then why is it not running at the full 144 fps, and why does it max out at 140 fps at 4K?
Because the CPU cannot compute more frames.
If Vsync is off, the GPU may even go beyond 144 fps. The results even show 150 fps at 2.5K.
(EDIT: on the Mac, Vsync may be enforced, so that could be a factor here. It'd be interesting to test at a very low resolution. I'm not sure Vsync can be disabled on Apple GPUs using Metal.)

EDIT: a way to evaluate whether a game is limited by GPU performance is to check whether settings that only affect GPU performance change frame rates. These settings typically are:
- screen resolution, first and foremost. Increasing resolution does not incur more work on the CPU, only on the GPU.
- anti-aliasing, which is AFAIK only done by the GPU.
The same may be true of other settings like anisotropic filtering and tessellation, but I'm not sure.

In this case, going from 2.5K to 4K multiplies the pixel count by 2.25, yet it has very little impact on performance, and AA has none.
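The pixel arithmetic, for the record:

```latex
\frac{3840 \times 2160}{2560 \times 1440} = \frac{8\,294\,400}{3\,686\,400} = 2.25
```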
 

quarkysg

macrumors 65816
Increasing resolution does not incur more work on the CPU, only on the GPU.
Could it be that increased resolution results in larger textures being used, meaning the CPU pushes more data over to the GPU?
 

Serban55

Suspended
Because the CPU cannot compute more frames.
If Vsync is off, the GPU may even go beyond 144 fps. The results even show 150 fps at 2.5K.
(EDIT: on the Mac, Vsync may be enforced, so that could be a factor here. It'd be interesting to test at a very low resolution. I'm not sure Vsync can be disabled on Apple GPUs using Metal.)

EDIT: a way to evaluate whether a game is limited by GPU performance is to check whether settings that only affect GPU performance change frame rates. These settings typically are:
- screen resolution, first and foremost. Increasing resolution does not incur more work on the CPU, only on the GPU.
- anti-aliasing, which is AFAIK only done by the GPU.
The same may be true of other settings like anisotropic filtering and tessellation, but I'm not sure.

In this case, going from 2.5K to 4K multiplies the pixel count by 2.25, yet it has very little impact on performance, and AA has none.
Eve Online is not like StarCraft: it's GPU-hungry, not CPU-hungry.
An 8-core i9-11900K is more than enough.
Again, if an i9-11900K is not enough for PC gaming, then PC gaming sucks, keeping in mind that most games are built around the GPU and not the CPU.
But again, I'm not here to dispute it; the results are on par with the other two native games, which put this at the same level as a mobile NVIDIA 3070.
So Apple didn't lie; of course they took their best performance from a specific task and put it on the chart.
 

leman

macrumors Core
The fact that AA has no impact at all on performance when running the game on the PC strongly suggests that the GPU is not saturated.
So we cannot conclude anything WRT the M1 Max GPU versus the RTX GPU.
This game is simply not demanding enough to test an RTX 3080 Ti, even at 4K.

That is true. But for all practical purposes, I guess the conclusion is that the M1 can play Eve just fine :)
 

Serban55

Suspended
Bottom line: for proper games that take advantage of Apple's hardware, this GPU is impressive for 55-60W.
Have a nice day, all of you; I'm off to play with Maya.
 

Serban55

Suspended
Look, I'm just concluding from the results, not from a theoretical standpoint. I don't even need to know what game it is or on what config. A mere 7% decrease in fps going from 2.5K to 4K pretty much says it all. The game is not limited by the GPU at those settings.

BTW: apparently Vsync cannot be disabled in Eve Online on Monterey:
https://www.reddit.com/r/macgaming/comments/qh26yg
My 16" M1 Max shipped with Monterey :(((
I'm out of here
 

jeanlain

macrumors 68020
Metal generally enforces Vsync. I think it can be disabled under certain conditions, but I'm not even sure it's possible to disable Vsync with Metal on Apple GPUs. AFAIK it's impossible on iPhones and iPads, but I could be wrong.

Who needs to disable it anyway? Screen tearing is just awful, and with ProMotion there is really no point in disabling Vsync. It's mostly useful for benchmarking, and recent built-in benchmark tools have ways to estimate frame rates with Vsync on.
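For what it's worth, on macOS the Metal layer does expose a Vsync switch; there's no equivalent property on iOS/iPadOS, which matches the iPhone/iPad situation. Whether a given game (or MoltenVK) surfaces it is another matter. A minimal sketch:

```swift
import QuartzCore

// macOS-only: CAMetalLayer can opt out of display sync (Vsync).
// No equivalent property exists on iOS/iPadOS.
let layer = CAMetalLayer()
#if os(macOS)
layer.displaySyncEnabled = false   // present as fast as frames are ready (may tear)
#endif
```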
 