Agree and disagree. First of all, you're forgetting the largest market for games, which is the youth. Your old ass is not the priority as a market demographic. Another thing you don't account for is that the younger generations (the ones affluent enough for AAA games) have a strong preference for Apple products.
Apple as a company has plenty of gaming market share to gobble up if they're truly getting serious. Gaming is a growing industry and will continue to grow, and the fact is that there's more than enough space for everyone to eat.
But incentive is indeed necessary, and Apple would need to embrace and partner with third-party providers such as Steam and Epic on the platform in order for this to be a convincing foray into the world of gaming.

There are currently a huge number of macOS users, and true, most of them aren't gamers, but I bet if some key multiplayer games launched on the platform it would still make for nice gaming on the side. Once there's a decent multiplayer segment, single-player games will naturally be bolstered because there are more gamers using the platform. Yes, it will take time, but it is most definitely not dead before it begins, as you stated.
Sounds like you are betting the Apple AAA gaming farm on “young” people. The same ones that have been moving back home in record numbers due to economic factors, hoping to get student loans written off, and worried about employment prospects. They are going to buy a ****ton of Macs, and enough of those buyers are going to play AAA games on their Macs to make it worth developers’ investment. You know what I mean, because you felt it necessary to throw in the affluence disclaimer in your response, yet you think the number of Mac AAA players will be enough. Good luck with that.

And let’s not forget that chances are that any of those really young people who are predisposed to AAA gaming likely got into it with a console or home PC (thanks, “old” mom/dad) years before they get in an economic position to drop $2400+ on a Mac suited to AAA games as an adult. They already know they can spend $600-$800 on a console, peripherals and a couple of games, party up with their gamer friends and enjoy a high-end gaming experience.

While the Apple mobile gaming picture is good, AAA gaming on a Mac is still an afterthought to Apple, and the likelihood of future AAA gamers seeing the Mac as their first choice is effectively zero without a liberal application of fairy dust.

Finally, my “old ass” has been around every other time Apple has hinted they want to get into gaming, yet here we are. We are commenting on a thread about a game because it is apparently significant that this game just became available on Mac nine months after it was released on consoles and PC. Pathetic.
 
That's the binned chip.
That doesn't change the facts, and the full chip doesn't really run faster. Someone tested the M3 Max with the full chip, and it still performed about the same as an RTX 4060. That's still far from a mobile RTX 4080, and you still haven't proven anything.
 
Would be interesting to see the fps on the windows laptops when running on battery.
That's pointless. Even an Apple Silicon Mac consumes too much power, and the fans run fast; it requires a fast charger in order to play RE4. In less than two hours the battery will die.
 
Your words: "Cinebench R24 shows different result which is more reliable."

If you don't feel that is important then why share that?

I just showed that an earlier version of the benchmark, from another source, has things very close between the 4080 and the M3 Max (extrapolated from the 4090 result).

I'm not saying this is conclusive; these are just two sources, and they're highly divergent. I'd prefer to understand instead of outright calling you a troll, but hey, you went there. It seems like you've already made up your mind regardless and don't really care about data.

I already addressed what you claim about BG3, but then again, you provided no source that backs up the claim anyway, so I'm not even sure to what degree it's true.

In any case, I'm done here. Clearly nothing is to come out of this and you're just wasting my time.
I have a loaded M3 Max MBP 16 (40-core GPU/64 GB RAM) and yes, the FPS does come close to my Asus G18 4080 in native games like BG3, but the graphics fidelity isn't the same. Even on max settings, the draw distance, lighting, shadow and water quality, etc. are still notably worse than max settings on a 4080. It's roughly about the same as BG3's medium settings on the Windows laptop. The Mac version also has irregular frametimes and is more prone to stutter. I really wish the YouTubers would compare graphics quality and frame consistency/smoothness, not just framerates.
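For anyone who wants to put numbers on “smoothness”, here's a minimal sketch (the frame times are hypothetical, purely to illustrate the metrics) of the frame-consistency stats I'd love reviewers to publish alongside average fps:

```python
# Sketch: summarize frame pacing from a capture of per-frame times (ms).
# The numbers below are hypothetical, for illustration only.
import statistics

frame_times_ms = [11.1, 11.3, 10.9, 25.0, 11.2, 11.0, 32.4, 11.4, 11.2, 11.1]

avg_fps = 1000 / statistics.mean(frame_times_ms)

# "1% low"-style metric: fps implied by the worst 1% of frames
# (with this tiny sample, that's just the single worst frame).
k = max(1, len(frame_times_ms) // 100)
worst = sorted(frame_times_ms)[-k:]
one_pct_low_fps = 1000 / statistics.mean(worst)

jitter_ms = statistics.stdev(frame_times_ms)  # frame-time variability

print(f"avg: {avg_fps:.1f} fps, 1% low: {one_pct_low_fps:.1f} fps, jitter: {jitter_ms:.2f} ms")
```

Two captures can share the same average fps while one has far worse 1% lows and jitter, which is exactly the stutter I'm describing.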
 
Keep in mind the quoted post used one example with a binned chip (infer a ~30% improvement over that) and another targeted @ 60fps w/ a prerelease build.
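For context, that ~30% inference is just core-count arithmetic, assuming near-linear GPU scaling (a simplification; real scaling usually lands a bit below linear):

```python
# Back-of-envelope: expected uplift of the full M3 Max (40 GPU cores)
# over the binned part (30 cores), assuming near-linear scaling.
binned_cores, full_cores = 30, 40
uplift = full_cores / binned_cores - 1
print(f"~{uplift:.0%} more cores")  # ~33%, hence "infer a ~30% improvement"
```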

We probably have a couple generations before the top Max offering is at parity w/ the top laptop Nvidia GPU... probably the first 2nm offering. It's clear Apple is catching up quickly despite the massive efficiency disparity.

The bigger issue is of course getting developers on board w/ high quality ports/releases.
It is a moving target. The 4XXX laptop chips are fabbed on an inferior node vs. the M3 (along with just a 10nm Intel CPU, but that's another story). I'm pretty sure Nvidia is going to move up TSMC's nodes with their next gen, so I'm not sure Apple will be catching up anytime soon. My M3 Max still lasts much longer than my 4080 laptop. Surprisingly, if I undervolt and adjust the fan profiles on my Asus G18, I can get it to run about as quiet as the M3 Max MBP 16 when gaming. The battery still sucks though.
 
That doesn't change the facts, and the full chip doesn't really run faster. Someone tested the M3 Max with the full chip, and it still performed about the same as an RTX 4060. That's still far from a mobile RTX 4080, and you still haven't proven anything.

Please provide the comparison with the full M3 Max w/ uncapped frame rates. It doesn't seem one exists on YouTube or elsewhere that I can find. So far you've repeatedly made your case with compromised data and results.

I have a loaded M3 Max MBP 16 (40-core GPU/64 GB RAM) and yes, the FPS does come close to my Asus G18 4080 in native games like BG3, but the graphics fidelity isn't the same. Even on max settings, the draw distance, lighting, shadow and water quality, etc. are still notably worse than max settings on a 4080. It's roughly about the same as BG3's medium settings on the Windows laptop. The Mac version also has irregular frametimes and is more prone to stutter. I really wish the YouTubers would compare graphics quality and frame consistency/smoothness, not just framerates.

What about some of the comments further up in this thread regarding the port's lacking optimization? I don't disagree that quality is going to be compromised for many of these initial efforts, but that's more a development thing than a capability thing, no? And it can vary from game to game.
 
It is a moving target. The 4XXX laptop chips are fabbed on an inferior node vs. the M3 (along with just a 10nm Intel CPU, but that's another story). I'm pretty sure Nvidia is going to move up TSMC's nodes with their next gen, so I'm not sure Apple will be catching up anytime soon. My M3 Max still lasts much longer than my 4080 laptop. Surprisingly, if I undervolt and adjust the fan profiles on my Asus G18, I can get it to run about as quiet as the M3 Max MBP 16 when gaming. The battery still sucks though.

I guess that's fair. It appears the 3080 -> 4080 was about a 50% improvement over 2 years, and the M1 Max -> M3 Max saw similar gains.
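Rough math on that cadence, purely illustrative:

```python
import math

# ~50% more performance every 2 years implies this compound annual rate:
annual = 1.5 ** 0.5 - 1
print(f"~{annual:.0%}/year")  # ~22%/year

# If one side were ~2x behind and the leader stood still (it won't),
# closing the gap at that pace would take roughly:
print(f"{math.log(2) / math.log(1 + annual):.1f} years")  # ~3.4 years
```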

Do you feel there will be a bigger push/innovation in the next few years due to LLM inference needs?
 
I just provided the actual figures with calculations, comparing the source I used against the source you provided.

Until you can do the same, it's safe to assume you're not approaching this discussion in good faith, have nothing constructive to add, and can't be taken seriously.


People always cherry-pick the worst result to prove a point. In the same video you can clearly see that the M3 Max is on par with gaming CPUs and GPUs like the AMD 7950X/7800X3D and Nvidia RTX 4090 in Baldur’s Gate 3 at 1080p. It's also as fast as the Alienware M18 with an Intel i9-13900HX and RTX 4080 at 1440p.

[screenshots attached]



1080p is the most common resolution among gamers on Steam, at 60%. 1440p is the second most used, at 16%. 2160p/4K is used by only 3.72%. So about 90% of gamers use a resolution of 1440p or lower. So where it matters, the M3 Max can, as you say, be as good as a 4080 or even a 4090 in games.

[screenshot attached]
 
Don't bother discussing with that person. As you've already seen, they start every conversation with “Macs/Apple GPUs suck.” They're pretty well known for this across the forum, and of course they always cherry-pick the worst result to prove their point.

In the same video you can clearly see that the M3 Max is on par with gaming CPUs and GPUs like the AMD 7950X/7800X3D and Nvidia RTX 4090 in Baldur’s Gate 3 at 1080p. It's also as fast as the Alienware M18 with an Intel i9-13900HX and RTX 4080 at 1440p.

[screenshots attached]


1080p is the most common resolution among gamers on Steam, at 60%. 1440p is the second most used, at 16%. 2160p/4K is used by only 3.72%. So about 90% of gamers use a resolution of 1440p or lower. So where it matters, the M3 Max can, as you say, be as good as a 4080 or even a 4090 in games.

[screenshot attached]

I have both an M3 Max and an Asus G18 4080 running BG3. As I alluded to earlier, the M3 Max may match the mobile 4080 in FPS, but that's where it ends. The 4080 has more detailed rendering, and its draw distance blows away the Apple Silicon port of BG3. The frame times are also much more consistent, which is a big issue for me with the M3 Max port, as it's not as smooth. This may just be a poor decision by Larian to cut the macOS development team before they could fully optimize the port. If I lower the BG3 settings to medium on my 4080, which is close to the max detail setting on the Mac version, the FPS jumps to over 140 on the 4080. Apple should do something like FreeSync/G-Sync on their MacBooks, as it would definitely improve the experience.
 
Interesting. I'm not sure a 4090 owner is going to game at 1080p (especially in BG3). Look at how low the GPU usage is in that picture you posted.

Of course not. That's a CPU test, which is why they chose 1080p, but that's also the resolution most players use, and even at 1440p it's as fast as the mobile 4080.
 
Of course not. That's a CPU test, which is why they chose 1080p, but that's also the resolution most players use, and even at 1440p it's as fast as the mobile 4080.
It is curious that no one is doing comparisons of RE4 on macOS and PC.
 
I suppose there may be any number of benchmarks out there, but according to Geekbench the M3 Max barely edges out a laptop 3060. Even if you look at their Mac Benchmarks (OpenCL), the 40-core M3 Max just edges out a desktop 3060. Curious to see what cross-platform benchmark results you found.

Maybe you are confused. Macs don't use OpenCL; they have their own graphics API called Apple Metal, and comparisons of apps optimized for Metal against Nvidia hardware give different results than you are saying. Again, you are creating confusion, maybe because you are confused.
 
Maybe you are confused. Macs don't use OpenCL; they have their own graphics API called Apple Metal, and comparisons of apps optimized for Metal against Nvidia hardware give different results than you are saying. Again, you are creating confusion, maybe because you are confused.

Yes, OpenCL and OpenGL were deprecated over five years ago, at WWDC 2018 with Mojave 10.14, and people still use them to make a point.

 
Maybe you are confused. Macs don't use OpenCL; they have their own graphics API called Apple Metal, and comparisons of apps optimized for Metal against Nvidia hardware give different results than you are saying. Again, you are creating confusion, maybe because you are confused.
Perhaps you are confused about the purpose of the citation. The discussion at the time was about comparisons between Apple silicon and Nvidia GPUs. While you were busy telling others they were confused or creating confusion (I think I see a trend there), I looked for some kind of a benchmark and found Geekbench.

We all know about the Metal API. Geekbench tests using both OpenCL and Metal on Macs. The only available Geekbench results that apply to both Macs and Nvidia are the OpenCL ones because that API is cross-platform. Geekbench Metal results can only be used to compare literally “Apples to Apples”. I pointed to the OpenCL Mac chart because those are the only numbers that can reasonably be compared to the Nvidia numbers.

Now, is it likely that at this point the Metal API is better optimized than OpenCL on Macs? Sure. In the end these discussions comparing different architectures will still come down to hearsay without a definitive, fair cross-platform benchmark and it doesn’t look like we’ll ever get one since Metal is the way forward for Apple.

Hope this helps.
 
I looked for some kind of a benchmark and found Geekbench. We all know about the Metal API. Geekbench tests using both OpenCL and Metal on Macs. The only available Geekbench results that apply to both Macs and Nvidia are the OpenCL ones because that API is cross-platform. Geekbench Metal results can only be used to compare literally “Apples to Apples”. I pointed to the OpenCL Mac chart because those are the only numbers that can reasonably be compared to the Nvidia numbers.

Now, is it likely that at this point the Metal API is better optimized than OpenCL on Macs? Sure. In the end these discussions comparing different architectures will still come down to hearsay without a definitive, fair cross-platform benchmark and it doesn’t look like we’ll ever get one since Metal is the way forward for Apple.

While you’re trying to act innocent, your answers are disingenuous. You wrote ”I suppose there may be any number of benchmarks out there”, but still you chose the one with outdated results. There's nothing reasonable about comparing an old deprecated API on the Mac like OpenCL/OpenGL with the latest version on Windows. The latest OpenCL is version 3.0.14 on Windows but 1.2 on macOS.

You could have chosen cross-platform tools like GFXBench and 3DMark, which use Metal and whose results are comparable with Windows, but you didn't, and you chose to make a strong statement that the M3 Max ”barely edges out a laptop 3060”. Someone who claims to know about Metal shouldn't make such a misleading comparison and keep claiming ”comparing different architectures will still come down to hearsay without a definitive, fair cross-platform benchmark and it doesn’t look like we’ll ever get one.”

If you didn't even know about any other benchmark tools, since you wrote ”I looked for some kind of a benchmark and found Geekbench”, you shouldn't draw such certain conclusions.
 
While you’re trying to act innocent, your answers are disingenuous. You wrote ”I suppose there may be any number of benchmarks out there”, but still you chose the one with outdated results. There's nothing reasonable about comparing an old deprecated API on the Mac like OpenCL/OpenGL with the latest version on Windows. The latest OpenCL is version 3.0.14 on Windows but 1.2 on macOS.

You could have chosen cross-platform tools like GFXBench and 3DMark, which use Metal and whose results are comparable with Windows, but you didn't, and you chose to make a strong statement that the M3 Max ”barely edges out a laptop 3060”. Someone who claims to know about Metal shouldn't make such a misleading comparison and keep claiming ”comparing different architectures will still come down to hearsay without a definitive, fair cross-platform benchmark and it doesn’t look like we’ll ever get one.”

If you didn't even know about any other benchmark tools, since you wrote ”I looked for some kind of a benchmark and found Geekbench”, you shouldn't draw such certain conclusions.
That was not my “conclusion”, as you say. In that benchmark, the 3060 comparison was correct based on the numbers. Maybe you missed the part where I asked for other benchmarks for my edification in the Geekbench post (#58). Later in the thread someone posted an article pointing to GFXBench. Maybe you also missed the post where I took the time to go look at it, where the comparison did not look favorable to the M3 either, and again I asked for feedback on what I saw (#66) at the GFXBench site (I even provided a link).

If you had read any of my other posts (#5, for example) you’d realize I don’t even have a dog in the GPU fight because I stopped gaming on Macs years ago and I’m not likely to go back. Doesn’t mean I am not curious about the state of the technology. Be mad if you want. I’m not. I don’t have to wait to play the latest games with friends.

edit: wrong post #
 
That was not my “conclusion”, as you say. In that benchmark, the 3060 comparison was correct based on the numbers. Maybe you missed the part where I asked for other benchmarks for my edification in the Geekbench post (#58). Later in the thread someone posted an article pointing to GFXBench. Maybe you also missed the post where I took the time to go look at it, where the comparison did not look favorable to the M3 either, and again I asked for feedback on what I saw (#66) at the GFXBench site (I even provided a link).

If you had read any of my other posts (#5, for example) you’d realize I don’t even have a dog in the GPU fight because I stopped gaming on Macs years ago and I’m not likely to go back. Doesn’t mean I am not curious about the state of the technology. Be mad if you want. I’m not. I don’t have to wait to play the latest games with friends.

edit: wrong post #

I’m not mad. I thought it was strange to see someone who's been on MacRumors for 19 years not knowing about the right benchmarking tools for the Mac, but if you left Mac gaming years ago maybe it's not that strange. At the same time, you say you've been curious about the state of Mac technology, so it's still strange that you haven't heard of other tools in all these years.

Anyway, I saw your comments about GFXBench and have to say you're looking at it the wrong way. That comparison feature is not quite accurate because it's missing some details. When you compare GPUs you always have to look at the offscreen results for a correct comparison. Offscreen tests benchmark the GPU alone, regardless of what specs your monitor/screen has. That's the true measurement. Onscreen tests are affected by the screen resolution and refresh rate, among other things, which can cause confusion and inaccuracy. As Ars Technica writes:

”GFXBench offers two different types of tests: "onscreen" tests and "offscreen" tests. There has been some confusion about the two in the past, but the difference should be fairly easy to understand. Onscreen tests run at the native resolution of the device's display panel, which tells us how good a given GPU is at driving graphics on a particular display. If you have one laptop with a 1080p screen and one with a 4K screen and both are using the same model of GPU, the 4K system is going to score significantly lower in the onscreen tests because that GPU is pushing more pixels. Offscreen tests render the scenes at 1080p on every device regardless of the screen's resolution, which puts all the GPUs on even footing so that we can definitively say "all else being equal, GPU X is better than GPU Y."

You can also see this when Anandtech compares phones.

Here is my own comparison. I have a Mac Studio M1 Max with an external monitor. I get around 60 fps onscreen at 60 Hz; if I change the refresh rate to 30 Hz, I get only 30 fps onscreen.
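In other words, onscreen results are vsync-bound. A one-line model of exactly what I'm seeing (the 150 fps capability figure is an assumption for illustration):

```python
def onscreen_fps(gpu_fps: float, refresh_hz: float) -> float:
    # Onscreen runs are capped by the display: you can't show more frames
    # than the panel refreshes, no matter how fast the GPU renders.
    return min(gpu_fps, refresh_hz)

gpu_offscreen_fps = 150.0  # assumed raw capability
print(onscreen_fps(gpu_offscreen_fps, 60))  # 60.0 -> matches ~60 fps at 60 Hz
print(onscreen_fps(gpu_offscreen_fps, 30))  # 30.0 -> matches ~30 fps at 30 Hz
```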

[screenshots attached]


What's not shown in your comparison is the screen resolution of the laptops. The M3 Max panel is 3456 x 2234 while the 4080 laptop's is 2560 x 1600, so the MacBook screen has almost double the pixels (1.885x).
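The pixel math behind that 1.885x figure, and what it implies for onscreen fps:

```python
mbp_pixels = 3456 * 2234  # 7,720,704
g18_pixels = 2560 * 1600  # 4,096,000
print(f"{mbp_pixels / g18_pixels:.3f}x")  # 1.885x the pixels

# Crude expectation: at native res with equal GPU throughput, the MacBook
# would land near native_fps / 1.885 onscreen. Fps rarely scales perfectly
# with pixel count, so treat this as a rough bound, not a prediction.
```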

[screenshots attached]


Another result missing from that comparison is the scores for 4K Aztec Ruins High Tier Offscreen. The M3 Max and mobile 4080 get essentially the same result of around 200 fps, with the M3 Max 0.6 fps faster. Even in Aztec Ruins High Tier Offscreen the M3 Max is close to the 4080, with 440 vs. 464. So Notebookcheck is not in fact ”leaving out the details in order to claim performance is close”.

[screenshots attached]
 
Please provide the comparison with the full M3 Max w/ uncapped frame rates. It doesn't seem one exists on YouTube or elsewhere that I can find. So far you've repeatedly made your case with compromised data and results.

That ASUS ROG Strix G16 4060 has several settings set lower than the base 30-core M3 Max.

[screenshots attached]
 
Not one of the PC gaming laptops has speakers in the same class as the MacBook Pro, nor the ability to play games and actually hear the sound over the RTX 4080 fan noise as it tries to cool. Add in a 13xxx CPU and you NEED headphones. It's worth sacrificing a few fps for quiet gameplay and superior audio.
 
That ASUS ROG Strix G16 4060 has several settings set lower than the base 30-core M3 Max.

Some results posted here: "With every single setting maxed out and running at native res on the 16 inch MBP, I'm getting a solid 90fps"

Thank you for chiming in and presenting what appears to be a good deal of reasoning and solid data. I started with an open mind to the possibility that some of the more optimistic benchmarks were too flawed to be considered, but enough bad data and misunderstandings have been presented from the pessimistic viewpoint that it's hard to take seriously.

Yes, many games will be poorly optimized, and buying a Mac right now for the purpose of gaming is perhaps not the best idea, no argument there. But the raw power is there, and the situation is good enough to have a great experience with some landmark titles. Things can only go up from here. I'm playing 2077 w/ CrossOver and getting 60-70fps @ 2056x1329 w/ most things maxed, which is wild to me. Currently downloading RE4 and looking forward to trying it out later today.

One other interesting datapoint regarding GPU power: my M3 Max is on par w/ desktop 4090 LLM inference speed, which is definitely surprising. It's not a question of being RAM limited either; even 7G models run at the same ~40 tok/sec sustained.
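That tracks with single-stream LLM inference being memory-bandwidth-bound: each generated token has to read roughly the whole set of weights. A crude model (the bandwidth, model size, and efficiency figures are assumptions, not measurements):

```python
def est_tok_per_sec(bandwidth_gb_s: float, weights_gb: float,
                    efficiency: float = 0.5) -> float:
    # Decode speed is capped near bandwidth / model size; the efficiency
    # factor is a fudge for everything that keeps you off the peak.
    return bandwidth_gb_s * efficiency / weights_gb

# M3 Max: ~400 GB/s unified memory; a ~5 GB quantized model (assumed):
print(f"~{est_tok_per_sec(400, 5):.0f} tok/s")  # ~40 tok/s, in the ballpark
```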
 
Some results posted here: "With every single setting maxed out and running at native res on the 16 inch MBP, I'm getting a solid 90fps"

Thank you for chiming in and presenting what appears to be a good deal of reasoning and solid data. I started with an open mind to the possibility that some of the more optimistic benchmarks were too flawed to be considered, but enough bad data and misunderstandings have been presented from the pessimistic viewpoint that it's hard to take seriously.

Yes, many games will be poorly optimized, and buying a Mac right now for the purpose of gaming is perhaps not the best idea, no argument there. But the raw power is there, and the situation is good enough to have a great experience with some landmark titles. Things can only go up from here. I'm playing 2077 w/ CrossOver and getting 60-70fps @ 2056x1329 w/ most things maxed, which is wild to me. Currently downloading RE4 and looking forward to trying it out later today.

One other interesting datapoint regarding GPU power: my M3 Max is on par w/ desktop 4090 LLM inference speed, which is definitely surprising. It's not a question of being RAM limited either; even 7G models run at the same ~40 tok/sec sustained.
That thread is confusing. Does RE4 come with MetalFX turned on by default? I saw only one post that mentioned they got nearly half the framerate with MetalFX off.
 
I'm not 100% sure, but I remember having to toggle “MetalFX Upscaling” on from its default of off.
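Which would also explain the "nearly half the framerate with it off" report: upscaling renders fewer internal pixels. A rough sketch, with the scale factor and overhead as pure assumptions:

```python
def fps_with_upscaling(native_fps: float, axis_scale: float = 0.7,
                       efficiency: float = 0.9) -> float:
    # axis_scale: assumed internal-to-native resolution ratio per axis
    # (0.7 per axis => ~half the pixels, a quality-mode-like guess).
    # efficiency: fraction of the theoretical gain left after the cost
    # of the upscaling pass itself -- also an assumption.
    pixel_ratio = axis_scale ** 2
    return native_fps / pixel_ratio * efficiency

print(f"{fps_with_upscaling(30):.0f} fps")  # ~55 fps from a 30 fps native baseline
```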
 