
Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
Any Intel Mac can benefit from an eGPU since TB3, so they are very capable for gaming
Sure, but just because you can doesn’t mean the majority of Mac users are buying eGPUs on top of their expensive MacBooks just so they can play games. My whole point is that, by and large, the majority of Mac users have not had access to hardware capable of playing modern AAA games, due to Intel’s integrated graphics. Whether they *could* via an eGPU is irrelevant to what people have actually done.
 

VArase

macrumors regular
Feb 1, 2008
114
60
Chicagoland
I disagree. The current Macs lack the 8.6-pound combination of cheap polycarbonate plastics, color-changing LEDs around the venting and on the lid, Transformer-like industrial design, letter typefaces on the keys that appear to have been done by a freshman design student, and various decals - such as a dragon, a cool Hot Wheels-like flame, or everybody's favorite - who wouldn't want "Predator" written on their computer?

Fact is, it's been this way since day one and is NEVER going to change: if you want a gaming PC, go Windows. If you want to game at home on a Mac, the best solution is purchasing a $300 Xbox Series S. If you want to game on the go, get the $250 Switch and bring it along with you. If you want to game occasionally, but the primary reason you have a device is to make money or just do computer things, Macs will do just fine.

Macs have never, will never, and can never be as capable at gaming as those three options.
Oh, I dunno.

The reason I plan to keep my 2020 iMac 5K (Core i9-10900, Radeon Pro 5700 XT 16 GB, 128 GB RAM, 10 Gb Ethernet, 4 TB SSD) is that it games pretty well and gets my work done - I can run Mass Effect Legendary Edition with 4K assets at 5K resolution booted into Windows.

I'm also betting that when I get my M1X 16" MacBook Pro (32 GB RAM, 32-core GPU, 2 TB SSD) I'll be able to run ARM Windows under Parallels and game x86 at full resolution and pretty much full speed, using Windows' on-the-fly x86-to-ARM code translator (like macOS's Rosetta 2).

Both machines should run AAA game ports pretty well too, since new ports should start showing up as universal binaries - once Steam gets off their tush and produces an Apple Silicon compatible client.
 

theorist9

macrumors 68040
May 28, 2015
3,879
3,059
You make it sound like Apple has cobbled together their SoCs and is blowing away everyone else by accident, while the Apple Silicon team is probably, at this moment, the premier silicon design team in the world.
That's a bizarre misread of my question. I never said that, nor do I think that. Equally bizarre was the multi-paragraph non-sequitur you provided in response. [And, with all due respect, it was a bit obnoxious on your part to presume I needed such a lecture on Apple's history.] If you want to see reasonable replies to my questions, look at what Leman wrote.

I think what happened here is you didn't understand my question, completely misinterpreted it to mean "Apple sucks" (the opposite of my view), and then, in place of the on-point technical reply I was hoping for, posted a fluffy, and completely unnecessary, diatribe defending AS's design history.

Maybe you were reacting to my use of the phrase "stitched-together". But that was just me responding to Leman's post, where he used that phrase; I used the same phrase so he'd know what I was referring to. Leman didn't mean that pejoratively, and neither did I; it was just being used as a way to refer to modularity. You would have known that if you'd bothered to read the post to which I was replying before posting your diatribe. Not doing so is always a bad idea, since that causes you to miss essential context.

Find your own references - it's all public record.
LOL, it doesn't work like that. It would take hours to research your many claims, and why would I waste hours checking the claims of some random guy on the internet? Particularly since you're obviously not an expert on this subject (that's an observation, not a criticism; most of us aren't experts). How do I know you're not an expert? It's obvious from the fluffy way you write about the subject that you're not a CPU designer/hardware engineer. Plus the fact that you so completely misread my question doesn't help your credibility.

I don't always give references in my shorter posts, but if someone asks I'd never obnoxiously write "find your own references", since I understand both their value and the reasonableness of the request - it's part of the scientific culture I belong to. And generally if I post something longer, like what you did, I will do the work to provide the references, such as can be seen here:

Are you required to provide references? Of course not. No one's paying us to post here! But don't be surprised if those of us who are technically serious don't take you seriously if you don't (or at least refuse to provide them upon request).
 

theorist9

macrumors 68040
May 28, 2015
3,879
3,059
I don't think there is much mystery in this. Apple does not have hardware RT support, so it boils down to performance. You can use real-time RT on M1 in some circumstances (e.g. shadows on simpler geometries), but it's not nearly fast enough for demanding games.

The Metal RT API itself is solid. It seems feature-complete with DX12 Ultimate and it’s more capable for professional rendering applications (higher memory limits, built-in object animation for motion blur).
Are you saying that RT on the M1 is done in software via the Metal RT API (slower), rather than in hardware (faster)?

And is it expected that the "M1x" will have hardware RT support? Don't know much about RT, but from articles like this (https://www.pcmag.com/how-to/what-is-ray-tracing-and-what-it-means-for-pc-gaming) it sounds like Apple would need to offer hardware RT at some point if it wants to compete with the higher-end NVIDIA cards for gaming performance.
 

leman

macrumors Core
Oct 14, 2008
19,515
19,661
Are you saying that RT on the M1 is done in software via the Metal RT API (slower), rather than in hardware (faster)?

Apple G13 (used in M1) has no RT acceleration hardware. RT on all of current Apple GPUs is done via regular compute shader hardware.

And is it expected that the "M1x" will have hardware RT support? Don't know much about RT, but from articles like this (https://www.pcmag.com/how-to/what-is-ray-tracing-and-what-it-means-for-pc-gaming) it sounds like Apple would need to offer hardware RT at some point if it wants to compete with the higher-end NVIDIA cards for gaming performance.

It is very obvious that the Metal RT API is designed with hardware RT acceleration in mind. I expect Apple to release hardware capable of accelerating RT operations sooner rather than later. In fact, I'd be disappointed if the new Macs released this year don't have it.
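
To make that concrete, here's a minimal host-side sketch of the Metal RT flow in Swift - my own illustration, not Apple sample code. It assumes macOS 11+/Metal 2.3, and the one-triangle vertex buffer is just placeholder data. The point is that the acceleration-structure API is the same whether or not the GPU has RT hardware; on current Apple GPUs, the ray traversal this feeds simply runs as ordinary compute work.

```swift
import Metal

// Build a primitive acceleration structure over placeholder geometry.
let device = MTLCreateSystemDefaultDevice()!
precondition(device.supportsRaytracing, "Metal RT needs a supporting GPU (e.g. M1)")
let queue = device.makeCommandQueue()!

// One triangle, tightly packed float3 positions (placeholder data).
var vertices: [Float] = [0, 0, 0,  1, 0, 0,  0, 1, 0]
let vertexBuffer = device.makeBuffer(bytes: &vertices,
                                     length: vertices.count * MemoryLayout<Float>.size,
                                     options: [])!

let geometry = MTLAccelerationStructureTriangleGeometryDescriptor()
geometry.vertexBuffer = vertexBuffer
geometry.vertexStride = 3 * MemoryLayout<Float>.size
geometry.triangleCount = 1

let descriptor = MTLPrimitiveAccelerationStructureDescriptor()
descriptor.geometryDescriptors = [geometry]

// The device reports how much memory the structure and build scratch need.
let sizes = device.accelerationStructureSizes(descriptor: descriptor)
let accel = device.makeAccelerationStructure(size: sizes.accelerationStructureSize)!
let scratch = device.makeBuffer(length: sizes.buildScratchBufferSize,
                                options: .storageModePrivate)!

// Encode the build on the GPU timeline.
let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeAccelerationStructureCommandEncoder()!
encoder.build(accelerationStructure: accel,
              descriptor: descriptor,
              scratchBuffer: scratch,
              scratchBufferOffset: 0)
encoder.endEncoding()
commandBuffer.commit()

// A compute kernel can now traverse `accel` with an MSL intersector; with
// future RT hardware, that same API should map onto dedicated units.
```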
 

theorist9

macrumors 68040
May 28, 2015
3,879
3,059
Apple G13 (used in M1) has no RT acceleration hardware. RT on all of current Apple GPUs is done via regular compute shader hardware.

It is very obvious that the Metal RT API is designed with hardware RT acceleration in mind. I expect Apple to release hardware capable of accelerating RT operations sooner rather than later. In fact, I'd be disappointed if the new Macs released this year don't have it.
I now recall your post from a year ago, in which you speculated that TBDR might offer a unique synergy with RT:

"What I am trying to say is that because of their tiled deferred architecture AND in combination with Imagination IP, Apple could be in a unique position in regards to high-performance ray tracing." (https://forums.macrumors.com/thread...ide-alus.2258523/?post=28984519#post-28984519)

Now that a year has passed, what's your current thinking on this?
 

leman

macrumors Core
Oct 14, 2008
19,515
19,661
I now recall your post from a year ago, in which you speculated that TBDR might offer a unique synergy with RT:

"What I am trying to say is that because of their tiled deferred architecture AND in combination with Imagination IP, Apple could be in a unique position in regards to high-performance ray tracing." (https://forums.macrumors.com/thread...ide-alus.2258523/?post=28984519#post-28984519)

Now that a year has passed, what's your current thinking on this?

Well, I was hoping to see some actual hardware by now, but overall, not much has changed IMO. Mind you, what I wrote back then is just speculation. I don't have a good enough understanding of the memory access patterns in RT, or of how the hardware coalescing works, to offer concrete insights on these matters.
 

dapa0s

macrumors 6502a
Jan 2, 2019
523
1,032
I don't think the M1X will be good for gaming, and I think people will still need dedicated PCs for PC gaming, just like before, if not worse this time. Time will tell...
 

Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
I don't think the M1X will be good for gaming, and I think people will still need dedicated PCs for PC gaming, just like before, if not worse this time. Time will tell...
The M1 is already good enough for games. Plenty of processing power, and a GPU better than a GTX 1050 Ti. Cyberpunk's minimum specs only require a GTX 780. The issue for the M1 is not hardware. The M1X will be even more capable.

The issue is that devs currently don't find the Mac lucrative enough to invest in, because before the M1, 80% of all Macs sold couldn't play most AAA games even if they were supported. Time will tell, though, now that the M1 can theoretically handle any AAA game today. The Mac market just needs to widely adopt Apple Silicon; until the majority of Mac users are on the new platform, it likely won't be lucrative for devs to even think about it.
 

VArase

macrumors regular
Feb 1, 2008
114
60
Chicagoland
That's a bizarre misread of my question. I never said that, nor do I think that. Equally bizarre was the multi-paragraph non-sequitur you provided in response. [And, with all due respect, it was a bit obnoxious on your part to presume I needed such a lecture on Apple's history.] If you want to see reasonable replies to my questions, look at what Leman wrote.

I think what happened here is you didn't understand my question, completely misinterpreted it to mean "Apple sucks" (the opposite of my view), and then, in place of the on-point technical reply I was hoping for, posted a fluffy, and completely unnecessary, diatribe defending AS's design history.

Maybe you were reacting to my use of the phrase "stitched-together". But that was just me responding to Leman's post, where he used that phrase; I used the same phrase so he'd know what I was referring to. Leman didn't mean that pejoratively, and neither did I; it was just being used as a way to refer to modularity. You would have known that if you'd bothered to read the post to which I was replying before posting your diatribe. Not doing so is always a bad idea, since that causes you to miss essential context.


LOL, it doesn't work like that. It would take hours to research your many claims, and why would I waste hours checking the claims of some random guy on the internet? Particularly since you're obviously not an expert on this subject (that's an observation, not a criticism; most of us aren't experts). How do I know you're not an expert? It's obvious from the fluffy way you write about the subject that you're not a CPU designer/hardware engineer. Plus the fact that you so completely misread my question doesn't help your credibility.

I don't always give references in my shorter posts, but if someone asks I'd never obnoxiously write "find your own references", since I understand both their value and the reasonableness of the request - it's part of the scientific culture I belong to. And generally if I post something longer, like what you did, I will do the work to provide the references, such as can be seen here:

Are you required to provide references? Of course not. No one's paying us to post here! But don't be surprised if those of us who are technically serious don't take you seriously if you don't (or at least refuse to provide them upon request).
I quite honestly do think you needed a history lesson.

Go back and reread your question - if you know anything about silicon design, you know the question was like asking whether the head of neurosurgery knows how to properly apply a Band-Aid.

Accessing ECC or HBM memory is a trivial task compared to putting together NVMe controllers for a phone, or producing all the ASICs that accelerate the common tasks handled by the A- or M-series SoCs ... or designing processor cores that span power budgets from a watch (like the adapted Thunder high-efficiency cores of the A13) to an eight-wide processor for a full-blown computer.

You talk the talk and know a lot of the technobabble, but lack a deeper understanding of the issues at play.

P.S. You want references for a summary of around 13 years of Apple Silicon evolution? Seriously?

Oh, and an aside: nothing about any well-balanced instruction set makes it inherently superior to any other. Fixed instruction lengths do make it easier for your decoders to probe more deeply into the incoming instruction stream, though, allowing you to build wider CPUs.

Apple probably went with ARM for this reason, plus the fact that Apple was an original stakeholder - well - back when ARM became ARM (after sprouting out of that little Acorn), and because Intel locks up its x86 IP tighter than the Coke and Colonel's recipes.

When memory became cheaper, the economies of CISC instruction sets became a lot less advantageous, and their variable-length instructions became a lot more of an impediment to optimization.
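
To illustrate that decode point, here's a toy Swift sketch (hypothetical byte stream and length rule, nothing like a real decoder): with fixed-length instructions every boundary in a fetch block is known up front, while variable-length decode has a serial dependency from one instruction to the next.

```swift
// Fixed 4-byte instructions: every boundary is known immediately, so
// N decode slots can all start in parallel on a fetched block.
func fixedLengthBoundaries(byteCount: Int) -> [Int] {
    Array(stride(from: 0, to: byteCount, by: 4))   // all independent
}

// Variable-length (x86-style) instructions: instruction i's start isn't
// known until instruction i-1's length has been decoded, so boundary
// finding is inherently serial - which is what makes very wide decoders hard.
func variableLengthBoundaries(bytes: [UInt8], lengthOf: (UInt8) -> Int) -> [Int] {
    var offsets: [Int] = []
    var i = 0
    while i < bytes.count {
        offsets.append(i)
        i += lengthOf(bytes[i])   // serial dependency on the previous decode
    }
    return offsets
}

// Made-up length rule: opcodes below 0x80 are 1 byte, everything else is 3.
let stream: [UInt8] = [0x10, 0x90, 0x00, 0x00, 0x20]
print(fixedLengthBoundaries(byteCount: 8))                            // [0, 4]
print(variableLengthBoundaries(bytes: stream) { $0 < 0x80 ? 1 : 3 })  // [0, 1, 4]
```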
 

Admiralbison

macrumors regular
May 23, 2021
130
131
Currently I am faced with a dilemma: which laptop to buy.
I need a laptop that I can run all my code on, and game on for fun.

I sold my gaming console because I realized I could get a laptop like the G15 or Blade 14 with a 5900HX, and it will probably perform better than my PS4 Pro.

But Apple is in my blood lol, so I am wondering if the new M1X-chipped MacBook Pros (14-inch and 16-inch) will be a good option for gaming.

I know Macs currently pretty much suck at playing games, but I still have hope, and I was wondering if anyone knows anything about it.
I’m with you in that I really just want one powerful laptop to do all my major things.
macOS for home personal and media editing use
Windows 10/11 for work
Linux for automation, coding, gaming etc.

There are videos on YouTube you can check out that benchmark and look at Win 10 gaming via Parallels and CrossOver on current M1 machines.

So far, running in a VM with x86 emulation/translation overhead, the performance on a few games isn't bad - mid settings at 30+ FPS - though this is mostly on the 16 GB M1 variant.

So with the M1X set to be powerful GPU-wise, and with Parallels 17 and Windows 10 (or Linux) improving on ARM-based chips, things are looking positive.

Personally though, I'm waiting on the M2X, when the M chips and ARM software have matured.
 

4743913

Cancelled
Aug 19, 2020
1,564
3,716
I’m with you in that I really just want one powerful laptop to do all my major things.
macOS for home personal and media editing use
Windows 10/11 for work
Linux for automation, coding, gaming etc.

This is why I bought the MacBook Pro 16 today. I will wait and see how the Mx MacBook Pros look in a couple of years. I want everything now, and the last of the Intel MacBook Pros is the way to get it.
 

VArase

macrumors regular
Feb 1, 2008
114
60
Chicagoland
I’m with you in that I really just want one powerful laptop to do all my major things.
macOS for home personal and media editing use
Windows 10/11 for work
Linux for automation, coding, gaming etc.

There are videos on YouTube you can check out that benchmark and look at Win 10 gaming via Parallels and CrossOver on current M1 machines.

So far, running in a VM with x86 emulation/translation overhead, the performance on a few games isn't bad - mid settings at 30+ FPS - though this is mostly on the 16 GB M1 variant.

So with the M1X set to be powerful GPU-wise, and with Parallels 17 and Windows 10 (or Linux) improving on ARM-based chips, things are looking positive.
I'm with you.

The M1X should double or quadruple the M1's graphics power, which is what seems to be bottlenecking the M1's speed (gaming-wise).
Personally though, I'm waiting on the M2X, when the M chips and ARM software have matured.
Unfortunately, if the M2X follows M1X core counts, the improvement (if year-over-year trends hold) will probably be up to about 20%.

The M3X has the possibility of being a bit better, as it involves a die shrink to 3 or 4 nm based on A16 cores.

The quantum leap comes from the transition from Intel to Apple Silicon, but you can't expect that improvement every year.
 

NotTooLate

macrumors 6502
Jun 9, 2020
444
891
Apple should hopefully put some money in AAA developers' pockets. It would be great to launch the new computers with some nice games alongside them to showcase the capabilities of the hardware. They put $70M into buying a movie for the streaming service, and billions overall, so it's not that they can't put some capital behind gaming IF they think it's worthy. Then again, with the way they're getting the spotlight for everything bad in the world, I can imagine the headlines: "Apple brings a game that has a black man shooting cops; people told The Verge that a class action is on the way for Apple to remove a game called GTA."
 

TrueBlou

macrumors 601
Sep 16, 2014
4,531
3,619
Scotland
Unless Apple pull their fingers out and start convincing studios to make games for macOS, it won’t matter if it’s the most powerful gaming platform in the world, or the worst. There will be very little to take advantage of it anyway.

Personally, I got out of the PC gaming race when I switched to Apple a long time ago. I got tired of spending a fortune every year on the latest and greatest to get the best from the games - not that I was forced to, obviously.

These days, I’m happy with my console collection for the bulk of my gaming and I never actually gave up completely on PC gaming. For years now I’ve used GeForce Now (and other means before it) and despite its occasional limitations, I’m actually really happy with it. That along with Parallels, Crossover and DOSBox-X for the old DOS and Win95 games. Even though I still have a PC, it’s relegated to second server duties these days.
 

TSE

macrumors 601
Jun 25, 2007
4,030
3,544
St. Paul, Minnesota
Unless Apple pull their fingers out and start convincing studios to make games for macOS, it won’t matter if it’s the most powerful gaming platform in the world, or the worst. There will be very little to take advantage of it anyway.

Personally, I got out of the PC gaming race when I switched to Apple a long time ago. I got tired of spending a fortune every year on the latest and greatest to get the best from the games - not that I was forced to, obviously.

These days, I’m happy with my console collection for the bulk of my gaming and I never actually gave up completely on PC gaming. For years now I’ve used GeForce Now (and other means before it) and despite its occasional limitations, I’m actually really happy with it. That along with Parallels, Crossover and DOSBox-X for the old DOS and Win95 games. Even though I still have a PC, it’s relegated to second server duties these days.

Smart man. PC Gaming is uneconomical. People need to just get a console and be happy. The amount of time, money, and energy people spend on getting the latest and greatest hardware is just not logical.
 

Kpjoslee

macrumors 6502
Sep 11, 2007
417
269
Smart man. PC Gaming is uneconomical. People need to just get a console and be happy. The amount of time, money, and energy people spend on getting the latest and greatest hardware is just not logical.

Actually, PC gaming represented pretty decent value until the mining craze jacked up GPU prices. There's no need to buy the latest and greatest hardware for PC gaming; that only applies to the few PC enthusiasts who upgrade every time new hardware comes out.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
Smart man. PC Gaming is uneconomical. People need to just get a console and be happy. The amount of time, money, and energy people spend on getting the latest and greatest hardware is just not logical.
PC gaming is quite cheap if you are happy with a good enough system. (Everyone who buys sub-$3000 Macs should be like that.) As long as developers still target previous-generation consoles, games will be designed for 1080p resolution. You don't need a fancy GPU or an expensive monitor, and you only get slightly better graphics if you have them.

Eventually there will be games designed for 1440p or 4k resolution. We may start seeing them next year, as many expected titles are dropping the support for previous-generation consoles. In order to experience them as intended, you'll need a GPU that exceeds the capabilities of current-generation consoles: a high-end card from 2018 or a mid-range card from 2020. Not too bad for playing new games released in 2022. And because the time between console generations is usually ~7 years, those GPUs will be good enough for a long time.

Ultimately the choice between a PC and a console should be based on the games you play. Civilization just works better with a mouse, and sandbox games such as Kerbal Space Program are nothing without mods. And if you play RPGs, the chances are many important bug fixes are only available as mods, because the developers usually stop fixing them soon after the release.
 

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
Smart man. PC Gaming is uneconomical. People need to just get a console and be happy. The amount of time, money, and energy people spend on getting the latest and greatest hardware is just not logical.
Many hobbies aren't particularly economical. It depends on what you want. If you enjoy building gaming PCs and spending time deciding on upgrades, etc., then economics really doesn't enter the equation. Does it really matter if your game runs at 150 fps or 220 fps? Maybe if you are a professional gamer and make money; for everyone else it is bragging rights.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Smart man. PC Gaming is uneconomical. People need to just get a console and be happy. The amount of time, money, and energy people spend on getting the latest and greatest hardware is just not logical.
Logical would be putting every spare penny into investments. No hobby is logical; it just keeps you sane for a little while longer.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Does it really matter if your game runs at 150 fps or 220 fps?
PCMR e-peen depends on it.

Also, to be pedantic, I did napkin math a while back and determined the point of diminishing returns in fps is about 144-200 Hz, visually speaking.

If it's an engine with logic locked to fps, then human reaction time (being faster than visual acuity) puts the fps "ceiling" at about 300-something Hz.

So if you’re playing a game that relies heavily on reflexes, having extremely high fps is beneficial.

That said, some engines start bugging out at high fps, so ymmv.
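
For concreteness, here's that napkin math in Swift - the refresh-rate steps and the ~150 ms visual reaction-time figure are rough assumptions on my part, not measurements:

```swift
import Foundation

// How many milliseconds each refresh-rate jump actually shaves off a frame,
// compared against a rough ~150 ms visual reaction time.
let rates: [Double] = [60, 144, 240, 300]   // Hz
for (slower, faster) in zip(rates, rates.dropFirst()) {
    let savedMs = 1000.0 / slower - 1000.0 / faster
    print(String(format: "%3.0f -> %3.0f Hz saves %5.2f ms/frame", slower, faster, savedMs))
}
// Prints:
//  60 -> 144 Hz saves  9.72 ms/frame
// 144 -> 240 Hz saves  2.78 ms/frame
// 240 -> 300 Hz saves  0.83 ms/frame  <- tiny next to ~150 ms reaction time
```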
 

diamond.g

macrumors G4
Mar 20, 2007
11,433
2,655
OBX
PCMR e-peen depends on it.

Also, to be pedantic, I did napkin math a while back and determined the point of diminishing returns in fps is about 144-200 Hz, visually speaking.

If it's an engine with logic locked to fps, then human reaction time (being faster than visual acuity) puts the fps "ceiling" at about 300-something Hz.

So if you’re playing a game that relies heavily on reflexes, having extremely high fps is beneficial.

That said, some engines start bugging out at high fps, so ymmv.
It isn't just high framerates; it is also being able to turn on all the effects and ultra details (after all, the developers/artists spent time on those assets) without turning the game into a stuttery mess.
 

4743913

Cancelled
Aug 19, 2020
1,564
3,716
Actually, PC gaming represented pretty decent value until the mining craze jacked up GPU prices. There's no need to buy the latest and greatest hardware for PC gaming; that only applies to the few PC enthusiasts who upgrade every time new hardware comes out.

This is correct. Two years ago I bought a 2013 Dell OptiPlex from Woot for $400 and put a $150 GTX 1650 in it. It required no PSU upgrade and has served me well.
 

SupremeMayo

macrumors regular
Sep 7, 2021
127
114
I would just get a console - no need to worry about upgrading every year or two. That's the route I went anyways, so I may be biased. Even if the M1X does have the graphics capability to run games, which it most likely will, there isn't enough support for macOS gaming to run them. Get a PS5 or Series X and you'll be golden; plus, consoles are much more portable than PCs.
 