Fascinating. So a badly optimised game that gets 5 fps represents all games on a platform?
It shouldn't. But it seems to work fine the other way around. 🤷‍♂️ When a buggy Windows game comes along, it's "PC gaming sucks, it's nothing but bugs. It's a great thing we don't get games on the Mac, so we don't have these! We'd rather have no games at all than any of that crap". And don't forget the hardware: "But on a PC it requires a 40x0 card to run properly, so that doesn't count! No one is buying those super expensive cards, they're just a tiny, tiny fraction of the market!!!11!!". Yet when a game runs poorly on a Mac, all we hear is "just buy a maxed-out Ultra, then it works!" (which costs about $6k). And then Fort Solis comes to the Mac and all of a sudden this single title means "Mac gaming is thriving, absolute proof that Mac gaming isn't dead. Developers love the Mac and this shows it!".

So which is it? Do we judge based on one or a few examples, or on the full market? Or do we stick with "arguments are only valid when they benefit the Mac, and no criticism allowed"?
Sometimes people sure create more problems than they solve. When a game runs at 5 fps on platform A but at 60+ fps on platform B, I just buy platform B and play it there.
I think some people here will really wet their pants after WWDC if Apple manages to show the results of their latest shopping spree in time.