> I'd assume Apple would be the barrier, no?

Yes. Apple has a closed ecosystem... so unless they decide to support it, it's a dead end.
> It's a very very VERY niche market to have JUST a dedicated gaming system. People would rather spend $500 on a console than $1,500+ on a computer JUST to game. I have a $2,000 custom built system that I game on but I also work on it too.

Not true. There is a huge difference between console gamers and desktop gamers. A desktop gamer will spend a fortune on a rig without even thinking about it. Consoles are nowhere near as powerful or customizable, and are often viewed as inferior to a desktop (PC).
A desktop gamer that spends a fortune on a rig without even thinking about it is the very definition of a niche market.
True.
Conversely, those high-end gamers tend to keep their cards for a generation or two, as it takes a strong upgrade to make switching worthwhile. Folks with a 1080 Ti had no real reason to upgrade to a 2080 Ti (let's ignore DLSS and ray tracing), as the rasterization performance wasn't really that much faster for the money. Those same 1080 Ti folks would have jumped to the 3080 Ti to get double the framerate (again, rasterization) and much better DLSS/RT performance.
I just had a look at Steam's hardware survey, and the total number of RTX 30xx and RX 6xxx cards (the new generation of graphics cards from Nvidia and AMD) in the hands of gamers/miners is on the order of four million.
The number of PS5 and Xbox Series X|S consoles sold since mid-November is roughly 4-5 times as high. For the next fiscal year, Sony predicts 24 million PS5s will be sold, and even if Microsoft is only half that, we are still looking at a HUGE disparity in high-end console vs. gaming PC systems.
As I prefer gaming on computers vs. mains-powered consoles, this is not a good thing. But the numbers are very clear. Again, though, the people buying those kinds of PC rigs are probably more accurately described as PC tech/shopping enthusiasts than as people who are mostly interested in playing games.
Another comparison point: Apple sold 6 million M1 Macs during the first quarter of 2021 alone.
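Putting the figures quoted above side by side, the back-of-the-envelope math looks something like this (all numbers are rough approximations from this thread, not authoritative sales data):

```python
# Back-of-the-envelope install-base comparison using the rough figures
# quoted in this thread. All numbers are approximations, not official data.

new_gen_pc_gpus = 4_000_000               # RTX 30xx + RX 6xxx per Steam's survey
new_gen_consoles = 4.5 * new_gen_pc_gpus  # PS5 + Series X|S, "4-5 times as high"

ps5_forecast = 24_000_000           # Sony's prediction for next fiscal year
xbox_forecast = ps5_forecast // 2   # assuming Microsoft sells half as many
m1_macs_q1_2021 = 6_000_000         # M1 Macs sold in Q1 2021 alone

print(f"consoles vs. new-gen PC GPUs today: {new_gen_consoles / new_gen_pc_gpus:.1f}x")
print(f"forecast consoles next FY: {ps5_forecast + xbox_forecast:,}")
```

Even under these loose assumptions, the console install base dwarfs the new-generation PC GPU base by a wide margin.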
First off, I own a rather capable gaming PC myself, so anything I say that can be construed as an insult against that group applies to me as well.
AMD hadn't been playing in the high-end GPU space for a while, but they also supplied the "whole kit" for the 8th-gen consoles, and we know where those lie on the console-vs-PC power spectrum.
The average user does not want to deal with Windows, worry about security issues, install the constant "game ready" drivers released with every game to make it perform well, or buy additional hardware like controllers.
A decent gaming system that really beats the M1 Mac's performance runs about $1,000, while a console (ignoring the current shortage; I have been trying to get a PS5 for months now) is around $500.
The PC market would like to have a word with you... they have been dealing with Windows since inception. They still have the largest market share in the world. Who do you think drives the whole GPU hardware market anyways? Certainly NOT Apple.
In a nutshell: even if Apple could make the most powerful computer on the planet and sell it for $1, unless the game companies decide to write games for it, it really doesn't matter. Macs have come a long way over the years; hell, the initial transition to Intel was the one moment in time when writing a game for both platforms was the easiest it was ever going to be... and yet, NOTHING came of it.
You think Microsoft making an ARM version of Windows is going to change anything? It's not. You will still have two companies using their own proprietary graphics subsystems to keep each other's worlds apart. And all those game companies making games for Windows will continue to do just that, and Apple will still be exactly where it is today in terms of games support on the desktop. Which is exactly where they want to be.
Nothing is more niche than a gamer with a Mac. Nothing.
As someone with a $2,500 custom built PC, it's frustrating getting games running properly. On a console, you literally start up the game and you're good.
I don't need to constantly mess with drivers, tweak game settings to make games run well, or deal with hooking up my Xbox Elite controller for games that don't support it 100%.
We are the niche market here. My brother would rather get a console than build his own PC (to optimize the cost).
I haven't had such problems since the 90s. As long as you don't try anything released in the past couple of months (as you should do with all software and hardware anyway), things work just fine. Old games sometimes develop issues, because the hardware and the OS are too different from what they were designed for, but there are usually easy workarounds for them.
Why do you "constantly mess with drivers"? Drivers are updated regularly like any software. This isn't a hassle; it is just a matter of doing a normal install. And what games don't support standard controllers? The Elite controller is no more or less compatible with games than a standard Xbox controller.
I'm not going to say PC gaming is just as easy as console, but it isn't difficult either.
Yeah, Nvidia isn't going to make the mistake of releasing an architecture as good and long-lived as Pascal again.
Strictly speaking, if different companies make their own SoCs, then they'll have to provide drivers for their own integrated GPUs. And on the desktop we have at least three GPU companies now: Intel, AMD, and Nvidia.
And the general user won't install new drivers every time a game comes out and NVIDIA releases its "game ready" drivers. I have seen some people's gaming systems. They complain a brand new game won't run well, yet they have a driver from 2020.
There are many games that don’t fully support controllers on Steam.
Consoles solve all of this for people like my brother.
Console Cyberpunk would like to have a word with you, lol.
It runs well on my brother's Series X.
I have it on the PS5. I also know the PC version is superior, and that we may eventually get a next-gen patch that will bring the XSX/PS5 version in line with the PC version.
And that is the whole point. Gaming is not a niche, but WE are the niche market. The general user would rather get a PS5 or Xbox Series X than a gaming PC. To those gamers, running at 1080p 30 FPS is not a big deal. But to me, and maybe you? I definitely prefer 144 FPS when possible at 2560x1440.
At one point, I had a Power Mac G5 (2003ish) and a nondescript, run-of-the-mill PC. They both had the same graphics cards in them (that I installed), and both ran Unreal Tournament. This was before Boot Camp, and there was a Mac version of UT. In comparison tests, the game on the PC ran at double the frames compared to the Mac: 70 fps vs. 35 fps.

When Apple came out with eGPU support, I saw a dim light at the end of the tunnel. I could dual boot my Mac and run a decent graphics card, even if there was a slight loss in frames from Thunderbolt.
Now with Apple Silicon, eGPU support has been dropped, and you can no longer dual boot and run Windows natively. So now, if you want to game, you have to separately maintain another system or settle for Apple Arcade (yawn). Right now, I maintain a PC for gaming. I pull out a separate keyboard, mouse, and audio interface each time I want to casually play. Each time I use a Microsoft product, I want to hang myself, and it makes me appreciate the Apple ecosystem more.
Gaming isn't just for nerds anymore. It's a common form of entertainment. The phrase "Macs don't game" really needs to die. Hopefully Apple Silicon can produce some killer GPUs to persuade gaming studios to develop their games on this platform.
This is my dream.
> I was concerned that the "big two" for this year, namely Baldur's Gate 3 and Pathfinder: Wrath of the Righteous, wouldn't have Mac versions, because it wasn't mentioned when these games were announced. When the actual system requirements were released, sure enough, both have Mac versions planned for release alongside the Windows version. I would note that Owlcat dropped Linux for Wrath, so the Mac must be profitable for them, while Linux is not. For at least some developers, there is a financial reason to have Mac support.

Wrath of the Righteous will have no Linux support? That's kind of a bummer. I've enjoyed the heck out of Pathfinder: Kingmaker, whether I was playing on Mac, Linux or Windows. Perhaps this decision was due to the ease of getting Windows games to run under Steam and Proton, or even Lutris.
> No, the Mac Pro is a very very good example; it's you, you do not see the point. The point is Apple is not interested in making a gaming computer.

Not necessarily. Apple is mainly interested in making sleek, portable notebooks; that's roughly 70% of the Mac user base. Since MacBooks largely are not gaming laptops, developers don't want to spend resources on a sub-par experience. Apple would be glad to support AAA games, but to do that they would have had to compromise on the design of their notebooks, which they would never do. So it's not that they aren't interested; they just prioritize aesthetics over gaming. Seeing how successful they are, it seems like a good decision.
> hence the point of Apple putting barriers in place to prevent people thinking their machines can be a dedicated gaming machine

I don't believe Apple is intentionally putting barriers in place so people can't game unless they buy a $5,000 computer. The Mac Pro was never designed as a gaming computer anyway; it's a workstation. Sure, you can game on it, but Tim Cook isn't sitting in some board room going "ahhh, they want gaming Macs, ay? Ahhh, let's force them all to buy Mac Pros then! Mwahahaha," as if it was some conscious decision to limit gaming on MacBooks.