...they literally said BG III didn’t drop a SINGLE frame.
Looked like some real low frame rate in the footage to me.
Reviving my thread now that we’ve had the event. What do you all think now that we’ve seen what Apple has up its sleeve? Personally, I don’t see the beginning of a new era for gaming on the Mac. I was expecting a much bigger emphasis on GPU performance in the event, but all we got was a handful of mobile games and BG III, all of which looked like they were running at very rough frame rates.
What do I think? I think it confirms what I was thinking all along. We now have a 10W, passively cooled, $999 ultra-thin laptop with the performance of an entry-level discrete GPU. This will make the Mac a much more attractive target for game development.
Speaking of that BG3 demo... they played more of it in a dev tech talk, and it was running at higher settings with what appeared to be better performance than I can get out of my 5500M. Performance promises to be surprisingly good for games that are optimized for Apple's GPU, but it should be more than decent even for older Intel-based games.
I wouldn’t read too much into the performance differences, since we don’t know what build Apple has versus what we have in Early Access. Larian has said that over time the requirements will probably be reduced from what currently shows on Steam/GOG.
I know, but still. I am running it at med-high settings at 1680x1050 and it still stutters here and there. The M1 demo is running at 1080p and high settings, which is very impressive for a 15W chip.

I run it at the same settings on my 460 and see similar stuttering, which leads me to believe it isn’t the GPU or CPU but asset loading causing the hitching, which is silly because we are all on NVMe storage. At this point I don’t understand why we still have load times longer than a couple of seconds on any Mac or iOS device. [Series S/X and PS5 load times are perception-changing.]
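To illustrate what I mean by asset loading hitching the frame, here's a minimal Swift sketch of prefetching assets on a background queue so the render thread never blocks on disk I/O. The AssetCache type and everything in it are hypothetical, just to show the idea:

```swift
import Foundation

// Hypothetical sketch: preload assets on a background queue so the render
// thread never blocks on disk I/O, the usual cause of frame hitches even
// on fast NVMe storage.
final class AssetCache {
    private var cache: [String: Data] = [:]
    private let lock = NSLock()
    private let ioQueue = DispatchQueue(label: "asset-io", qos: .userInitiated)

    // Kick off loads for assets we expect to need soon (e.g. the next area).
    // Usage: cache.prefetch(nextAreaURLs) when the player approaches a door.
    func prefetch(_ urls: [URL]) {
        for url in urls {
            ioQueue.async {
                guard let data = try? Data(contentsOf: url) else { return }
                self.lock.lock()
                self.cache[url.lastPathComponent] = data
                self.lock.unlock()
            }
        }
    }

    // Non-blocking lookup for the render thread: a miss returns nil
    // instead of stalling the frame while the file loads.
    func asset(named name: String) -> Data? {
        lock.lock()
        defer { lock.unlock() }
        return cache[name]
    }
}
```

Nothing Mac-specific here; the point is just that a fast disk doesn't help if the engine only asks for the data mid-frame.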
...the video embedded in a web stream “looked” low frame rate to you, while the *developer* of the game can be heard in the voiceover talking about the buttery smooth and astounding playability on that chip. OK?
Ultra settings, they say, in this video at 6:45: https://developer.apple.com/videos/play/tech-talks/10859/
What I found telling was how few well-known companies were highlighted during the presentation. I truly don't know half of them, nor can I remember who they were.

I'm guessing the big players (if they were watching at all) were skeptical of what Apple could do, and I highly doubt Apple reached out to them in the first place.
Guys, this is gonna be dumb AF to ask, but would an Apple Silicon Mac be able to run a lower-spec, PC-only Steam game, like Persona 4 Golden? By doing some Rosetta witchcraft or something. I'm incredibly stupid, I know.

No. According to Steam, there is no Mac version of that game.
I'd say no: if the Steam page says no Mac support, then there's no Mac support. Actually, even if the Steam page says there's Mac support, there's no guarantee there's Mac support. (Valve doesn't do quality control.)
I do wonder if they'll ever release their own discrete GPUs; maybe that wouldn't make sense. Perhaps they'll roll support for AMD on the higher end.

While M1 looks really good for its category, its TFLOPS are only about a tenth of what is out there at the very high end (extremely high budget and power, I know, but for those who need it / want it / can afford it).
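For rough context on that "a tenth" figure, here's the back-of-the-envelope peak-FP32 math. The ALU counts and clocks are approximate public figures, so treat the results as ballpark assumptions, not benchmarks:

```swift
// Peak FP32 throughput ≈ ALUs × 2 ops per FMA × clock.
// Inputs are approximate public specs (assumptions, not measurements).
func tflops(alus: Double, clockGHz: Double) -> Double {
    alus * 2 * clockGHz / 1_000 // GFLOPS -> TFLOPS
}

let m1      = tflops(alus: 1024,  clockGHz: 1.28) // 8 cores × 128 ALUs ≈ 2.6 TFLOPS
let rtx3090 = tflops(alus: 10496, clockGHz: 1.70) // Ampere flagship ≈ 35.7 TFLOPS

print(m1, rtx3090, m1 / rtx3090) // ratio ≈ 0.07, i.e. roughly a tenth
```

TFLOPS across architectures isn't an apples-to-apples comparison, but the order of magnitude is the point.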
Rumors are that they're working on a dGPU. I'd go with that, considering the surprising power of the iGPU in the M1 chips.

I'm not really an expert in that, but it seems like Apple has more synergies to gain by trying to develop the best SoC in the world.
GPUs can be scaled up. I don't really expect RTX 3090-level performance in Apple Silicon any time soon, but the baseline performance will see a big boost.

I can't see Apple making a 350-watt space heater of a GPU any time soon either. Am I the only person who's annoyed at the out-of-control power consumption of high-end GPUs? Likewise, the mid- and low-end GPUs are completely stagnant, with Polaris still being AMD's low-end answer 4 years on, and Nvidia still recommending GTX 1650s.
GTX 1650 is Turing, so it isn't that old. Polaris has long legs thanks to the consoles; that should be changing soon. If anything, they appear to have stuck with Vega longer than it feels like they should have. The Zen 3 APU should be using RDNA (probably 2) this time around.
GTX 1650 is like half Turing: somewhere between Pascal and Turing, IIRC. Also, Vega stuck around because it was surprisingly efficient in APU form; it just didn't scale up. Either way, there's now a segregation between "high-end" GPUs, which use massive amounts of power, cost ludicrous amounts of money, and are on the latest architecture, and "low-end" GPUs, which use reasonable amounts of power for reasonable(ish) prices but are generations behind.
Everything we’ve heard so far about Apple Silicon sounds great in terms of performance, but one area I am still very curious about is the GPU and gaming. A lot of people are saying this will be the final nail in the coffin for any chance of high-end gaming on the Mac, but I’m not so sure. Apple has already been making some pretty impressive GPUs for its A-series chips in both the iPhone and iPad. There are games on the iPad that I would say are approaching console-level quality (not the newest consoles, of course), and let’s not forget that it is pushing a 120Hz, high-resolution screen.
I have a feeling we are going to be quite impressed with the graphics on the new Mac chips, since Apple can scale the GPU up and give it more cooling for better thermals. I could see a future 4-5 years from now where, as long as devs also make their games compatible with Apple Silicon, we could finally have high-end gaming on the Mac.
AMD and Nvidia can’t secure enough production for the full RDNA2 and Ampere parts, which is where they make the most money. We won’t see the “cut-down” parts for a while.
Actually, with Apple Silicon they're gonna do better than that, because they're shooting for yearly processor updates, so it's looking like each successive generation of both CPU and GPU is gonna be full-featured. Even if they fall short of 3090 territory, I think it's a better way.