
M3gatron

Suspended
Sep 2, 2019
799
605
Spain
No laptop with an OLED display is going to cost between $700 and $800. Maybe the screen itself costs that much?
Here, $750.

Asus has promised to put OLED screens in most of its laptops.
But this is beside the point; the point is the SoC and the iGPU's capability to play games.
 

EntropyQ3

macrumors 6502a
Mar 20, 2009
718
824
Yeah, it's possibly the moment that will truly revolutionize gaming on thin-and-light computers.
I also imagine that a desktop variant of that APU will put out an even more impressive show.
Unfortunately it's likely to be quite bandwidth-limited; there is no indication (yet! hope springs eternal) that these chips include something like a large "Infinity Cache" either. Shared 128-bit DDR5 isn't very much, even compared to the low-end Xbox Series S, never mind the four-times-wider memory subsystem of the M1 Max.

Other than that, though, it seems like a nice chip and a much healthier evolutionary path than 200W CPUs and 400W GPUs, so I hope to God AMD will actually start selling their nicer chips for the desktop AM5 platform. Dragon Range, targeted at beefier laptops, is probably a better candidate to eventually reach AM5.
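
For a rough sense of the gap, here's a back-of-the-envelope sketch (assuming LPDDR5-6400 on both the APU and the M1 Max; these are peak figures, not sustained):

```swift
// Peak DRAM bandwidth = (bus width in bytes) x (transfer rate).
func peakGBps(busBits: Double, megaTransfers: Double) -> Double {
    (busBits / 8) * megaTransfers / 1000
}

let apu   = peakGBps(busBits: 128, megaTransfers: 6400)  // ~102 GB/s
let m1Max = peakGBps(busBits: 512, megaTransfers: 6400)  // ~410 GB/s
// Xbox Series S: 128-bit GDDR6 at 14 Gbps ~ 224 GB/s, still over twice the APU.
print(apu, m1Max)
```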
 
Last edited:
  • Like
Reactions: Irishman

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
Shared 128-bit DDR5 isn't very much, even compared to the low-end Xbox Series S, never mind the four-times-wider memory subsystem of the M1 Max.
Apple's GPUs use TBDR (tile-based deferred rendering), as opposed to the IMR (immediate-mode rendering) of dGPUs, so for rasterisation tasks it should be OK.
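
To illustrate what that buys you in practice (a minimal sketch in Metal; the format and size are arbitrary): on a TBDR GPU, a transient render target can be declared memoryless, so it lives entirely in on-chip tile memory and costs no DRAM bandwidth at all.

```swift
import Metal

// A transient attachment (e.g. one layer of a G-buffer) that never
// round-trips to DRAM on Apple's TBDR GPUs.
let device = MTLCreateSystemDefaultDevice()!
let desc = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .rgba16Float, width: 1920, height: 1080, mipmapped: false)
desc.usage = .renderTarget
desc.storageMode = .memoryless  // tile memory only: no DRAM allocation or traffic
let transientTarget = device.makeTexture(descriptor: desc)
```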
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
Unfortunately it's likely to be quite bandwidth-limited; there is no indication (yet! hope springs eternal) that these chips include something like a large "Infinity Cache" either. Shared 128-bit DDR5 isn't very much, even compared to the low-end Xbox Series S, never mind the four-times-wider memory subsystem of the M1 Max.

More than enough when you consider the various bandwidth-saving technologies in these chips and, of course, the fact that quite a lot of advanced effects can be done without ever leaving the on-chip memory.
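
Metal's render-pass load/store actions are one such knob (a minimal sketch): anything a pass doesn't need afterwards can be dropped on-chip instead of being written out to DRAM.

```swift
import Metal

// Depth is produced and consumed within the pass, so never read it from
// DRAM at pass start and never write it back at pass end.
let pass = MTLRenderPassDescriptor()
pass.depthAttachment.loadAction  = .clear     // no DRAM read on pass begin
pass.depthAttachment.storeAction = .dontCare  // no DRAM write on pass end
```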
 
  • Like
Reactions: Irishman

EntropyQ3

macrumors 6502a
Mar 20, 2009
718
824
More than enough when you consider the various bandwidth-saving technologies in these chips and, of course, the fact that quite a lot of advanced effects can be done without ever leaving the on-chip memory.
Well, "performance" in computing is always relative, wouldn't you say? The bulk usage of PC GPUs is gaming, so it's meaningful to compare performance to the likely target platforms for demanding titles, which these days are the mains-powered consoles.
Still PS4/XBO, mind you, for some time yet, but that will eventually shift to Xbox Series/PS5.
At 5nm (and even more so at 3nm), the ALU capabilities of an ambitious APU will lag far less behind the consoles than the bandwidth will. We know that a hefty LLC can alleviate that to some extent, even though AMD has only used it on their dGPUs and avoided it on their APUs. But I'd still say that for the next few years, that will be the major weakness of PC integrated graphics vs. discrete GPUs, or for that matter vs. more unified approaches such as Apple's M1 Pro/Max/Ultra, which have much wider memory subsystems.
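
To sketch why a hefty LLC helps so much (illustrative numbers, not vendor specs):

```swift
// Simple mixed-bandwidth model: traffic that hits the LLC runs at cache
// speed, the rest goes out to DRAM.
let dramGBps  = 102.4   // 128-bit LPDDR5-6400
let cacheGBps = 1000.0  // hypothetical on-die LLC
let hitRate   = 0.5     // fraction of GPU traffic served by the LLC
let effective = hitRate * cacheGBps + (1 - hitRate) * dramGBps
print("Effective bandwidth ~ \(effective) GB/s")  // ~551 GB/s
```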
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
Well, "performance" in computing is always relative, wouldn't you say? The bulk usage of PC GPUs is gaming, so it's meaningful to compare performance to the likely target platforms for demanding titles, which these days are the mains-powered consoles.
Still PS4/XBO, mind you, for some time yet, but that will eventually shift to Xbox Series/PS5.
At 5nm (and even more so at 3nm), the ALU capabilities of an ambitious APU will lag far less behind the consoles than the bandwidth will. We know that a hefty LLC can alleviate that to some extent, even though AMD has only used it on their dGPUs and avoided it on their APUs. But I'd still say that for the next few years, that will be the major weakness of PC integrated graphics vs. discrete GPUs, or for that matter vs. more unified approaches such as Apple's M1 Pro/Max/Ultra, which have much wider memory subsystems.

But that is the interesting part. Apple GPUs punch way above their assumed weight in rasterisation workflows. It's a little bit of a different philosophy: traditional GPUs need huge VRAM bandwidth to hide latency and to support bandwidth-heavy rendering approaches (e.g. deferred rendering). These are less of a concern for Apple GPUs, especially after optimisations. And the A15/M2 GPU even has support for frame buffer texture compression, which is another bandwidth-saving technique (most likely for their upcoming VR headset).

In other words, any GPU requires X points of bandwidth per Y points of ALU throughput to achieve balanced performance. But the required ratio X/Y is considerably lower on Apple Silicon, at least for graphical tasks. Getting around bandwidth requirements for compute is trickier, but then again, techniques like bandwidth compression and large caches help.
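
To put rough numbers on that X/Y ratio (a sketch using public headline figures; sustained throughput will differ):

```swift
// Bytes of DRAM bandwidth available per FLOP of shader throughput.
func bytesPerFlop(gbPerSec: Double, tflops: Double) -> Double {
    gbPerSec / (tflops * 1000)
}

let m1      = bytesPerFlop(gbPerSec: 68.25, tflops: 2.6)  // ~0.026 B/FLOP
let seriesS = bytesPerFlop(gbPerSec: 224,   tflops: 4.0)  // ~0.056 B/FLOP
// The M1 ships with roughly half the bandwidth per FLOP of a conventional
// console design yet stays competitive in rasterisation, which is
// consistent with TBDR needing a lower X/Y.
```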
 

EntropyQ3

macrumors 6502a
Mar 20, 2009
718
824
But that is the interesting part. Apple GPUs punch way above their assumed weight in rasterisation workflows. It's a little bit of a different philosophy: traditional GPUs need huge VRAM bandwidth to hide latency and to support bandwidth-heavy rendering approaches (e.g. deferred rendering). These are less of a concern for Apple GPUs, especially after optimisations. And the A15/M2 GPU even has support for frame buffer texture compression, which is another bandwidth-saving technique (most likely for their upcoming VR headset).

In other words, any GPU requires X points of bandwidth per Y points of ALU throughput to achieve balanced performance. But the required ratio X/Y is considerably lower on Apple Silicon, at least for graphical tasks. Getting around bandwidth requirements for compute is trickier, but then again, techniques like bandwidth compression and large caches help.
Oh, I fully agree that Apple's GPUs are in a very nice position in terms of ALU vs. bandwidth when directly targeted by the code! (As opposed to running code targeted at Nvidia cards under Windows.)

I look forward to them releasing their developer videos on Metal 3 in general (today), and on their new upscaling in particular (tomorrow).

I have a feeling that the performance of Resident Evil on the new MacBook Air could turn out quite nice.
Pity I don’t like the genre.
 

leman

macrumors Core
Oct 14, 2008
19,522
19,679
Regarding Metal 3, what's often overlooked is feature availability. For example, mesh shaders are currently supported on only 4% of Vulkan devices, limited to some Nvidia models. But Metal 3 brings mesh shaders to all Macs sold in the last 4-5 years or so. That's a huge thing.
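
For the curious, setting up a mesh pipeline in Metal 3 looks roughly like this (a sketch: the shader function names are hypothetical placeholders, and I'm assuming the Metal 3 GPU-family check is the right availability gate):

```swift
import Metal

let device = MTLCreateSystemDefaultDevice()!
precondition(device.supportsFamily(.metal3), "mesh shaders need a Metal 3 device")

let library = device.makeDefaultLibrary()!
let desc = MTLMeshRenderPipelineDescriptor()
desc.objectFunction   = library.makeFunction(name: "cullMeshlets")   // hypothetical name
desc.meshFunction     = library.makeFunction(name: "emitMeshlet")    // hypothetical name
desc.fragmentFunction = library.makeFunction(name: "shadeFragment")  // hypothetical name
desc.colorAttachments[0].pixelFormat = .bgra8Unorm

let (pipeline, _) = try! device.makeRenderPipelineState(descriptor: desc, options: [])
```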
 

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,665
OBX
Regarding Metal 3, what's often overlooked is feature availability. For example, mesh shaders are currently supported on only 4% of Vulkan devices, limited to some Nvidia models. But Metal 3 brings mesh shaders to all Macs sold in the last 4-5 years or so. That's a huge thing.
Are they doing mesh shading in software? Because none of the RDNA 1 MacBook Pros support it in hardware (nor do any of the current Intel GPUs, AFAIK).

So really it would only be Apple Silicon Macs that could take advantage of it (since we are ignoring Mac Pros with RDNA 2 GPUs).
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
So really it would only be Apple Silicon Macs that could take advantage of it (since we are ignoring Mac Pros with RDNA 2 GPUs).
 
  • Like
Reactions: Irishman

leman

macrumors Core
Oct 14, 2008
19,522
19,679
Are they doing mesh shading in software? Because none of the RDNA 1 MacBook Pros support it in hardware (nor do any of the current Intel GPUs, AFAIK).

So really it would only be Apple Silicon Macs that could take advantage of it (since we are ignoring Mac Pros with RDNA 2 GPUs).

Depends on what one understands by "in software" or "in hardware". I mean, one could make a good case that Apple Silicon does pixel shading "in software", as it seems to lack any dedicated shading hardware: just compute shaders dispatched over buffers of data...

Mesh shading is ultimately about running some compute shaders in tandem with the rasteriser. The original mesh shader design was introduced by Nvidia, for Nvidia hardware, and it is entirely possible that it requires some special capabilities that RDNA 1 lacked; no idea. It is also entirely possible that Apple has designed their mesh shading API to be easily implementable on any hardware, or that on some hardware you will pay some sort of overhead. I haven't looked at the specs yet, and one would need to do targeted benchmarks to figure these things out.
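
As a sketch of what "compute in tandem with the rasteriser" looks like at the API level (grid sizes are illustrative; this continues the pipeline sketch from my earlier post):

```swift
import Metal

// Object threadgroups run first (e.g. per-meshlet culling); survivors spawn
// mesh threadgroups whose output primitives feed the rasteriser directly.
func encodeMeshletDraw(_ encoder: MTLRenderCommandEncoder,
                       pipeline: MTLRenderPipelineState,
                       meshletCount: Int) {
    encoder.setRenderPipelineState(pipeline)
    encoder.drawMeshThreadgroups(
        MTLSize(width: meshletCount, height: 1, depth: 1),
        threadsPerObjectThreadgroup: MTLSize(width: 32, height: 1, depth: 1),
        threadsPerMeshThreadgroup: MTLSize(width: 32, height: 1, depth: 1))
}
```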
 
  • Like
Reactions: Irishman

leman

macrumors Core
Oct 14, 2008
19,522
19,679
Right, but mesh shader hardware support only exists on the RX 6000 series and Apple Silicon.

Why do you assume that? Furthermore, why do you assume that mesh shading even needs any special "hardware"?

So while I don't disagree that Metal 3 will work on all the listed GPUs, they must be doing something on the CPU to get mesh shading (or doing some sort of conversion) on the hardware that doesn't support mesh shaders natively.

They will certainly do no such thing. Metal is a modern, close-to-the-metal API. It's not OpenGL; it doesn't emulate features that can't be implemented on a fast path. If it were not possible to implement mesh shaders efficiently on Intel, they would not support them at all.

By the way, that's also why Metal doesn't support geometry shaders: because they are a fundamentally broken design that can't be implemented efficiently on any hardware.
 
  • Like
Reactions: Irishman

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,665
OBX
Why do you assume that? Furthermore, why do you assume that mesh shading even needs any special "hardware"?



They will certainly do no such thing. Metal is a modern, close-to-the-metal API. It's not OpenGL; it doesn't emulate features that can't be implemented on a fast path. If it were not possible to implement mesh shaders efficiently on Intel, they would not support them at all.

By the way, that's also why Metal doesn't support geometry shaders: because they are a fundamentally broken design that can't be implemented efficiently on any hardware.
My assumption comes from the Windows/Linux side: if AMD's Vega/5000 series (or Intel UHD) supported mesh shading, it seems like they would have implemented it by now. Unless Apple is doing something fundamentally different from what is happening on the DirectX/Vulkan side of the house, which is 100% possible.
 
  • Like
Reactions: Irishman

Huntn

macrumors Penryn
May 5, 2008
24,004
27,087
The Misty Mountains
Edit: December 13th, 2021:

Apple will sell about 27 million Macs in 2021. Assume that 80% are M1/M1 Pro/M1 Max Macs; then Apple will have sold roughly 22 million Macs capable of playing AAA games.

According to IDC, the number of gaming computers (desktops and laptops) sold in 2021 is projected to be 47 million. Thus, Macs are roughly a third (about 32%: 22M out of 22M + 47M) of the computers sold in 2021 capable of playing AAA titles.




Original Post below

TL;DR: Within 3 years, basic math suggests Macs will make up 50% of all AAA-capable computers sold each year. It will finally make financial sense for AAA developers to port games to macOS.

Before Apple Silicon:

  • Apple will ship ~17.5m Macs this year, representing about 11-12% of the total U.S. PC market and 7-9% worldwide.
  • A very small percentage of those Macs can play any AAA games
  • If 20% of Macs sold are 16" MacBook Pros or better with a 5300M or better GPU, then if developers port their AAA games to the Mac, they are increasing the game's audience by only ~2% (0.20 * 0.11). That's a lot of work for a very small audience gain.
  • Hence, AAA games rarely get ported to Macs
After Apple Silicon:
  • The M1 is as fast as a 1050 Ti in gaming
  • The 1050 Ti is the second most common GPU according to the Steam survey
  • This means AAA developers have to make games playable on a 1050 Ti-class GPU
  • For CPU, the M1 is more than 2x faster than the most common Steam CPUs in both single-threaded and multi-threaded benchmarks
  • Cyberpunk 2077 is the most demanding AAA game this year and targets an RX 470 and an i5-3570K in its minimum requirements. The M1 is nearly as fast as the RX 470 and more than 2x faster than the i5-3570K. (Note: this is not saying that Cyberpunk is playable on the M1; I'm only comparing its minimum requirements to the M1.)
  • The M1 will be the slowest Mac chip Apple will ever make. Expect Apple Silicon chips to get much more powerful.
  • Ming-Chi Kuo predicts that Mac shipments will increase by 100% within 3 years due to Apple Silicon, which means Macs will ship 35m units in 2023.
  • Every single one of the 35m Macs sold will be capable of playing AAA games from low to high settings
  • For comparison, the total number of PC gaming computers sold in 2019 was 35m
  • All this means that within 3 years, Macs will be 50% of all AAA-capable computers sold each year (35m of 70m)
  • For AAA developers, that means the Mac gaming market goes from ~2% right now to about 50% within 3 years
Tim Cook wants this. Read the attached email chain between Tim Cook and his lieutenants.
That is, if you can find the game you want that runs on macOS… 🤔
 
  • Like
Reactions: Janichsan

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,665
OBX
So what do we think about July's Steam hardware survey? Looks like macOS lost market share in July. The Apple M2 is already showing up at 0.22% (which is cool).
 
  • Like
Reactions: Irishman

Colstan

macrumors 6502
Jul 30, 2020
330
711
So what do we think about July's Steam hardware survey? Looks like macOS lost market share in July. The Apple M2 is already showing up at 0.22% (which is cool).
I've noticed discrepancies within the Steam survey over the years. There was one month where Linux went from over 2% to less than 1%. That's not a realistic fluctuation in just four weeks; Valve changed their metrics, not the market. Also, I noticed that the 1060 and 1050 Ti actually gained market share. I can't see a lot of those cards being sold right now, but I think that's just noise and they're most likely holding steady.

I would note that the reason the 1050 and 1060 series have such a high market share is that they are commonly used in gaming cafés in Asia. There aren't many of those in Europe or North America. That skews the Steam data toward those cards, because those PCs are in use far more than an average home computer and thus more likely to receive a submission request.

I like to look at the general trends over many months, which reduces some of the hiccups in the survey. Oh, and yeah, it's great to see the M2 make its debut, as well.
 

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,665
OBX
I've noticed discrepancies within the Steam survey over the years. There was one month where Linux went from over 2% to less than 1%. That's not a realistic fluctuation in just four weeks; Valve changed their metrics, not the market. Also, I noticed that the 1060 and 1050 Ti actually gained market share. I can't see a lot of those cards being sold right now, but I think that's just noise and they're most likely holding steady.

I would note that the reason the 1050 and 1060 series have such a high market share is that they are commonly used in gaming cafés in Asia. There aren't many of those in Europe or North America. That skews the Steam data toward those cards, because those PCs are in use far more than an average home computer and thus more likely to receive a submission request.

I like to look at the general trends over many months, which reduces some of the hiccups in the survey. Oh, and yeah, it's great to see the M2 make its debut, as well.
Valve has never positioned the Steam hardware survey as a means of tracking the hardware market, have they? I cannot find it now, but I could have sworn I saw a thread somewhere insinuating that Valve internally would like to stop doing the hardware survey, but it seems that developers have actually been using the data to set minimum hardware requirements.


Supposedly they fixed the gaming-café issue in 2019, so 1060s should no longer be over-represented.



Narrowing it down to just Apple, I find it interesting how macOS reports the resolution of internal panels (using the HiDPI resolution, not the native one), because it makes it look like most macOS users are running really low resolutions.
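
You can see this straight from AppKit (a sketch; the numbers in the comments assume a 14-inch MacBook Pro at the default scaled resolution):

```swift
import AppKit

// macOS hands apps the scaled "looks like" size in points, not the panel's
// native pixels, so a survey reading the former sees a "low" resolution.
if let screen = NSScreen.main {
    let pts   = screen.frame.size           // e.g. 1512 x 982 points
    let scale = screen.backingScaleFactor   // e.g. 2.0 on Retina panels
    print("Reported: \(Int(pts.width)) x \(Int(pts.height))")
    print("Backing pixels: \(Int(pts.width * scale)) x \(Int(pts.height * scale))")  // 3024 x 1964
}
```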
 
  • Like
Reactions: Colstan