
jeanlain

macrumors 68020
Mar 14, 2009
2,459
953
Apple is not targeting GeForce RTX levels of performance right now. They just need to do better than competing low-power GPUs, and what they showed with Tomb Raider suggests that they will.
I can talk at length about the Tomb Raider demo, but let's just say that no current laptop iGPU can achieve what was shown.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Compared to Intel's iGPUs. Not AMD's or Nvidia's dedicated chips.

Compared to whatever model the Apple Silicon Mac in question replaces :) And no, an 8-core, 8-watt Apple GPU is not going to outperform a 50-watt AMD dGPU. A 16-core, 30-watt one, though...

At any rate, I don’t expect “larger” Apple GPUs to appear before mid/late 2021. At the same time, I am sure that they will easily outperform anything else in the lower-mid range. I am not confident that Apple will ever be able to deliver high-end performance, but in the sub-50-watt space, no forward renderer can compete with a TBDR architecture.
 

Falhófnir

macrumors 603
Aug 19, 2017
6,146
7,001
Compared to Intel's iGPUs. Not AMD's or Nvidia's dedicated chips.
They've got to have something pretty potent in the graphics department, or they wouldn't be jettisoning AMD, which they have stated is their aim in the WWDC developer slides. They're not going to make a multibillion-dollar decision while leaving something like that unresolved!
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
So tired of that crappy demo being used as an example. You can run that demo well on an Intel iGPU at the lowest settings and resolution, which is what that demo seemed to be. It was definitely not even 1080p ultra mode, that's for sure.

Show me RDR2/Cyberpunk running well on AS at a decent resolution and settings, and I'll say Apple Silicon is now AAA tier...

That COD title is NOT the console/desktop version. It's most likely going to be a watered-down iOS version, as most are...
Mobile Call of Duty uses Unity instead of the engine they use for console and PC. I would be surprised if they decided to port Black Ops Cold War to iOS.
Apple is not targeting GeForce RTX levels of performance right now. They just need to do better than competing low-power GPUs, and what they showed with Tomb Raider suggests that they will.
I can talk at length about the Tomb Raider demo, but let's just say that no current laptop iGPU can achieve what was shown.
Not even Tiger Lake with the new Xe GPU?
 

Waragainstsleep

macrumors 6502a
Oct 15, 2003
612
221
UK
So tired of that crappy demo being used as an example. You can run that demo well on an Intel iGPU at the lowest settings and resolution, which is what that demo seemed to be. It was definitely not even 1080p ultra mode, that's for sure.

Show me RDR2/Cyberpunk running well on AS at a decent resolution and settings, and I'll say Apple Silicon is now AAA tier...

That COD title is NOT the console/desktop version. It's most likely going to be a watered-down iOS version, as most are...

There's already an iOS version. This is a new one, which I assume will be better, or what's the point? Activision is describing it as AAA; I suspect they know better than you. But no, Apple Silicon isn't AAA tier yet; the higher-end hardware isn't out yet.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
There's already an iOS version. This is a new one, which I assume will be better, or what's the point? Activision is describing it as AAA; I suspect they know better than you. But no, Apple Silicon isn't AAA tier yet; the higher-end hardware isn't out yet.
What is funnier is that the mobile version is made by Tencent, not Infinity Ward or Treyarch.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,459
953
Not even Tiger Lake with the new Xe GPU?
Here you go. The Intel Xe G7 achieves 25 fps average at 1080p lowest. I've determined that the demo run at WWDC used some medium settings (including ambient occlusion and normal LoD) with low texture quality. It ran at 1080p, as Apple said themselves, and at >30 fps, since there are almost no dropped frames in the 30 fps video stream.
I've also determined that the game section played at WWDC has a slightly lower frame rate on average than the benchmark scenes overall (on my Mac, at least).

So we have 25 fps at lowest settings for the Intel Xe versus >30 fps at medium settings with low texture quality (and filtering) for the Apple part running under emulation.
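
For anyone who wants to check the dropped-frame claim on their own capture, the idea is simply to compare consecutive frames of the 30 fps stream. Here is a rough sketch of that check; OpenCV and the difference threshold are my own assumptions for illustration, not what anyone actually used:

```cpp
// Rough sketch: count near-duplicate consecutive frames in a video capture.
// If almost no frames repeat, the game rendered at least as fast as the
// stream's frame rate (30 fps here).
#include <opencv2/opencv.hpp>
#include <iostream>

int main(int argc, char **argv) {
    if (argc < 2) { std::cerr << "usage: framedup <video>\n"; return 1; }
    cv::VideoCapture cap(argv[1]);
    if (!cap.isOpened()) { std::cerr << "cannot open " << argv[1] << "\n"; return 1; }
    cv::Mat prev, cur, diff;
    int pairs = 0, dups = 0;
    while (cap.read(cur)) {
        cv::cvtColor(cur, cur, cv::COLOR_BGR2GRAY);
        if (!prev.empty()) {
            cv::absdiff(cur, prev, diff);
            // The threshold is arbitrary: compression noise means two
            // "identical" frames are never bit-for-bit equal.
            if (cv::mean(diff)[0] < 1.0) ++dups;
            ++pairs;
        }
        prev = cur.clone();
    }
    std::cout << dups << " duplicated frames out of " << pairs << " pairs\n";
}
```

Almost no duplicates in the WWDC stream is exactly what the >30 fps claim rests on.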
 
Last edited:

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
Here you go. The Intel Xe G7 achieves 25 fps average at 1080p lowest. I've determined that the demo run at WWDC used some medium settings (including ambient occlusion and normal LoD) with low texture quality. It ran at 1080p, as Apple said themselves, and at >30 fps, since there are almost no dropped frames in the 30 fps video stream.
I've also determined that the game section played at WWDC has a slightly lower frame rate on average than the benchmark scenes overall (on my Mac, at least).

So we have 25 fps at lowest settings for the Intel Xe versus >30 fps at medium settings with low texture quality (and filtering) for the Apple part running under emulation.
Intel showed a more complex scene than Apple did, so I am not sure how that is comparable.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,459
953
As I said, the average frame rate during the benchmark is slightly higher than the typical frame rate during the scene demoed at WWDC (which does not vary much). I've checked it myself. In other words, the WWDC section does not run more fluidly than the average of the benchmark sections (this is partly due to the high frame rate at the beginning of the third benchmark section, which only shows mountains).
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Here you go. The Intel Xe G7 achieves 25 fps average at 1080p lowest. I've determined that the demo run at WWDC used some medium settings (including ambient occlusion and normal LoD) with low texture quality. It ran at 1080p, as Apple said themselves, and at >30 fps, since there are almost no dropped frames in the 30 fps video stream.
I've also determined that the game section played at WWDC has a slightly lower frame rate on average than the benchmark scenes overall (on my Mac, at least).

So we have 25 fps at lowest settings for the Intel Xe versus >30 fps at medium settings with low texture quality (and filtering) for the Apple part running under emulation.

Eh, I don't know. Extrapolating from the numbers we have on available benchmarks, I'd say that the Intel Xe (the 28W TDP version) is at least 30-50% faster than the A12Z. I am really skeptical about the Tomb Raider benchmarks; there are too many factors at play here.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,459
953
Extrapolating from the numbers we have on available benchmarks,
Which benchmarks? It's hard to find a test tool that runs on both Intel and Apple GPUs. I found results from GFXBench that show the A12Z to be much faster than the "Intel Xe graphics" (I don't know if it was the model with 96 EUs or not).
I don't think it's hard to extrapolate from the Tomb Raider demo. I've just checked again: performance is about the same as in the benchmark section (on my Mac). Frame rate is quite homogeneous throughout the game (the first part of the third benchmark section being an exception, and that's not a scene you can see in game). So when the Intel Xe yields 25 fps on average at the lowest settings and the Apple GPU yields >30 fps at low-medium settings, it's safe to say that the Apple GPU is much better at running this game.
Finally, we don't know for certain that the WWDC demo was run on the A12Z. But that YouTube video appears to use the 28-W Intel part...

EDIT: the Intel part is much better than the A12Z on "low-level" GFXBench tests.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Which benchmarks? It's hard to find a test tool that runs on both Intel and Apple GPUs. I found results from GFXBench that show the A12Z to be much faster than the "Intel Xe graphics" (I don't know if it was the model with 96 EUs or not).

You know what, you might be right. The Xe-LP seems to do rather well in 3DMark, with many YouTubers comparing it to the MX350. But the MX350 is just 30-40% faster than the MX150, and we know that already the 2018 iPad outperforms the MX150 by a healthy margin. Furthermore, the GFXBench results are compatible with all these claims (Xe being similar to the MX350, etc.). But we are still mostly wading through murky water here; without a properly controlled environment, most of these tests are just guesses...
 

jeanlain

macrumors 68020
Mar 14, 2009
2,459
953
How do you explain the difference between high-level and low-level test results on GFXBench? I like to think that the Apple GPU makes much better use of the hardware. The Intel part may have more compute power on paper, but it does not take advantage of it as well as the Apple part does. This may also be due to half-precision shaders being available on the Apple GPUs.
 

Waragainstsleep

macrumors 6502a
Oct 15, 2003
612
221
UK
Nah, the mobile version straight up uses Unity. The other versions use Infinity Ward's in-house engine (the IW engine), which apparently is based on a heavily modified id Tech 3.

I mean the new game they are building is a version of Warzone. I have no idea what engine it will use.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
How do you explain the difference between high-level and low-level test results on GFXBench? I like to think that the Apple GPU makes much better use of the hardware. The Intel part may have more compute power on paper, but it does not take advantage of it as well as the Apple part does. This may also be due to half-precision shaders being available on the Apple GPUs.
I brought up the concern about half-precision shaders before. We don't know if SotTR was ported using half-precision shaders on the Mac either.
The other annoying thing I found with GFXBench is that on the Windows side, you can really only use the DX11 renderer for some comparisons. For some reason they didn't port all the tests to DX12 or Vulkan.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,459
953
SotTR cannot use half-precision shaders, since they are not supported by current desktop GPUs. It is my understanding that Metal provides support for half-precision shaders on iOS only (for now).
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
SotTR cannot use half-precision shaders, since they are not supported by current desktop GPUs. It is my understanding that Metal provides support for half-precision shaders on iOS only (for now).
Ignoring Xe, half-precision shaders are supported by desktop GPUs (at least those from the last two generations), but on Windows the available APIs tend to just map the request to single precision anyway.

@leman would have to speak to Metal support for half precision in macOS.
 

Janichsan

macrumors 68040
Oct 23, 2006
3,126
11,925
I mean the new game they are building is a version of Warzone. I have no idea what engine it will use.
You mean this one? Note the liberal use of the words "maybe", "might", "could", and other subjunctive forms.

SotTR cannot use half-precision shaders, since they are not supported by current desktop GPUs. It is my understanding that Metal provides support for half-precision shaders on iOS only (for now).
Half-precision floating point formats have been supported on other vendors' GPUs for years. If Metal didn't support them, that's on Apple.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,459
953
I doubt that SotTR differs between the Mac and PC versions with respect to shader precision. Someone from Feral would be able to clarify.
Quoting AnandTech from 2018:
But for better or worse, when the PC moved to unified shaders, the industry moved to FP32 for all GPU functions. This is as opposed to the mobile world, where power is an absolute factor for everything. Vertex shaders are typically 32bpc, while pixel and compute shaders can often be 16bpc. We’ve seen some movement on the PC side to use half-precision GPUs for compute, but for gaming, that’s not currently the case.

EDIT: Apparently, things may have changed since then. But it doesn't really matter. Comparisons have shown that the Metal version of SotTR does not run better than the DX12 version on the same hardware. It's slightly slower, in fact, so there is no evidence of some "unfair" advantage for the macOS version. If anything, the opposite may be true, since the Metal version is ported from the Windows version using a translation tool (Feral's IndirectX).
 
Last edited:

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
You mean this one? Note the liberal use of the words "maybe", "might", "could", and other subjunctive forms.


Half-precision floating point formats have been supported on other vendors' GPUs for years. If Metal didn't support them, that's on Apple.
Interestingly, it looks like they are hiring for Santa Monica (we assume Treyarch), even though Warzone was done by Infinity Ward. The original CoD:M was done by Tencent.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
I brought up the concern about half-precision shaders before. We don't know if SotTR was ported using half-precision shaders on the Mac either.

I am also curious about benchmarks such as GFXBench. Does it use reduced precision on mobile devices? There is just no information. GFXBench has a really helpful render-quality test, but it doesn't seem to be available in the iOS version...


SotTR cannot use half-precision shaders, since they are not supported by current desktop GPUs. It is my understanding that Metal provides support for half-precision shaders on iOS only (for now).

Desktop GPUs have supported half precision natively for a while, depending on the vendor. Modern GPUs (AMD Vega and later, Nvidia Turing and later, Intel since Skylake) in addition have a double-rate FP16 implementation (meaning that working in FP16 is twice as fast as working in FP32). Apple GPUs have supported double-rate FP16 since forever (I don't know when exactly; they likely inherited it from the PowerVR GPUs).

Bottom line: all Macs shipped since 2016 have GPUs with native FP16 support, and most of them have double-rate FP16 throughput as well.

And yes, Metal fully supports half precision on macOS.
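
For reference, this is what it looks like on the shader side. A minimal sketch (the kernel name and buffer layout are invented for illustration), but half types compile for macOS targets just as they do for iOS:

```cpp
// Metal Shading Language (a C++ dialect). Illustrative sketch only.
#include <metal_stdlib>
using namespace metal;

// 'half' is a first-class type in MSL on macOS, not just iOS. On GPUs
// with double-rate FP16, arithmetic on half values can run at twice
// the FP32 throughput.
kernel void scale_fp16(device const half *input  [[buffer(0)]],
                       device half       *output [[buffer(1)]],
                       constant float    &factor [[buffer(2)]],
                       uint id [[thread_position_in_grid]])
{
    output[id] = input[id] * half(factor);
}
```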
 