
diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
Agree. Anybody trying to defame Feral by generalizing and claiming they're "known" for quick-and-dirty ports with poor performance and graphics can't be aware of their good reputation in the Mac community, nor can they have played many Feral ports. Feral is famous for good-quality ports and excellent customer support, and I'm not speaking only from my own experience.

I haven't played all their games since I'm more into FPS and action games, but I can only think of two with "bad" performance: the latest Warhammer 3 and the old OpenGL-based Sleeping Dogs, and both of those were bad PC ports to begin with.

Speaking of poor graphics, it's funny that people seem to ignore the fact that Feral was nominated and one of the finalists at WWDC 2022 for an Apple Design Award in the Visuals and Graphics category for their mobile port of Alien Isolation: "Winners in this category feature stunning imagery, skillfully drawn interfaces, and high-quality animations that lend to a distinctive and cohesive theme."

Here are some other facts about Feral ports and their graphics quality every game programmer should know before passing judgment:

- Shadow of the Tomb Raider was nominated for "Art Direction" and "Lighting/Texturing" in 2019.

- Life is Strange 2 was nominated for "Best Graphics" in 2019.

- Hitman won the award for "Best Game Design" and was nominated for another "Best Game Design" and "Best Visual Design" in 2017.

- XCOM 2 was nominated for "Game Design, Franchise" in 2017.

- Rise of the Tomb Raider won awards for "Art Direction" and "Lighting/Texturing" and was nominated for "Outstanding Achievement in Art Direction", "Excellence in Visual Achievement" and "Excellence in SFX" in 2016.

- Deus Ex: Mankind Divided won "Best Game Design" in 2016.

- Alien Isolation won the award for "Lighting/Texturing" and was nominated for "Game Design" and "Outstanding Real-Time Visuals in a Video Game" in 2015.

- Shadow of Mordor won awards for "Game Design", "Outstanding Achievement in Game Design" and "Excellence in Design and Direction" and was nominated for "Best Design", "Art Direction, Fantasy" and "Lighting/Texture" in 2015.

- Batman: Arkham City won the award for "Lighting/Texturing" and was nominated for "Best Design", "Outstanding Achievement in Art Direction", "Best Game Design" and "Best Graphics" in 2011-2012.

- Deus Ex: Human Revolution won the award for "Best Design" and was nominated for "Best Visual Arts" in 2012.

- Bioshock 2 was nominated for "Art Direction, Fantasy", "Game Design, Franchise" and "Lighting/Texturing" in 2011.
Are these awards for Feral's ports or for the games in general? I'm not sure how Feral would have any bearing on the awards given out. Do they collaborate with the IP owners and change the games in a material way?
 
  • Like
Reactions: orionquest

leman

macrumors Core
Oct 14, 2008
19,521
19,678
Are these awards for Feral's ports or for the games in general? I'm not sure how Feral would have any bearing on the awards given out. Do they collaborate with the IP owners and change the games in a material way?

And to add to that (not that I want to pick on Feral or anything), it's not like they have much competition in the market of porting high-budget games to Mac. It's a little bit weird to mention all the awards if there is only one company in this space.
 
  • Like
Reactions: Unregistered 4U

Homy

macrumors 68030
Jan 14, 2006
2,510
2,461
Sweden
Are these awards for Feral's ports or for the games in general? I'm not sure how Feral would have any bearing on the awards given out. Do they collaborate with the IP owners and change the games in a material way?

And to add to that (not that I want to pick on Feral or anything), it's not like they have much competition in the market of porting high-budget games to Mac. It's a little bit weird to mention all the awards if there is only one company in this space.

The first Apple Design Award is obviously for Feral's mobile port, as I wrote. The other awards are for the games themselves, so I'm not trying to give Feral credit for those. My post was not about your previous comment, Leman, about Feral's games being hit or miss for you. As I mentioned myself, there are some ports with poor performance, but as you said yourself, that's because of the bad original PC games they have to work with.

My point was of course that the claim about Feral being "known" for bad ports with poor graphics, which has been repeated several times, is simply false given all these visually award-winning games and ports. To say those ports have poor graphics is to claim that Feral has somehow messed up the quality and made them look like crap, which is incorrect. The graphics quality is the same between the ports and the original games. That's also the only reason a port can have poor graphics: the original game and its defects. A self-proclaimed experienced programmer should know this before making a biased general assumption about the subject instead of looking at the facts about the games/ports.
 
  • Like
Reactions: Irishman and leman

orionquest

Suspended
Mar 16, 2022
871
791
The Great White North
Are these awards for Feral's ports or for the games in general? I'm not sure how Feral would have any bearing on the awards given out. Do they collaborate with the IP owners and change the games in a material way?
Besides, were there any other titles in these categories? These awards sound like "just for showing up" type awards.
 

Homy

macrumors 68030
Jan 14, 2006
2,510
2,461
Sweden
Besides, were there any other titles in these categories? These awards sound like "just for showing up" type awards.
Not sure what your point is. When a game is nominated or wins for its visual quality among the many other games on the market, it means it doesn't have poor graphics, contrary to what has been said here about those games/ports. It's as simple as that. If you want to know more about the games you can just check IMDb or Wikipedia.
 
Last edited:
  • Like
Reactions: Irishman

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
- XCOM 2 was nominated for "Game Design, Franchise" in 2017.
I don't know whether to blame Feral or Firaxis, but XCOM 2 ran terribly in my experience. There was also a graphical bug after some update where PlayStation buttons would show in the menus.

That's beside the fact that XCOM 2 is just a bad game in general.
 
  • Like
Reactions: Irishman

Homy

macrumors 68030
Jan 14, 2006
2,510
2,461
Sweden
I had hoped that Apple GPUs would perform well in the latest Feral game (Total War: Warhammer 3, which requires Apple Silicon), but alas, they don't.

In what situations do M1(X) GPUs perform as advertised by Apple? I mean besides GFXBench Aztec high...

As others have said already, Warhammer 3 has poor performance on PC too, and requires an RX 6700 XT 12 GB or RTX 3060 Ti 8 GB to deliver a locked 60 fps at 1080p ultra. The game is also not completely native and still runs through Rosetta. The reason it requires M1 is that there are problems with the game on AMD GPUs in Macs, and Feral is working to bring it to Intel Macs too. Another thing to remember when testing: you have to connect your laptop to a power outlet, otherwise you get locked to 30 fps. If we compare the results, the M1 Ultra 64c performs somewhere between an RX 5700 XT and an RTX 2070 Super in the game, and the M1 Max 32c performs like a GTX 1660 Super or RTX 3050 at those settings.

So no, it’s not Feral’s fault. Some suspect it's also Denuvo's fault but even PC reviewers admit the requirements are super high. Here are some conclusions from PC reviewers:

Techpowerup: "For 1080p Full HD at 60 FPS, you should have an RX 6700 XT or RTX 3060 Ti at least, and for fluid 1440p, an RX 6800 XT, RTX 3070 Ti, or RTX 2080 Ti is close enough with 58.6 FPS. For fluid 4K, you'll have to wait for faster hardware as not even the mighty RTX 3090 gets 60 FPS and is rather far from that goal with just 44.7 FPS. The RX 6900 XT, AMD's fastest card, only manages 35.5 FPS. No, there is no secret always-on ray tracing in Total War: Warhammer III as it's a DirectX 11 game.

These requirements are SUPER high, and totally unreasonable given the graphics offered. Sure, there's lots of units on-screen, but other than that, I'm not seeing anything we haven't seen in other titles, with much better performance. On Reddit, some conspiracy theories tout that the reason for the low FPS is the Denuvo copy protection. Apparently, press review builds came without Denuvo and performance was much better. I'm not convinced of that yet, as the Denuvo developers are very clear in their instructions on how to properly implement their protection with minimal effect on performance. Maybe some kind of bug that sneaked into the final release build will be fixed down the road. Either way, let's hope Creative Assembly can identify the problem and fix it. We used the latest game-ready drivers from both AMD and NVIDIA for all our testing, so I doubt we can expect more optimizations from the GPU vendor side.

Until then, you can only reduce details, which is easy as there are lots of graphics options. The difference between the graphics presets is quite reasonable, and the FPS gains are good. For example, going from Ultra to High will give you around 20% higher FPS, and going to Medium even adds +50% to the framerates. Once again, for a strategy game, graphics aren't the most important thing, but it's still sad that developers (of all genres) keep releasing such unoptimized titles."

Kitguru: "I've been using Total War games as part of my GPU reviews for a few years now and these games have always been pretty demanding on your hardware, but Warhammer III definitely takes it to the next level.

Of course, there's no denying that there is a lot going on during the in-game battles – there are often significant numbers of units on screen, and they use quite detailed character models considering how many there can be and the fact that the player is usually viewing the map from a relatively high position. The game's hero characters can also produce a lot of particle and alpha effects which can stress the GPU, as do the detailed map environments.
Still, all that said, there is no denying GPU requirements are punishingly high. For 1080p Ultra settings, the RX 6700 XT was the first GPU we tested able to deliver a locked 60FPS, and at 1440p only the RX 6800 XT or faster kept the 1% lows above 60FPS.

Nvidia GPUs also have a clear issue with frame pacing. Total War games do usually favour Nvidia hardware, and that’s certainly what we saw when looking at the average frame rates, but frame pacing and the 1% lows are arguably even more important, and right now there is a lot of inconsistency in the frame times for Nvidia GPUs, even at Medium settings. That could well be why we’ve seen so many reports of poor performance and it’s definitely something Nvidia and Creative Assembly need to look into.

The good news is we found the game’s Medium preset to run significantly better, delivering playable frame rates on every GPU we tested at 1080p, even for slower cards like the GTX 1650. Granted it doesn’t look as detailed as the Ultra preset, with sparser environments and fewer on-screen units, but it will give you a big boost in performance. I do just recommend sticking with TAA instead of FXAA as it cuts out a lot of the shimmer and instability, just for a small performance penalty."
 
Last edited:

jeanlain

macrumors 68020
Mar 14, 2009
2,461
955
Sure, the game performs poorly on the PC, but we're talking about Mac vs Windows performance. According to Leman, the game runs much more smoothly on Windows with comparable hardware.
 

Homy

macrumors 68030
Jan 14, 2006
2,510
2,461
Sweden
Sure, the game performs poorly on the PC, but we're talking about Mac vs Windows performance. According to Leman, the game runs much more smoothly on Windows with comparable hardware.

Yes, it's a combination of bad PC code and a Rosetta port, but if we look at it differently it becomes clear that the results are not as bad as they look. I think it's better to compare the results by percentage instead of trying to find comparable hardware. One way is to compare how demanding WH 3 is relative to another game on the same hardware. Fortunately, Shadow of the Tomb Raider, another demanding game, is also in that video. Both games are tested at 1080p ultra settings.

SOTTR 1080p Highest preset
RTX 3090 - 219 fps
M1 Ultra 64c - 112 fps
M1 Max 32c - 80 fps

Total War: Warhammer 3 1080p Ultra preset
RTX 3090 - 138 fps
M1 Ultra 64c - 69 fps
M1 Max 32c - 38 fps

GPU performance comparison between TW: WH 3 and SOTTR
RTX 3090 - 138/219 63%
M1 Ultra 64c - 69/112 61.6%
M1 Max 32c - 38/80 47.5%

This shows that the 3090 and M1 Ultra lose almost the same amount of performance in WH 3 (roughly -38%), i.e. the game is about equally demanding for both GPUs compared to other games like SOTTR.

Another way to look at it is to compare the performance of M1 to 3090 in those two games.

SOTTR 1080p Highest preset
M1 Ultra 64c vs. RTX 3090 - 112/219 51.1%
M1 Max 32c vs. RTX 3090 - 80/219 36.5%

Total War: Warhammer 3 1080p Ultra preset
M1 Ultra 64c vs. RTX 3090 - 69/138 50%
M1 Max 32c vs. RTX 3090 - 38/138 27.5%

This shows that M1 Ultra and Max perform almost equally well in both games relative to the 3090. Even though their frame rates appear much lower than the 3090's in WH 3, their relative performance is the same in both games, i.e. roughly 30-50% of the 3090's performance. It's the game's source code, bad optimization, and the non-native Rosetta port that cause the low frame rates on Macs. It just becomes more obvious and unacceptable when the frame rate drops to 38 on an M1 Max, and less worth complaining about when you have 80 fps in SOTTR.
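For anyone who wants to double-check the arithmetic, here's a small Python sketch that reproduces the percentages above from the quoted frame rates (the numbers themselves are taken as posted; I'm only redoing the division):

```python
# Frame rates as quoted from the LTT video (1080p, highest/ultra presets).
fps = {
    "SOTTR": {"RTX 3090": 219, "M1 Ultra 64c": 112, "M1 Max 32c": 80},
    "WH3":   {"RTX 3090": 138, "M1 Ultra 64c": 69,  "M1 Max 32c": 38},
}

def pct(part, whole):
    """Express part as a percentage of whole, one decimal place."""
    return round(part / whole * 100, 1)

# How demanding is WH3 relative to SOTTR on the same GPU?
for gpu in fps["SOTTR"]:
    print(f"{gpu}: {pct(fps['WH3'][gpu], fps['SOTTR'][gpu])}% of its SOTTR frame rate")
# RTX 3090: 63.0%, M1 Ultra 64c: 61.6%, M1 Max 32c: 47.5%

# How close is each M1 to the 3090 within each game?
for game in fps:
    for gpu in ("M1 Ultra 64c", "M1 Max 32c"):
        print(f"{game} {gpu}: {pct(fps[game][gpu], fps[game]['RTX 3090'])}% of the 3090")
# SOTTR: 51.1% / 36.5%; WH3: 50.0% / 27.5%
```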
 
  • Like
Reactions: Irishman

mi7chy

macrumors G4
Oct 24, 2014
10,625
11,296
SOTTR 1080p Highest preset
RTX 3090 - 219 fps
M1 Ultra 64c - 112 fps
M1 Max 32c - 80 fps

That's not a valid comparison since the 3090 is more for specialized apps that benefit from huge amounts of high-speed GDDR6X VRAM, like machine learning. Running a game on it at 1080p is a waste. Even an AMD 6800 is overkill, since it gets 207 fps @ 1080p highest with total system power from the wall at ~300 W.

The M1 Ultra 64GPU is below the 6700 XT and probably closer to the 6600 XT, with total system power in the low 200 W range.

6700xt benchmark

6600xt non-benchmark
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
Yes, it's a combination of bad PC code and a Rosetta port, but if we look at it differently it becomes clear that the results are not as bad as they look. I think it's better to compare the results by percentage instead of trying to find comparable hardware. One way is to compare how demanding WH 3 is relative to another game on the same hardware. Fortunately, Shadow of the Tomb Raider, another demanding game, is also in that video. Both games are tested at 1080p ultra settings.

SOTTR 1080p Highest preset
RTX 3090 - 219 fps
M1 Ultra 64c - 112 fps
M1 Max 32c - 80 fps

Total War: Warhammer 3 1080p Ultra preset
RTX 3090 - 138 fps
M1 Ultra 64c - 69 fps
M1 Max 32c - 38 fps

GPU performance comparison between TW: WH 3 and SOTTR
RTX 3090 - 138/219 63%
M1 Ultra 64c - 69/112 61.6%
M1 Max 32c - 38/80 47.5%

This shows that the 3090 and M1 Ultra lose almost the same amount of performance in WH 3 (roughly -38%), i.e. the game is about equally demanding for both GPUs compared to other games like SOTTR.

Another way to look at it is to compare the performance of M1 to 3090 in those two games.

SOTTR 1080p Highest preset
M1 Ultra 64c vs. RTX 3090 - 112/219 51.1%
M1 Max 32c vs. RTX 3090 - 80/219 36.5%

Total War: Warhammer 3 1080p Ultra preset
M1 Ultra 64c vs. RTX 3090 - 69/138 50%
M1 Max 32c vs. RTX 3090 - 38/138 27.5%

This shows that M1 Ultra and Max perform almost equally well in both games relative to the 3090. Even though their frame rates appear much lower than the 3090's in WH 3, their relative performance is the same in both games, i.e. roughly 30-50% of the 3090's performance. It's the game's source code, bad optimization, and the non-native Rosetta port that cause the low frame rates on Macs. It just becomes more obvious and unacceptable when the frame rate drops to 38 on an M1 Max, and less worth complaining about when you have 80 fps in SOTTR.
At 1080p a 3090 hits CPU limits quite quickly.
SOTTR at 4K on a 3090 gets nearly the same frame rate as the M1 Ultra 64c at 1080p. Now that I am looking at some results (all from TechPowerUp), it looks like the situation is close to the same at 4K for TW:W3 as well.
 
  • Like
Reactions: Irishman

orionquest

Suspended
Mar 16, 2022
871
791
The Great White North
Not sure how these comparisons show anything but the 3090 having almost double the performance of the Ultra. To me that doesn't look good for the Ultra. Also, a game like WH III just came out, no? So in a couple of patches and driver releases they will probably improve things on the PC side; will they on the Ultra as well? Either way, that performance gap is huge!
 

JordanNZ

macrumors 6502a
Apr 29, 2004
779
290
Auckland, New Zealand
I have nothing against Feral, they do good work and have good customer support, but my experience with their games has been hit and miss. Not really their fault, they have to work with crappy upstream code and ridiculous time constraints, but the experience for the end customer is not optimal.

Warhammer 3 port in particular is disappointing. I barely got 40fps on high full HD settings with my M1 Max, which is considerably less than Windows configs with comparable GPUs are getting.

I will point out that with Warhammer 3, they're having to emulate geometry shaders (with compute) while also combining that with tessellation. It's kind of amazing they got it to run as well as it does.
 
  • Like
Reactions: Irishman and Homy

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
I will point out that with Warhammer 3, they're having to emulate geometry shaders (with compute) while also combining that with tessellation. It's kind of amazing they got it to run as well as it does.
With how ancient DX11 is you would think the macOS version would outperform it.
 

JordanNZ

macrumors 6502a
Apr 29, 2004
779
290
Auckland, New Zealand
With how ancient DX11 is you would think the macOS version would outperform it.

It's emulating the legacy functions that slows things down. Feral don't rewrite the engine in the games they port; they have their own translation tools.

Sometimes this works out well though. For instance Tomb Raider (2013) runs faster on macOS than the Windows version does on AMD cards.
 

mi7chy

macrumors G4
Oct 24, 2014
10,625
11,296
For instance Tomb Raider (2013) runs faster on macOS than the Windows version does on AMD cards.

Proof? macOS gaming graphics APIs have always run slower and jankier than Windows APIs.

RX6800 @ 1080p default normal on Windows 10
[Attached screenshot: Tomb Raider benchmark on Windows]
 
  • Haha
Reactions: Romain_H

JordanNZ

macrumors 6502a
Apr 29, 2004
779
290
Auckland, New Zealand
Proof? macOS gaming graphics APIs have always run slower and jankier than Windows APIs.

RX6800 @ 1080p default normal on Windows 10
[attached benchmark screenshot]
That's a myth, at least since Metal came along.

I’ll get some benches on my 16” MacBook Pro (with 5500) once I get a chance.

Most of the performance issues in macOS games come from translating DX to Metal. Back in the OpenGL days this was far worse. DX11 performance on AMD cards isn't the best…

Developers that target Metal directly get fantastic results with it that are absolutely on par with Windows. Divinity: Original Sin 2 and Baldur's Gate 3 are a couple of examples, as are Tomb Raider 2013 (with the Metal backend) and Arkham City (Metal backend). 'The Witness' was an early Metal example that was faster on Intel graphics on macOS, and World of Warcraft was on par, but has had a performance regression since 9.1… The API isn't the problem.
 

jeanlain

macrumors 68020
Mar 14, 2009
2,461
955
Proof? MacOS gaming graphics API have always run slower and more janky than Windows APIs.
That's a ridiculous thing to say if it's based on games that are first written for DX, then ported to the Mac using automatic translation tools.
DX drivers may even have game-specific optimisations that are not implemented in the Metal drivers. That doesn't mean Metal in general runs slower.
 
Last edited:

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
That's a ridiculous thing to say if it's based on games that are first written for DX, then ported to the Mac using automatic translation tools.
DX drivers may even have game-specific optimisations that are not implemented in the Metal drivers. That doesn't mean Metal in general runs slower.
Is there such a thing as game-specific Metal driver optimizations? I didn't think it was possible to do such a thing on a Mac.
 

Homy

macrumors 68030
Jan 14, 2006
2,510
2,461
Sweden
Proof? macOS gaming graphics APIs have always run slower and jankier than Windows APIs.

RX6800 @ 1080p default normal on Windows 10
[attached benchmark screenshot]

It's fair to ask for "proof", but then you should also provide a source for your "proof" rather than just a screenshot. It's not about showing the fastest result you can find but finding a benchmark running on the same hardware in both Windows and macOS. It's not easy to find "proof" for an old game like Tomb Raider, but I found benchmarks for Shadow of the Tomb Raider running in macOS and Boot Camp on a Mac Pro.

The interesting result is the one at 4K, where it's not CPU bound, just like you and others have been saying. While macOS is not faster here, the numbers are practically the same in both Windows and macOS.

[Attached screenshot: SOTTR benchmark, macOS vs. Boot Camp]
 
  • Like
Reactions: Irishman

Huntn

macrumors Penryn
May 5, 2008
24,003
27,086
The Misty Mountains
The challenge with macOS is finding native games to play. If you want basically a free hand and the widest selection of games, it's a PC or a console.
 
  • Like
Reactions: Irishman

Homy

macrumors 68030
Jan 14, 2006
2,510
2,461
Sweden
That's not a valid comparison since the 3090 is more for specialized apps that benefit from huge amounts of high-speed GDDR6X VRAM, like machine learning. Running a game on it at 1080p is a waste. Even an AMD 6800 is overkill, since it gets 207 fps @ 1080p highest with total system power from the wall at ~300 W.

The M1 Ultra 64GPU is below the 6700 XT and probably closer to the 6600 XT, with total system power in the low 200 W range.

6700xt benchmark

6600xt non-benchmark

At 1080p a 3090 hits CPU limits quite quickly.
SOTTR at 4K on a 3090 gets nearly the same frame rate as the M1 Ultra 64c at 1080p. Now that I am looking at some results (all from TechPowerUp), it looks like the situation is close to the same at 4K for TW:W3 as well.

Not sure how these comparisons show anything but the 3090 having almost double the performance of the Ultra. To me that doesn't look good for the Ultra. Also, a game like WH III just came out, no? So in a couple of patches and driver releases they will probably improve things on the PC side; will they on the Ultra as well? Either way, that performance gap is huge!

Then you should inform Andrew and LTT that their test is not valid. Of course it's a valid comparison at those settings. My discussion wasn't about how well the 3090 can perform in certain apps that fully take advantage of its features. In that case you could also say "That's not a valid comparison since M1 Ultra is more for native ARM64 apps that benefit from huge amounts of high-speed unified VRAM and Apple Silicon, not unoptimized Rosetta games like SOTTR made for dedicated AMD GPUs instead of Apple iGPUs."

I was discussing the results in the LTT video regarding WH 3 and even tried to explain what my comparisons showed. It would be nice if LTT had tested all the games at 1440p or 2160p too like they did with WoW but they didn’t and those results at 1080p are what I had to compare with.

Again my point was that M1 Max/Ultra have 30-50% of the 3090's performance in both games despite the lower frame rates in WH 3 compared to SOTTR. When jeanlain wrote "I had hoped that Apple GPUs would perform well in the latest Feral game (Total War: Warhammer 3, which requires Apple Silicon), but alas, they don't" it sounded as if M1 performs much worse than it actually does. M1 Ultra still has half the performance of the 3090, but the more demanding and unoptimized a game is, the more the frame rates will obviously drop. When the almighty 3090 goes from 219 fps in SOTTR to only 126 (-42.5%) in WH 3 at 1080p, it's not surprising that M1 Ultra goes from 112 to 63 fps (-43.8%).

It's all about optimization, not M1, because the video also includes WoW:

WoW 2160p
RTX 3090 - 131 fps
M1 Ultra 64c - 115 fps
M1 Max 32c - 52 fps

M1 Ultra 64c vs. RTX 3090 - 115/131 87.8%
M1 Max 32c vs. RTX 3090 - 52/131 39.7%

With game optimization M1 Ultra’s performance increases from 50% to 88% of 3090’s performance and that is at 4K.
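The WoW ratios can be checked the same way; a quick Python sketch, with the frame rates as quoted from the video:

```python
# WoW at 2160p, frame rates as quoted from the video.
wow = {"RTX 3090": 131, "M1 Ultra 64c": 115, "M1 Max 32c": 52}

def pct(part, whole):
    """Express part as a percentage of whole, one decimal place."""
    return round(part / whole * 100, 1)

print(pct(wow["M1 Ultra 64c"], wow["RTX 3090"]))  # 87.8
print(pct(wow["M1 Max 32c"], wow["RTX 3090"]))    # 39.7
```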

I also wonder how LTT got such a high frame rate with the 3090 at 1080p in SOTTR. The Verge got 142 fps instead of 219. Their PC had an i9-10900 and 64 GB RAM; LTT had an i9-12900K and 32 GB RAM. As far as I can see there's not much difference between those two CPUs in games with a 3090, maybe 10-20 fps in some games, but certainly not a 77 fps difference.

SOTTR 1080p
RTX 3090 - 142 fps
M1 Ultra 64c - 108 fps
M1 Max 32c - 86 fps
M1 Ultra 64c vs. RTX 3090 - 108/142 76.1%
M1 Max 32c vs. RTX 3090 - 86/142 60.6%

SOTTR 1440p
RTX 3090 - 114 fps
M1 Ultra 64c - 96 fps
M1 Max 32c - 62 fps
M1 Ultra 64c vs. RTX 3090 - 96/114 84.2%
M1 Max 32c vs. RTX 3090 - 62/114 54.4%


SOTTR 2160p
RTX 3090 - ? fps
M1 Ultra 64c - 60 fps
M1 Max 32c - 33 fps

So in their test M1 Ultra has 76-84% of the 3090's performance in SOTTR (a Rosetta game) instead of 51% as in LTT's test. At 4K ultra the 3090 seems to get 90-95 fps (GPUCHECK, Notebookcheck, GPU-monkey). Regarding WH 3, Feral has already released patch 1.0.1 for Mac/Linux, which matches 1.2 for Windows.

It would also be interesting to see what performance 3090 would get with only 200W like M1 Ultra. Remember that 200W is M1 Ultra's total system power draw, not GPU's power usage.
 
Last edited:
  • Like
Reactions: Irishman

mi7chy

macrumors G4
Oct 24, 2014
10,625
11,296
While macOS is not faster here, the numbers are practically the same in both Windows and macOS.

[attached benchmark screenshot]

Practically the same? Is that a joke? 107 fps on macOS vs. 142 fps on Windows at 1080p ultra is huge. About every 20 fps is a GPU model jump, so you're getting two GPU upgrades just by switching OS. Not only that, but the Mac system is holding back those GPUs. Here's a two-tier-lower 6800 GPU at 201 fps 1080p ultra on a PC system, faster than the 6900 XT on a Mac system at 107 fps 1080p ultra. At 4K it's GPU limited.

[Attached screenshot: RX 6800 SOTTR benchmark on Windows]
 
Last edited:
  • Haha
Reactions: JimmyjamesEU

Homy

macrumors 68030
Jan 14, 2006
2,510
2,461
Sweden
That's not a valid comparison since 3090 is more for specialized apps that benefit from huge amounts of high speed GDDR6X VRAM like machine learning. Running a game on it at 1080p is a waste.

1080P for a 3090 hit CPU limits quite quickly.

Practically the same? Is that a joke? 107 fps on macOS vs. 142 fps on Windows at 1080p ultra is huge. About every 20 fps is a GPU model jump, so you're getting two GPU upgrades just by switching OS. Not only that, but the Mac system is holding back those GPUs. Here's a two-tier-lower 6800 GPU at 201 fps 1080p ultra on a PC system, faster than the 6900 XT on a Mac system at 107 fps 1080p ultra. At 4K it's GPU limited.

[attached benchmark screenshot]

Yes, basically the same at 4K. Did you completely ignore that part, which was my point?

So first you and others say running games like SOTTR at 1080p on an RTX 3090 or other powerful cards is not a valid test (it's a waste, and CPU bound) and that I should look at results at 4K. Then, when I show that you get the same result in macOS and Windows at 4K in the game on a 6900 XT, you say the opposite: that I should look at 1080p, and that at 4K the game is GPU bound?

Strange logic! You really should make up your mind. Here you can see that the 6900 XT is not limiting the game at 4K: the game uses 88% of the GPU at 4K ultra settings.

[Attached screenshot: GPU utilization at 4K ultra]
 
Last edited: