
Colstan

macrumors 6502
Jul 30, 2020
330
711
Hardware Unboxed does a review of M2. Get the tissue box ready.
This is amazing, I had no idea how much better PCs are than Macs. This is quite the Revelation. Do you have a pamphlet, testimonial, or perchance, a book of testaments, to help me spread the good news?

We on the other hand enjoy gameplay with a good story, characters, and cool action. I mean, who cares about jagged edges on a remote rooftop you can't even see unless you lick the screen, when you're busy kicking ass as a superhero? Imagine Batman going around Gotham suddenly stopping in the middle of the action to complain about anti-aliasing instead of saving the city. Lmao 😄
Hush now, you're ruining the narrative. If folks are enjoying gaming on the Mac then benchmarks become meaningless, and obsessively posting them on this forum becomes a Sisyphean waste of time.
 
  • Haha
Reactions: Homy

Nugat Trailers

macrumors 6502
Dec 23, 2021
297
576
I'm of the somewhat strange, nay, perverse opinion that a tower waaaay off in the distance being a bit jagged means... zip, unless the game's title is 'LET'S LOOK AT THE TOWER WAAAAY OFF IN THE DISTANCE SIMULATOR 2022'.

And that if you need to have tissues on standby to watch people reviewing Macs or PCs, you're probably too obsessed.

Benchmarks are useful, but they're not the be-all, end-all. All a lot of people care about is that a game has a decent framerate and looks relatively pretty; whatever company makes the magic box that shows said pretty images doesn't really matter, as long as it looks good enough.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
It used to trail the Windows version quite badly in performance. When Feral updated it with the Metal backend, it leapt past the Windows version. At least on AMD hardware.
To be fair, on Windows it is a DX11 game and historically AMD hardware (or at least the driver optimizations) hasn't been that great with that API.
 

Irishman

macrumors 68040
Nov 2, 2006
3,449
859
You see, that’s the difference between real gamers with OCD and us Mac noobs. They spend thousands of dollars on their gaming PC just to entertain themselves by zooming in with an electron microscope on the furthest jagged edges of Gotham City. We on the other hand enjoy gameplay with a good story, characters, and cool action. I mean, who cares about jagged edges on a remote rooftop you can't even see unless you lick the screen, when you're busy kicking ass as a superhero? Imagine Batman going around Gotham suddenly stopping in the middle of the action to complain about anti-aliasing instead of saving the city. Lmao 😄

You remember dedicated PhysX cards? After falling down a rabbit hole of games that had hardware-accelerated PhysX as a feature, I finally reminded myself that Batman: Arkham City was one of the last titles that offered it. The effect certainly looked cool, but hardly realistic to my eye.

Imagine just how distracted the Caped Crusader would have been if he had been able to see those PhysX effects?!?

If I had to guess why PhysX failed, I’d put it at the feet of Nvidia for not making it hardware agnostic so that, by using open standards, any and all GPU vendors could support it. Just like why and how TressFX on AMD cards failed.
 
  • Like
Reactions: Homy

Irishman

macrumors 68040
Nov 2, 2006
3,449
859
Hardware Unboxed does a review of M2. Get the tissue box ready.


LOL!! Did you link to the right video? Because this one doesn’t say anything that would make me need to reach for a tissue.

You did watch it didn’t you?
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
You remember dedicated PhysX cards? After falling down a rabbit hole of games that had hardware-accelerated PhysX as a feature, I finally reminded myself that Batman: Arkham City was one of the last titles that offered it. The effect certainly looked cool, but hardly realistic to my eye.

Imagine just how distracted the Caped Crusader would have been if he had been able to see those PhysX effects?!?

If I had to guess why PhysX failed, I’d put it at the feet of Nvidia for not making it hardware agnostic so that, by using open standards, any and all GPU vendors could support it. Just like why and how TressFX on AMD cards failed.
Uh TressFX is open source. It is a competitor to Hairworks.
 
  • Like
Reactions: Irishman

Irishman

macrumors 68040
Nov 2, 2006
3,449
859
Uh TressFX is open source. It is a competitor to Hairworks.

Thanks for the correction. I’m not gonna go down a TressFX or a Hairworks rabbit hole, so I’ll ask you:

Is TressFX still competing against Hairworks in new games now? And, do they require specific hardware to work correctly?

And lastly, is the market growing for either of them?
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
LOL!! Did you link to the right video? Because this one doesn’t say anything that would make me need to reach for a tissue.

You did watch it didn’t you?
It has more ALUs (moar TFLOPS!!), and thus is a stronger GPU in a compute-heavy game like SOTTR. The CPU comparison doesn't really matter much here (at least for the AAA games conversation, IMO).
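The "more ALUs, moar TFLOPS" relationship is simple enough to sketch. The figures below are commonly cited approximations for the M1 Ultra and are assumptions on my part, not numbers taken from this thread:

```python
def fp32_tflops(alus: int, clock_ghz: float) -> float:
    """Theoretical peak FP32 throughput: each ALU can retire one fused
    multiply-add (counted as 2 floating-point ops) per clock cycle."""
    return alus * 2 * clock_ghz / 1000.0

# Assumed specs (approximate, not from this thread): a 64-core
# M1 Ultra GPU has 8192 ALUs clocked at roughly 1.3 GHz.
print(round(fp32_tflops(8192, 1.3), 1))  # ~21.3 theoretical TFLOPS
```

Peak TFLOPS is only an upper bound, of course; whether a compute-heavy game like SOTTR approaches it depends on occupancy and memory bandwidth.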
 
Last edited:
  • Like
Reactions: Irishman

Colstan

macrumors 6502
Jul 30, 2020
330
711
LOL!! Did you link to the right video?
On top of that, Vulcan just froze over, because Linus Tech Tips recently put out a video validating the entire Apple Silicon strategy:


Anthony systematically lays out why Apple's approach is going to supplant the entire PC model going forward. The days of hot, energy guzzling PCs are inevitably going to come to an end, as they hit a brick wall known as the laws of physics. Apple's vertical integration strategy is the future; it's painful for the PC crowd to endure such notions, as can be witnessed in the comments section.

Anthony points out that a Mac Studio with an M1 Ultra gets almost the same performance as a high-end PC. However, the entire Mac Studio has the same power consumption as the 12900K alone, before accounting for the other PC components. If LTT, the temple at which PC users worship, is saying such things, then I'm not sure what more needs to be said about Apple's forward-thinking technological superiority.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
Thanks for the correction. I’m not gonna go down a TressFX or a Hairworks rabbit hole, so I’ll ask you:

Is TressFX still competing against Hairworks in new games now? And, do they require specific hardware to work correctly?

And lastly, is the market growing for either of them?
To the second: no, it doesn't require specific hardware.
For the first and third questions: any game with "realistic" hair simulation that doesn't specifically call out Hairworks is likely using some form of TressFX, since it has been open source for a while. https://gpuopen.com/tressfx/
 
  • Like
Reactions: Irishman

mi7chy

macrumors G4
Oct 24, 2014
10,623
11,296
On top of that, Vulcan just froze over, because Linus Tech Tips recently put out a video validating the entire Apple Silicon strategy:


Anthony systematically lays out why Apple's approach is going to supplant the entire PC model going forward. The days of hot, energy guzzling PCs are inevitably going to come to an end, as they hit a brick wall known as the laws of physics. Apple's vertical integration strategy is the future; it's painful for the PC crowd to endure such notions, as can be witnessed in the comments section.

Anthony points out that a Mac Studio with an M1 Ultra gets almost the same performance as a high-end PC. However, the entire Mac Studio has the same power consumption as the 12900K alone, before accounting for the other PC components. If LTT, the temple at which PC users worship, is saying such things, then I'm not sure what more needs to be said about Apple's forward-thinking technological superiority.

A disposable $5K Mac Studio with no user-upgradable CPU, storage, or GPU and no software, that's twice as slow as a mobile GPU from a <=$1200 laptop and slower than a $520 CPU, which combined use less power? 🤣😂🙃

GPU Blender BMW
16.39s - Nvidia 3060 70W mobile (GPU OptiX Blender 3.0)
34s - M1 Ultra 20CPU 64GPU (GPU Metal Blender 3.1)

CPU Blender BMW
1m35.21s - AMD 5950X (CPU Blender 3.0)
1m43s - M1 Ultra 20CPU 64GPU (CPU Blender 3.1)
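Taking the posted times at face value, the gaps behind the "twice as slow" jab work out like this (a quick sanity check only; note the Blender versions differ between runs):

```python
# The times posted above, converted to seconds.
gpu_3060_mobile = 16.39   # Nvidia 3060 70W mobile, Blender 3.0 OptiX
gpu_m1_ultra = 34.0       # M1 Ultra 64-core GPU, Blender 3.1 Metal
cpu_5950x = 95.21         # AMD 5950X, Blender 3.0 CPU (1m35.21s)
cpu_m1_ultra = 103.0      # M1 Ultra 20-core CPU, Blender 3.1 (1m43s)

print(round(gpu_m1_ultra / gpu_3060_mobile, 2))  # 2.07 -- the GPU gap
print(round(cpu_m1_ultra / cpu_5950x, 2))        # 1.08 -- the CPU gap is far smaller
```

So the "twice as slow" claim holds for the GPU render, while the CPU result is within about 8%.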
 

Colstan

macrumors 6502
Jul 30, 2020
330
711
A disposable $5K Mac Studio with no user-upgradable CPU, storage, or GPU and no software, that's twice as slow as a mobile GPU from a <=$1200 laptop and slower than a $520 CPU, which combined use less power? 🤣😂🙃
You didn't address anything that Anthony said in the video. The focus was on the future of the industry, not a handful of benchmarks that you cherry-pick, while ignoring the actual content. Feel free to demonstrate how Anthony, a technical professional who has a demonstrably non-partisan viewpoint, is wrong.
 

mi7chy

macrumors G4
Oct 24, 2014
10,623
11,296
That's just his opinion, not gospel; plus LTT is trying to appease both sides of the fence. I'll stick to what's best for my workloads.
 

Colstan

macrumors 6502
Jul 30, 2020
330
711
That's just his opinion, not gospel; plus LTT is trying to appease both sides of the fence. I'll stick to what's best for my workloads.
I didn't ask you what is best for your workloads or what LTT's biases are. Those aren't germane to this discussion. I asked you why Anthony is wrong about the industry heading toward Apple's integrated approach, which he systematically details. That's twice now that you haven't addressed his premise, and you clearly watched the video.
 

Homy

macrumors 68030
Jan 14, 2006
2,510
2,460
Sweden
LOL!! Did you link to the right video? Because this one doesn’t say anything that would make me need to reach for a tissue.

You did watch it didn’t you?

He should practice what he preaches and learn some "fundamentals". Even in his favorite topic, gaming, the M2 beats the Radeon 680M in a non-native Rosetta game. "Quite impressive given how powerful the RDNA2 graphics is inside the 6800U". 😄

On top of that, Vulcan just froze over, because Linus Tech Tips recently put out a video validating the entire Apple Silicon strategy:


Anthony systematically lays out why Apple's approach is going to supplant the entire PC model going forward. The days of hot, energy guzzling PCs are inevitably going to come to an end, as they hit a brick wall known as the laws of physics. Apple's vertical integration strategy is the future; it's painful for the PC crowd to endure such notions, as can be witnessed in the comments section.

Anthony points out that a Mac Studio with an M1 Ultra gets almost the same performance as a high-end PC. However, the entire Mac Studio has the same power consumption as the 12900K alone, before accounting for the other PC components. If LTT, the temple at which PC users worship, is saying such things, then I'm not sure what more needs to be said about Apple's forward-thinking technological superiority.

The future is ARM and Apple understood it ahead of everyone after using it for over a decade.
 

mi7chy

macrumors G4
Oct 24, 2014
10,623
11,296
Apple: Future is 6502. DEAD
Apple: Future is 68K. DEAD
Apple: Future is PowerPC. DEAD
Apple: Future is AS. See a pattern?

x86-64 was supposed to be dead for the last several decades and here we are... 😂

 

Colstan

macrumors 6502
Jul 30, 2020
330
711
The future is ARM and Apple understood it ahead of everyone after using it for over a decade.
That's fundamentally why our friend won't address Anthony's logical, well-reasoned argument that the entire PC industry needs to move to a power-efficient ISA and modern software stack. Windows is an archaic operating system built upon a crumbling foundation from the 1980s; x86 is a crufty relic derived from 1970s engineering. They're both going to collapse under their own weight unless something changes. Continually increasing wattage to make up for inherent inefficiencies isn't sustainable. The PC guys try with ever-increasing power usage, immutable laws of physics be damned, but it only works for so long.

To even acknowledge Anthony's premise would be an admission that building a PC is going to inevitably become a pastime, whether the hardcore PC crowd likes it or not. It's easier to bravely hide behind a keyboard and toss out random laughing-face emojis. Which is itself amusing, because they're simply pumping up a user's reputation by increasing reaction scores. So, thanks for the smilies, keep 'em coming!
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
That's fundamentally why our friend won't address Anthony's logical, well-reasoned argument that the entire PC industry needs to move to a power-efficient ISA and modern software stack. Windows is an archaic operating system built upon a crumbling foundation from the 1980s; x86 is a crufty relic derived from 1970s engineering. They're both going to collapse under their own weight unless something changes. Continually increasing wattage to make up for inherent inefficiencies isn't sustainable. The PC guys try with ever-increasing power usage, immutable laws of physics be damned, but it only works for so long.

To even acknowledge Anthony's premise would be an admission that building a PC is going to inevitably become a pastime, whether the hardcore PC crowd likes it or not. It's easier to bravely hide behind a keyboard and toss out random laughing-face emojis. Which is itself amusing, because they're simply pumping up a user's reputation by increasing reaction scores. So, thanks for the smilies, keep 'em coming!
In this instance are we saying PC == x86-64 && Windows OS?
 

Homy

macrumors 68030
Jan 14, 2006
2,510
2,460
Sweden
That's fundamentally why our friend won't address Anthony's logical, well-reasoned argument that the entire PC industry needs to move to a power-efficient ISA and modern software stack. Windows is an archaic operating system built upon a crumbling foundation from the 1980s; x86 is a crufty relic derived from 1970s engineering. They're both going to collapse under their own weight unless something changes. Continually increasing wattage to make up for inherent inefficiencies isn't sustainable. The PC guys try with ever-increasing power usage, immutable laws of physics be damned, but it only works for so long.

To even acknowledge Anthony's premise would be an admission that building a PC is going to inevitably become a pastime, whether the hardcore PC crowd likes it or not. It's easier to bravely hide behind a keyboard and toss out random laughing-face emojis. Which is itself amusing, because they're simply pumping up a user's reputation by increasing reaction scores. So, thanks for the smilies, keep 'em coming!

He sees a pattern but misses the meaning. The future is about evolving, especially in the tech world. Apple has always found a better path forward, with better solutions for its hardware and software, and adapted its strategy. It hasn't always been easy for its customers to abandon the old technologies, but that's the price of evolving. That's what those changes in CPU architecture mean: finding a better one. A very natural strategy, even in ordinary life. Meanwhile Intel has been treading water for years. As usual he also throws in a random screenshot, because he thinks gaming is what his and everyone else's world is about, and that's what companies build their future strategy upon.
 

Colstan

macrumors 6502
Jul 30, 2020
330
711
In this instance are we saying PC == x86-64 && Windows OS?
Yes. At least with alternative operating systems like Linux and BSD, some of the cruft can be eliminated on the software side. Even then, there's still that old 1970s ISA sitting deep within the bowels of every modern Intel and AMD chip. For those who are interested, the engineer who wrote the draft for x86-64 and worked on Opteron, Cliff Maier, explains why x86 sucks compared to Arm.

A sample quote:
So what you have with x86 is an instruction set architecture that was fundamentally designed to optimize for problems we don’t have anymore.
I'm sure many folks here are familiar with him, but for those who aren't, it's worth a read.
 

mi7chy

macrumors G4
Oct 24, 2014
10,623
11,296
Maier's tenure was in the pre-Ryzen era that almost bankrupted AMD. Of course he's going to have some bitterness towards Jim Keller and Ryzen's success. The 6000U, on an older node, is trading blows with AS without throwing compatibility with the largest software ecosystem out the window. The majority aren't ready to give that up, and/or they need to scale performance beyond the mid-range: to the high-end consumer 5950X, to workstation Threadripper, or to the top supercomputer Epyc that offers nearly triple the performance at less power than the ARM competitor.

https://www.top500.org/lists/top500/2022/06/
 

Colstan

macrumors 6502
Jul 30, 2020
330
711
Of course he's going to have some bitterness towards Jim Keller and Ryzen success.
Cliff has no bitterness toward Jim Keller or Ryzen. In fact, he says that Keller is a "smart guy" and defended him when I said something critical about Keller. He's also positive about his days at AMD and hasn't said anything bad about Ryzen. He isn't a fan of Intel, given his time at AMD, but that has nothing to do with his thoughts on x86 vs. Arm. It's about the technology, not the personalities.
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,126
2,706
Care to quote someone else sharing your opinion on their work? This should be easy, since Feral is apparently "known" for its crappy ports.
Sure, it's usually discussed after sessions at SIGGRAPH, GDC, GTC and the like. Last time it came up and I was there was after the GTC E32776 session, when someone asked what the partnership with Nvidia for the game AI meant for multiplatform titles.
"Titles that don't look good"? What about RoTR, Shadow of Mordor or Alien Isolation? These were quite good looking when they were released.
Not really.
Anyhow, given that no other house ports AAA games to the Mac, I'm not sure what your reference is.
There have been other companies porting games, and even today some companies hire teams to do in-house porting. And why exactly do we need a reference to compare to? This could be done by metrics, QA and other ways without having a reference.
If you don't care about honesty, then you should stop speaking from authority here. No one here can contradict you about game development except Leman (and he just did).
Either you have a bunch of typos in there, or maybe read again what I said? I don't care about what flat-earthers are trying to tell me either. Leman didn't contradict anything, he stated his opinion, while the gaming world clearly does things differently, even if that doesn't fit the hopes of Mac gaming fans. It's very easy to prove someone wrong: bring it up at the usual conferences, forward it to game developers, who will in turn communicate it to everyone working for and with them.
So first you keep saying that it's not cheap or profitable for devs, including yourself, to port AAA PC games to an additional platform like the Mac, and that's why there is no interest among devs and nobody even mentions the Mac at dev conferences or among the people you know. But now it's cheaper and profitable if devs build their PC games with additional platforms like the Mac in mind from the start?
Try not to put things out of perspective: people talk about Macs at conferences all the time; they also talk about games on Raspberry Pis and even Arduinos. From a purely technical point of view. No one is, however, seriously talking about real development effort, which is costly to begin with. Porting can be cheap or very expensive depending on how it's done and to what degree people go.
Why don’t all those devs saying it’s not worth it just build their games with the Mac in mind from the start then to make it profitable like the popular AAA franchises I mentioned?
They don't build their games with Mac in mind. They build their engines with multiplatform in mind with no specific optimization for any platform.
You and other devs here have also said several times that porting even simple Unity games to Mac is not just like flipping a switch. It takes lots of debugging, optimizing and customer support and it’s not worth it. Now it’s suddenly a matter of flipping a switch when it comes to AAA franchises like Tomb Raider, Deus Ex and Metro?
That depends on features used in the engine. How much it costs depends on featureset and how much QA studios actually do (see XCOM port). Flipping a switch in this context means rather cheap. Nothing is ever free.
And those games have simple engines and graphics? Shadow of the Tomb Raider, Deus Ex MKD and Metro Exodus? Really?
Really! I'm going to pick just one. How exactly does Tomb Raider look so good? The one thing that looks good is wet surface/skin. The rest is rather meh: look at the dry skin or surfaces, look at the hair and hair physics. Trees, bushes, etc. all look rather bad. What we get with Tomb Raider is what we get with many games: psychology. Draw attention to one thing that looks good and away from anything that doesn't. That's not really new and is done all the time; people went nuts when God of War 3 came out and said how good it looks. The only thing that looked really good was the lighting, with different colors and the powerup orbs flying around illuminating everything. The rest was rather poor. We've seen similar tricks used in games like Uncharted, because the PS3 was very difficult to program for and had severe shortcomings for specific features. But it's not even exclusive to games; it's used in movies all the time as well: Spielberg, J.J. Abrams, etc.
The devs must have made the decision for the first engine after careful consideration meaning they found the Mac’s small user base still profitable despite the huge costs of porting the engine and the AAA games.
No, as said above, they didn't make the decision for the Mac, but for multiplatform without any specific optimization.
That was when they were developing Shadow of the Tomb Raider.
Which was developed for DX11 first and then DX12 features were added later, not just for RTX.
In fact in an interview with Capcom’s lead programmer Tomofumi Ishida he explained how they had to build a brand new engine since MTFW was not useful anymore for Resident Evil 7.
You're sure trying to read a lot into things when there's nothing to read into. RE Engine is new just the way Windows 11 is new, macOS Ventura is new and UE5 is new and... in other words, it's not new, but still based on the same old engines with changed, abandoned and new features. Heck, you can run both engines side by side and see they're still doing the exact same thing for some things. Just like with Windows 10 vs 11, UE4 vs UE5, Monterey vs Ventura and so on.
Are these awards for Feral's ports or for the games in general? Because I am not sure how Feral would have any bearing on the awards given out. Do they collaborate with the IP owners and make changes to the games in a material way?
Awards are just that, awards. What they actually mean is usually not understood by people who never received one or are not involved in the industry. I've won my share of awards, including from Apple, and can say "design" doesn't always mean what the name suggests. Then you have the issue of nomination: sometimes you have to pick the lesser of the evils when only bad stuff is nominated. Then you might have financial interest in awards... I've won awards for work I've not been proud of, ashamed even. Yet it won an award because I knew people on the committee and they wanted to work with me, so they tried to do me a favour. Same for political reasons. We've seen the Academy Awards handing out awards to African Americans or female artists not because of performance, but as a statement for BLM, MeToo or similar social movements.

So what does an award for Texturing actually mean? That someone sat down with Photoshop and made some nice textures. Doesn't say anything about the game. One could take a photo and use it as a texture... there's the award for most photorealistic texture.

What does an award in Game Design actually mean? Could be level design alone? Could it be the setting (in Earth, moon, etc.)? These awards need to be understood to interpret correctly.

Besides, were there any other titles in these categories? These awards sound like "just for showing up" type awards.
Most awards are like that.
I don’t know whether to blame Feral or Firaxis, but XCOM 2 ran terribly in my experience. Also, there was a graphical bug after some update where PlayStation buttons would show in menus.
You must be mistaken, we've been told they're doing a great job with this. So something like this, which is a QA issue, surely can't happen. ;) But yes, I played XCOM on the Mac... for a while.
That’s beside the fact that XCOM2 is just a bad game in general.
Well, subjective, but it's not as good as the original games in my opinion.
JordanNZ's pic is a compressed jpeg while the other one is a png. How do you think lossy compression works? Lol
Lossy compression surely doesn't introduce jaggies... it introduces blocks of a well-defined size, in this case 8x8, during the DCT and quantization stages. And while one can create jaggies from blocks, one can't always create blocks from jaggies.
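The DCT-plus-quantization pipeline described here can be sketched in a few lines. This is a toy illustration of the lossy step, not a full JPEG codec (real JPEG uses an 8x8 quantization matrix, zigzag ordering, and entropy coding on top):

```python
import math

N = 8  # JPEG processes the image in 8x8 pixel blocks

def dct2(block):
    """Forward 2-D DCT-II of one 8x8 block (direct O(N^4) form)."""
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        cu = math.sqrt(1 / N) if u == 0 else math.sqrt(2 / N)
        for v in range(N):
            cv = math.sqrt(1 / N) if v == 0 else math.sqrt(2 / N)
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[u][v] = cu * cv * s
    return out

def quantize(coeffs, q=32):
    """Uniform quantization -- the lossy step. Coefficients are snapped
    to a grid; discarding high-frequency detail per block is what
    produces the characteristic 8x8 block artifacts."""
    return [[round(c / q) * q for c in row] for row in coeffs]

# A flat 8x8 block: all of its energy lands in the DC coefficient.
flat = [[100.0] * N for _ in range(N)]
coeffs = dct2(flat)
print(round(coeffs[0][0], 3))  # DC term: 100 * 64 / 8 = 800.0
```

Because the transform and quantization are done per block, any error is confined to block boundaries, which is why heavy JPEG compression shows 8x8 blocking rather than stair-step jaggies.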


Moral of the story: this has become like talking to flat-earthers. Fans will always believe the next big thing is around the corner when there isn't much. And if it's so great, we'd all be enjoying those Ubisoft, EA, Blizzard, Rockstar and other games on our Macs. And yet the crickets are chirping. In the meantime, I'll fire up an old Mac and replay Star Trek: Judgment Rites, which is something I've been wanting to do since the 90s. So I guess that's my share of AAA gaming I'll get for now.


EDIT: I've been busy with a really small part of a project the past two weeks, involving Apple hardware. That's going to spark a new thread come winter (or maybe spring, depending on progress which is out of my hands). So we can have the same discussion all over again after the keynote. ;)
 
Last edited: