
EntropyQ3

macrumors 6502a
Mar 20, 2009
718
824
Performance per watt? Yeah, it is good. Sadly, Nvidia has more or less moved the goalposts these days, so folks aren't really talking much about straight raster anymore; everyone is on the RT + DLSS 3 bandwagon. So the actual performance improvement going from, say, a 3080 Laptop to a 4080 Laptop isn't that great when looking at raster at 1080p (where these newer high-end GPUs are CPU-limited at lower resolutions).
My bolding in the quote above (didn’t want to cut out the context).

The reason "everyone" is talking about new software features is that hardware capabilities are stagnating hard with the slowdown in lithographic performance improvement. The hardware vendors have to sell features and the tech sites that depend on advertising have to promote it. "Everyone" = "hardware commercial interests".
To support this Nvidia funds inclusion of their software features in high-profile games, but even now only 3-4 games out of my library of 250 in GoG and Steam support RT, and I don’t use it even there as I prefer to spend that performance potential elsewhere.

But what about the future? Well, we know the future of the underlying lithographic technology pretty well. The step from TSMC 5/4N to 3N to 2N is outlined, for instance, here. Note that the aggregate number of gates grows by only roughly 50% from 5N to 2N, and even if we account for GPUs having an unusually large proportion of logic gates, we still end up in the 70% ballpark. Also, 2N is scheduled to go into volume manufacturing in 2026, and if we’re lucky we might see it from Nvidia/AMD/... the year after. Since TSMC has talked openly about extending the cadence of its cutting-edge processes to three years, 2N will be it until the end of the decade. So those who depend on hardware sales have to hype features, because watching paint dry for the rest of the decade will generate neither clicks nor revenue.
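If you want to sanity-check those percentages, here is a back-of-the-envelope sketch (my illustration; the per-node scaling factors are rough public ballpark figures, not exact TSMC numbers):

```cpp
// Back-of-the-envelope check of the density math above (my illustration;
// the per-node factors are rough public figures, not exact TSMC data).
#include <cstdio>

int main() {
    // Assumed density scaling per node step:
    const double logic_n5_to_n3 = 1.6;   // logic density, N5 -> N3 (approx.)
    const double logic_n3_to_n2 = 1.15;  // logic density, N3 -> N2 (approx.)
    const double sram_n5_to_n2  = 1.1;   // SRAM barely scales on recent nodes

    // A GPU die mixes logic with SRAM (caches, register files);
    // assume a logic-heavy 70/30 split.
    const double logic_frac = 0.7, sram_frac = 0.3;

    const double logic_gain = logic_n5_to_n3 * logic_n3_to_n2;  // ~1.84x
    const double aggregate =
        logic_frac * logic_gain + sram_frac * sram_n5_to_n2;    // ~1.62x

    std::printf("pure-logic density gain N5->N2: %.2fx\n", logic_gain);
    std::printf("aggregate die density gain:     %.2fx\n", aggregate);
    // Lands around 1.6x, i.e. inside the 50-70% range quoted above.
}
```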

The lithography slowdown doesn’t mean much for the games industry in general, since it is overwhelmingly dominated by software revenue. Nor has the hype surrounding new features been terribly successful in driving sales: in last month's Steam hardware survey, Nvidia RTX 4xxx and AMD RX 7xxx graphics adapters together represented a measly 1% of the total. Eventually that percentage will grow, obviously, if that is all people are offered, but there is no sign that the public actually gives a damn. And apart from the splinter of us who look at PC hardware tech as entertainment in and of itself, why should they?
 

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,665
OBX
My bolding in the quote above (didn’t want to cut out the context).

The reason "everyone" is talking about new software features is that hardware capabilities are stagnating hard with the slowdown in lithographic performance improvement. The hardware vendors have to sell features and the tech sites that depend on advertising have to promote it. "Everyone" = "hardware commercial interests".
To support this Nvidia funds inclusion of their software features in high-profile games, but even now only 3-4 games out of my library of 250 in GoG and Steam support RT, and I don’t use it even there as I prefer to spend that performance potential elsewhere.

But what about the future? Well, we know the future of the underlying lithographic technology pretty well. The step from TSMC 5/4N to 3N to 2N is outlined for instance here. Note that the aggregate number of gates only grow by roughly 50% from 5N to 2N. And even if we account for GPUs having an unusually large proportion of logic gates we still end up in the 70% ballpark. Also 2N is scheduled to go into volume manufacturing in 2026, and if we’re lucky we might see that from Nvidia/AMD/... the year after. Since TSMC has talked openly about extending the cadence of their cutting edge processes to three years, 2N will be it until the end of the decade. So those that depend on hardware sales have to hype features, because watching paint dry for the rest of the decade will generate neither clicks nor advances in revenue.

Lithography slow down doesn’t mean much for the games industry in general since it is overwhelmingly dominated by software revenue. Nor has the hype surrounding new features been terribly successful in driving sales, in last months Steam hardware survey the total number of Nvidia RTX4xxx and AMD RX7xxx graphics adapters represented roughly a measly 1% of the total. Eventually that percentage will grow, obviously, if that is all people are offered but there is no sign that the public actually gives a damn. And apart from the splinter of us that look at PC hardware tech as entertainment in and of itself, why should they?
I am not saying I agree with the goalpost move. I am just pointing out that it has happened.

Features like RT are probably going to take another four years to really go mainstream, because folks are not upgrading their computers to take advantage of them, and consoles are leaving RT as a quality option that few actually choose to use. UE5 has a way around the hardware requirement with Lumen, but that implies folks will be okay with the performance hit Lumen brings too. Unity doesn't seem to have an equivalent feature (I don't think any other engine does, at first glance).
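For reference, here is a minimal sketch of how that Lumen toggle looks in a UE5 project's DefaultEngine.ini (assuming a stock UE5 setup; 1 selects Lumen for both methods):

```ini
; DefaultEngine.ini -- minimal sketch of the Lumen settings described above
[/Script/Engine.RendererSettings]
r.DynamicGlobalIlluminationMethod=1   ; 1 = Lumen global illumination
r.ReflectionMethod=1                  ; 1 = Lumen reflections
; 0 = software Lumen (no RT hardware required), 1 = hardware ray tracing path
r.Lumen.HardwareRayTracing=0
```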
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
I am not saying I agree with the goalpost move. I am just pointing out that it has happened.

Features like RT are probably going to take another four years to really go mainstream, because folks are not upgrading their computers to take advantage of them, and consoles are leaving RT as a quality option that few actually choose to use. UE5 has a way around the hardware requirement with Lumen, but that implies folks will be okay with the performance hit Lumen brings too. Unity doesn't seem to have an equivalent feature (I don't think any other engine does, at first glance).
There is path tracing now too, which tanks FPS even further.
 

Irishman

macrumors 68040
Nov 2, 2006
3,449
859
The M2 chip has a better iGPU than the M1.

Apple expects macOS users to replace the M1 with an M3, M4, or M5.

Just like Sony and Microsoft expect users to replace the PS5 and Xbox Series X with a PS6 and the next Xbox Series.

All around the Mac and console market, we have non-upgradable devices that only use UMA SoCs.

Maybe Apple learned that console makers had hit upon something good?
 

Irishman

macrumors 68040
Nov 2, 2006
3,449
859
The title lends itself to controversy. Why else are we "mincing" words?

I’m cool with his chosen phraseology.

It leads to more interesting thoughts and tangents, and it keeps the thread going without much prodding or poking.
 

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
The reason "everyone" is talking about new software features is that hardware capabilities are stagnating hard with the slowdown in lithographic performance improvement.
The what? 🤯

What Is Moore’s Law and How Does It Impact AI

In short, Moore's Law states that the number of transistors on a microchip will double every two years, leading to exponential growth in computing power.

Exponential growth doesn't care that you can't wait for the M3, or that a global pandemic delayed the release of the M2 by a quarter. It's still exponential. Mankind doesn't know what to do with so much compute power, and spending a few billion transistors on better graphics is a no-brainer. Stagnation! Jesus Christ.
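For concreteness, the law as quoted compounds like this (a quick sketch; the M1's ~16-billion-transistor figure is public, and the projection is just the doubling arithmetic, not any roadmap):

```cpp
// Moore's law as stated above: transistor count doubles every two years.
// Projecting forward from a known data point (Apple M1, 2020: ~16B transistors).
#include <cmath>
#include <cstdio>

double moores_law(double n0, double years, double doubling_period = 2.0) {
    // Transistor count after `years`, doubling every `doubling_period` years.
    return n0 * std::pow(2.0, years / doubling_period);
}

int main() {
    const double m1 = 16e9;  // Apple M1 (2020), public figure
    for (int year : {2022, 2024, 2026}) {
        std::printf("%d: ~%.0fB transistors\n",
                    year, moores_law(m1, year - 2020) / 1e9);
    }
    // Prints ~32B for 2022; the real M2 (2022) has ~20B transistors,
    // which is the gap between the ideal law and actual lithography.
}
```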
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
I’m cool with his chosen phraseology.

It leads to more interesting thoughts and tangents, and it keeps the thread going without much prodding or poking.
I would have written the title more clearly if I had more characters. I would have written "In 3 years, 50% of all computers sold yearly capable of playing AAA games will be Macs".
 
  • Like
Reactions: Irishman

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
I would have written the title more clearly if I had more characters. I would have written "In 3 years, 50% of all computers sold yearly capable of playing AAA games will be Macs".
This assumes that triple-A gaming is a moving target. By now every new computer is expected to be good enough for office work, browsing, and Photoshop, even multitasking all of them. It would only be logical to predict that one day all computers will be good enough for gaming, unless you keep pushing graphics expectations up indefinitely. There are only so many pixels, and at some point you will have calculated all of them 120 times per second. Sooner or later the AI of NPCs will be the only thing that could still benefit from even more compute power.
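A quick sanity check on the "only so many pixels" point (my arithmetic, assuming a 4K panel at 120 Hz):

```cpp
// How many pixels per second is "all of them, 120 times per second" at 4K?
#include <cstdio>

int main() {
    const long long width = 3840, height = 2160, refresh_hz = 120;
    const long long per_frame  = width * height;          // ~8.3 million
    const long long per_second = per_frame * refresh_hz;  // ~995 million

    std::printf("pixels/frame:  %lld\n", per_frame);
    std::printf("pixels/second: %lld\n", per_second);
    // ~1 billion shaded pixels per second: a large but fixed, finite target,
    // which is the point being made above.
}
```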
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
This assumes that triple-A gaming is a moving target. By now every new computer is expected to be good enough for office work, browsing, and Photoshop, even multitasking all of them. It would only be logical to predict that one day all computers will be good enough for gaming, unless you keep pushing graphics expectations up indefinitely. There are only so many pixels, and at some point you will have calculated all of them 120 times per second. Sooner or later the AI of NPCs will be the only thing that could still benefit from even more compute power.
I mean, isn't that the point of AAA games? I don't even know why you bring this up. 🤦‍♂️

Do games look 100% realistic yet? No, right? So AAA games will continue to improve in graphical fidelity.
 
  • Like
Reactions: Irishman

Gudi

Suspended
May 3, 2013
4,590
3,267
Berlin, Berlin
I mean, isn't that the point of AAA games? I don't even know why you bring this up. 🤦‍♂️
The point of gaming is to have fun, and side-scrollers did this very successfully. Only with 3D gaming did graphics become all-important and playing become like watching an interactive movie. When you go to the cinema, do you pick the movie with the most special effects? Or is there a point where the story becomes more important and bigger explosions can't keep you interested anymore?


If it's true for movies that VFX look better when they are not treated as special, then games will also be more fun when designers don't care as much about graphics and concentrate on gameplay. In this way triple-A is a dead end. Just make a good game and don't chase spectacular graphics.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
The point of gaming is to have fun, and side-scrollers did this very successfully. Only with 3D gaming did graphics become all-important and playing become like watching an interactive movie. When you go to the cinema, do you pick the movie with the most special effects? Or is there a point where the story becomes more important and bigger explosions can't keep you interested anymore?


If it's true for movies that VFX look better when they are not treated as special, then games will also be more fun when designers don't care as much about graphics and concentrate on gameplay. In this way triple-A is a dead end. Just make a good game and don't chase spectacular graphics.
During the mid-'90s, the term "tech demo" was popularized.

When an audiovisual work has little to no story and exists to showcase technology, it is called a tech demo.

But then again, maybe that's what gaming has split into: games that are tech demos and games with an actual story.
 

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,665
OBX
  • Like
Reactions: Homy and Irishman

nasmdhgf

macrumors member
Jan 23, 2023
64
29
I'll stop saying it if we can get release parity between PC and Mac. If the porting toolkit does that and my PC games are on Mac, then you win. But it's going to come down to actually getting people to use it, which, good luck, because a lot of game devs are still staunchly anti-Mac and anti-Apple due to Apple's business practices and how they expect everyone to march to the beat of their drum.
Why do many developers reject Apple? Because some of Apple's practices have made developers dissatisfied. They force us to use Metal one way or another, and they don't listen to complaints. They think they are very smart, and there is hardware incompatibility too. For example, the CPU is based on the ARM architecture, which is completely different from x86. A lot of C++ code that runs without problems on x86 produces inexplicable errors on ARM (often because ARM's weaker memory-ordering model exposes races that x86's stronger ordering hides), and we have to hunt them down. This is what I dislike. The hardware isn't something we normally deal with either: not AMD or Nvidia cards, but Apple's own display chip, WTF!? I still have to put in a lot of effort to support them.
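A minimal sketch of that memory-ordering trap (my illustration, not code from the poster): a data race that x86's strong hardware ordering usually hides but ARM's weaker ordering can expose.

```cpp
#include <atomic>
#include <cassert>
#include <thread>

int payload = 0;                  // plain int: not synchronized by itself
std::atomic<bool> ready{false};

void producer() {
    payload = 42;
    // Relaxed store publishes the flag but NOT the payload write before it;
    // on ARM the two stores may become visible out of order.
    ready.store(true, std::memory_order_relaxed);  // bug: should be _release
}

void consumer() {
    while (!ready.load(std::memory_order_relaxed)) {}  // bug: should be _acquire
    // With relaxed ordering this assert can fire on ARM, yet the same code
    // tends to "work" on x86, which is why the bug feels inexplicable.
    assert(payload == 42);
}

int main() {
    std::thread t1(producer), t2(consumer);
    t1.join();
    t2.join();
    return 0;
}
```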
As a developer, I don't want to support them. Even if they spend a lot of effort developing a compatibility layer to support DirectX 12, I will still check the platform, and if it is a Mac I will make the game refuse to start, to express my protest, because their sincerity is still insufficient. If they do not learn to respect creators, they will be abandoned by creators. It's like when the OpenGL game developers left Apple back then. That was Apple's mistake.
 

nasmdhgf

macrumors member
Jan 23, 2023
64
29
Isn't Nintendo or Sony doing exactly the same: forcing you to use their APIs and take advantage of the specifics of their hardware?
So my game won't consider launching on their platform first. The most comfortable console platform for developers is Xbox, rather than the other closed, customized platforms. Still, Nintendo is better than the Mac, because Nintendo supports OpenGL and Vulkan, and the number of game users on the Mac makes me laugh out loud.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Isn't Nintendo or Sony doing exactly the same: forcing you to use their APIs and take advantage of the specifics of their hardware?
Yep. The PS3 was insanely difficult to develop for because of its architecture, even for an indie. I attempted it. Yet we know how many games that system got.


It's never been about "Apple treating devs badly," as people like to just blame Apple for everything these days. It's purely about market share. It starts and ends there. If Macs were at 90% like Windows is, it wouldn't matter how "difficult" the platform was; we would see a lot of games for it, since it would be so popular.
 
  • Like
Reactions: Longplays

Homy

macrumors 68030
Jan 14, 2006
2,510
2,462
Sweden
the number of game users on the Mac makes me laugh out loud.
According to Steam, there were about 760,000 Mac users last month on Steam alone. Add the Mac App Store and you have at least 1 million. If that's laughable because your game sells millions more, that's understandable, but I've heard a game is considered successful when it sells a couple hundred thousand copies.
 

Spaceboi Scaphandre

macrumors 68040
Jun 8, 2022
3,414
8,107
Isn't Nintendo or Sony doing exactly the same: forcing you to use their APIs and take advantage of the specifics of their hardware?

No, because PlayStations and Xboxes are locked-down PCs nowadays, using the same AMD CPUs, GPUs, and architecture that gaming PCs use. That's why PlayStation has become the juggernaut it is today: they made development incredibly easy, while the platforms that aren't easy either get delayed releases or don't get a game at all. Case in point: the Nintendo Switch, since it uses ARM and has only 4 GB of RAM on an old, outdated Nvidia Tegra X1.
 