
senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
The market for digital still cameras has dropped to a little over 8 million units.

This is similar in size to the PC workstation market, at under 7.7 million units.

R&D effort chasing the "last 1% of performance" runs into the law of diminishing marginal returns.

Not really up there in terms of priorities.
I’m not sure how you can argue that the gaming market is significantly bigger than the camera market.

Gaming is continuing to grow every year while cameras are shrinking.
The activity of gaming may change little, but the hardware people do it on will.

Gamers will be very unhappy, but APUs, SoCs, and Heterogeneous System Architecture are the only direction for the gaming PC going forward.

Reason being, Moore's Law.
We are talking about game consoles. All game consoles use an APU.

An M2 Max desktop without PCIe slots is $2k.
The assumption is that Apple will make a lot more money from taking a commission from game sales.

That’s how game consoles work economically.

And of course, we are going to assume that you can’t run macOS or anything on an Apple Console.
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
Watch this video


It covers everything this thread is talking about.

I think what you're trying to say is that companies are building accelerators, such as ASICs, instead of putting transistors into the CPU?

By the way, the video is wrong. The CPU is still following Moore's Law. Check AMD's Epyc CPUs for a perfect example. The doubling of transistors every 2 years for a CPU is still going. What isn't doubling is the CPU's single-core speed, which stopped scaling.

It has nothing to do with our discussion.
 
  • Like
Reactions: Basic75

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
@senttoschool What kind of notebooks does IDC consider to be gaming notebooks? Would IDC consider a notebook with an AMD 780m GPU to be a gaming notebook?
 

jmho

macrumors 6502a
Jun 11, 2021
502
996
What does this even mean?
When you're using a 4090, the main issue is working out how to stop the CPU from slowing the GPU down.

Eventually the solution is likely to be to put the CPU on the 4090 and cut out all this PCI-e nonsense.

His phrasing about physics is slightly confusing, but fundamentally, if you halve the distance between two parts of a computer, like the CPU and GPU, you halve the time it takes for an electrical signal to travel between them.
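To put rough numbers on that, here is a back-of-the-envelope sketch; the ~0.7c signal speed and both distances are illustrative assumptions, not measured values:

```python
# Wire-delay sketch: one-way signal travel time t = d / v.
C = 299_792_458          # speed of light in m/s
SIGNAL_SPEED = 0.7 * C   # assumed effective signal speed on a copper trace

def propagation_delay_ns(distance_m: float) -> float:
    """One-way signal travel time in nanoseconds."""
    return distance_m / SIGNAL_SPEED * 1e9

# Assumed distances: ~15 cm CPU-to-GPU over a PCIe slot vs ~1.5 cm on-package.
for label, d in [("discrete GPU (PCIe)", 0.15), ("on-package SoC", 0.015)]:
    print(f"{label}: {propagation_delay_ns(d):.3f} ns one-way")
```

Halving the distance halves this wire delay, though in practice PCIe latency is dominated by protocol overhead rather than raw propagation time.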
 
  • Like
Reactions: Longplays

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,663
OBX
They'll hit the limits of Moore's Law.

APUs, SoCs, and Heterogeneous System Architecture were a pseudo-workaround for physics.

What I am pointing out is that tech not based on the above three will be outclassed in the near future, largely because of cost and performance per watt.

This tech is being implemented in the Switch, Xbox, and PS5 now.

PC gaming's modular nature will work against it, as getting better performance requires increasing input power.

In the EU there's a new regulation capping the allowed power consumption of TVs. If that were to translate to desktops and even laptops, then the dGPU and CPU would be capped as well.
IIRC California already has a power tax on computers.

For consoles, APUs are the way to go, though console makers tend to use older architectures to lower prices and keep the same hardware for a long time, maximizing hardware usage and amortizing costs.

I am not sure Apple would do well with a fixed home console (though we desperately need another hardware vendor); maybe they could pull a Steam Deck / Asus Ally / Nintendo Switch instead.
 
  • Like
Reactions: Longplays

Longplays

Suspended
May 30, 2023
1,308
1,158
I’m not sure how you can argue that the gaming market is significantly bigger than the camera market.
I'm not saying that. I am saying that going after the lunch of Nintendo, PlayStation, and Xbox would require more resources than it is worth.

It may be simpler to just buy Nintendo, similar to how Microsoft bought Activision Blizzard or Sony bought Bungie.

Being the 4th video game console brand did not work for Sega or Atari, and it wouldn't work even if your market cap peaked at $3 trillion.
Gaming is continuing to grow every year while cameras are shrinking.
On hardware other than PCs and game consoles: smartphones, tablets, AR headsets...
We are talking about game consoles. All game consoles use an APU.
I pointed that out because this thread is about Apple Silicon laptops and desktops. Since Mac gaming is often compared to PC gaming, my response expands to that as well.
The assumption is that Apple will make a lot more money from taking a commission from game sales.
Which the $AAPL shareholders care about.
That’s how game consoles work economically.
Apple is successful without the business model of PlayStation or Xbox, and with little to no exposure to it, much like Nintendo.
And of course, we are going to assume that you can’t run macOS or anything on an Apple Console.
So a $2k M2 Max that only plays games and cannot run macOS.

Assuming this were released for $500, and the download-only games were $60 with infrequent App Store sales, Apple would need to sell more than 25 games per unit to cover the cost of a Mac Studio.
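As a rough sketch of that break-even math (the $2,000 hardware cost, $500 price, and $60 game price come from the post; the assumption that Apple keeps the full game price, and the alternative 30% commission, are added for illustration):

```python
# Break-even sketch: how many game sales cover the hardware subsidy?
HARDWARE_COST = 2000   # assumed cost of the M2 Max hardware (per the post)
CONSOLE_PRICE = 500    # hypothetical retail price of the console
GAME_PRICE = 60        # assumed download-only game price

subsidy = HARDWARE_COST - CONSOLE_PRICE      # $1,500 lost per unit sold

# If Apple kept the full game price (the post's implicit assumption):
print(subsidy / GAME_PRICE)                  # 25.0 games per console

# With a more typical 30% App Store commission instead:
print(subsidy / (GAME_PRICE * 0.30))         # ~83.3 games per console
```

So the "more than 25 games" figure holds only if Apple captures the entire $60; at a 30% commission, the break-even point is closer to 83 games per console.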
 

Longplays

Suspended
May 30, 2023
1,308
1,158
I think what you're trying to say is that companies are building accelerators, such as ASICs, instead of putting transistors into the CPU?

By the way, the video is wrong. The CPU is still following Moore's Law. Check AMD's Epyc CPUs for a perfect example. The doubling of transistors every 2 years for a CPU is still going. What isn't doubling is the CPU's single-core speed, which stopped scaling.

All chips follow Moore's Law. What I and the video are pointing out is that there are methods to increase performance even as Moore's Law limits the conventional means of performance improvement: die shrinks.

It has nothing to do with our discussion.

Your thread is about Windows games working on Apple Silicon Macs. Naturally there will be comparisons with the i9s and RTXs of the PC world.

Those two chips are not SoCs, APUs, or Heterogeneous System Architecture designs. They will hit a wall in raw performance in the near future, when Apple releases newer SoCs.
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
Those two chips are not SoCs, APUs, or Heterogeneous System Architecture designs. They will hit a wall in raw performance in the near future, when Apple releases newer SoCs.
And why do you think Nvidia's GPUs will hit a performance wall before SoCs?
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
All chips follow Moore's Law. What I and the video are pointing out is that there are methods to increase performance even as Moore's Law limits the conventional means of performance improvement: die shrinks.
Do you mean building out accelerators?

What do you think a GPU is? 🤦‍♂️
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
When you're using a 4090, the main issue is working out how to stop the CPU from slowing the GPU down.

Eventually the solution is likely to be to put the CPU on the 4090 and cut out all this PCI-e nonsense.

His phrasing about physics is slightly confusing, but fundamentally, if you halve the distance between two parts of a computer, like the CPU and GPU, you halve the time it takes for an electrical signal to travel between them.
So basically, the point is that if you have a CPU and a GPU, it's faster if you glue both of them together into a single SoC than if they're separated by PCI-E, right?

That's true. It's also more efficient, hence why consoles are all APUs. But it isn't always better, hence why we have discrete CPUs and GPUs.

What does this have to do with Moore's law?
 

Longplays

Suspended
May 30, 2023
1,308
1,158
Do you mean building out accelerators?

What do you think a GPU is? 🤦‍♂️
On the SoC.

The 1st post was about Apple laptops and desktops being able to play Windows games. These have Apple SoCs.

Typical Windows gaming PCs use a separate CPU and a separate dGPU.
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
Just watch the video and understand the performance advantages of SoCs.
I know all the advantages and disadvantages of SoCs. I've been following the CPU/SoC/GPU industry for about 25 years now.

I don't have a problem with saying that SoCs are more efficient in general than discrete CPU + GPU. I have a problem with the statements below.

They'll hit the limits of Moore's Law.

APUs, SoCs, and Heterogeneous System Architecture were a pseudo-workaround for physics.

What I am pointing out is that tech not based on the above three will be outclassed in the near future, largely because of cost and performance per watt.

This tech is being implemented in the Switch, Xbox, and PS5 now.

PC gaming's modular nature will work against it, as getting better performance requires increasing input power.

In the EU there's a new regulation capping the allowed power consumption of TVs. If that were to translate to desktops and even laptops, then the dGPU and CPU would be capped as well.
1. No, Nvidia won't hit the limits of Moore's Law before Apple does. In fact, Moore's Law isn't even a law; it's just a historical observation of transistor counts in ICs.

2. No, APUs and SoCs aren't designed as a workaround for physics. They aren't designed as a workaround for Moore's Law either. SoCs are designed for devices where modularity and raw power matter less than efficiency.

3. APUs have been used in consoles as far back as the PS2, GameCube, and Xbox. I'm sure they were used prior to that as well. They're not a new thing that the Switch, Xbox, and PS5 are using. Intel has been putting a GPU in a CPU since 1998.

4. PC modularization exists because gamers prefer to swap out parts as needed. In turn, they are willing to trade away efficiency.

Nvidia just became a $1 trillion company. You didn't figure out something from one Youtube video that everyone missed.
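For what it's worth, the "doubling every 2 years" observation is just an exponential trend line; a minimal sketch (the starting count and time spans are illustrative, not real chip data):

```python
# Moore's Law as a trend line: count(t) = count_0 * 2 ** (years / 2).
def projected_transistors(initial_count: float, years: float) -> float:
    """Project a transistor count assuming a doubling every 2 years."""
    return initial_count * 2 ** (years / 2)

# Illustrative only: a 10-billion-transistor chip projected forward.
for years in (2, 4, 6):
    print(years, f"{projected_transistors(10e9, years):.1e}")
# 2 -> 2.0e+10, 4 -> 4.0e+10, 6 -> 8.0e+10
```

Whether any given product line actually tracks that curve is an empirical question, which is why it is an observation rather than a law.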
 
  • Like
Reactions: Basic75

Longplays

Suspended
May 30, 2023
1,308
1,158
I know all the advantages and disadvantages of SoCs. I've been following the CPU/SoC/GPU industry for about 25 years now.

I don't have a problem with saying that SoCs are more efficient in general than discrete CPU + GPU. I have a problem with the statements below.


1. No, Nvidia won't hit the limits of Moore's Law before Apple does. In fact, Moore's Law isn't even a law; it's just a historical observation of transistor counts in ICs.

2. No, APUs and SoCs aren't designed as a workaround for physics. They aren't designed as a workaround for Moore's Law either. SoCs are designed for devices where modularity and raw power matter less than efficiency.

3. APUs have been used in consoles as far back as the PS2, GameCube, and Xbox. I'm sure they were used prior to that as well. They're not a new thing that the Switch, Xbox, and PS5 are using. Intel has been putting a GPU in a CPU since 1998.

4. PC modularization exists because gamers prefer to swap out parts as needed. In turn, they are willing to trade away efficiency.

Nvidia just became a $1 trillion company. You didn't figure out something from one Youtube video that everyone missed.
I'm talking from a desired-outcome point of view.

Apple and other brands are not following the modular PC model, in order to get better raw performance, performance per watt, battery life, and power consumption.

The way Intel CPUs and Nvidia dGPUs are designed now will have difficulty competing if they do not change.
 
  • Like
Reactions: holisticrunner

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
I'm talking from a desired-outcome point of view.

Apple and other brands are not following the modular PC model, in order to get better raw performance, performance per watt, battery life, and power consumption.

The way Intel CPUs and Nvidia dGPUs are designed now will have difficulty competing if they do not change.
On the laptop side, I would have to agree that over the long run, it's probably better to use an SoC for gaming. Right now, Nvidia does not have a consumer CPU that can compete; they tried to buy ARM but failed. AMD does, but its GPU solution is subpar. Intel has both subpar CPUs and GPUs right now.

On the desktop side, I think modular is the right approach.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
1. The M1 Max and Ultra are barely close to the mobile and desktop versions of the RTX 3060, and they are expensive. Besides, players can get way better performance at a lower price from a PC. If you have money for a MacBook Air, then you can probably get a laptop with way better performance than a MacBook Pro for less than $2,000 or even $1,000.
2. A Mac is not a console, and comparing it to a PC is more appropriate. As I mentioned above, the M1 and M2 are not even close to the RTX 3060; the M1 Max is barely close to it.
3. A limited number of game players = no profits.
4. DirectStorage, ray tracing, DLSS, overclocking, and more.
5. Like I said, the platform itself is extremely hostile to gaming. The Intel Mac era is a great example: quite a lot of games supported the Mac, but it didn't really improve the situation. Besides, is there any other platform for Apple Silicon games besides the Mac App Store? Quite a lot of people hate paying fees on the App Store, and you need to keep updating for new macOS releases every year, which is another huge problem.
7. Games for both the Switch and PS5 are developed on PCs. What else then?
Can you stop spreading ill-informed opinions?


In GFXBench 5.0 Aztec Ruins offscreen, the M2 Max 38-core GPU is close to the RTX 4080 laptop GPU (the 4080 is 17% faster).
In GFXBench 5.0 4K Aztec Ruins offscreen, the M2 Max is again close to the RTX 4080 laptop GPU (the 4080 is 14% faster).

The M1 Pro 16-core was close to the RTX 3060 laptop GPU in GFXBench, while the M1 Max was as fast as the 3080 laptop GPU, which can be seen here:
And here:

Based on leaked GFXBench scores (and this is a graphics workload, NOT compute), the M2 Ultra is 25% faster than the RTX 4090 desktop GPU.
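To be explicit about how those percentages are computed (the fps numbers below are illustrative, not the actual leaked scores):

```python
# "X% faster" here means score_a / score_b - 1, expressed as a percentage.
def percent_faster(score_a: float, score_b: float) -> float:
    return (score_a / score_b - 1) * 100

# Illustrative: if the 4080 laptop GPU scored 140 fps and the M2 Max 120 fps,
# the 4080 would be about 17% faster.
print(f"{percent_faster(140, 120):.0f}%")  # -> 17%
```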
Now here is the question: why does one piece of software work well on Mac hardware while another doesn't?

It's only because of a LACK OF OPTIMIZATION in the software that we get performance that is subpar compared to the current generation of GPUs.

People have told you this before, and you ignore it and base your flawed opinion on Geekbench compute scores.

Please, stop spreading misinformation.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
So basically, the point is that if you have a CPU and a GPU, it's faster if you glue both of them together into a single SoC than if they're separated by PCI-E, right?

That's true. It's also more efficient, hence why consoles are all APUs. But it isn't always better, hence why we have discrete CPUs and GPUs.

What does this have to do with Moore's law?
In a world with increasing AI workloads, you need the lowest possible latency in the system for real-time processing of input.

Expect the whole computing world to go the APU/SoC route; Apple was just the "visionary" company that went this way first.

dGPUs and the DIY market as we know them will most likely morph in the coming years.
 
  • Like
Reactions: holisticrunner

dmccloud

macrumors 68040
Sep 7, 2009
3,138
1,899
Anchorage, AK
Just as productive laptops can play games, gaming laptops can be productive, too. You, like many Mac users, want a Mac you can play games on. But I doubt a gamer would choose an MBP over a gaming PC laptop because it has better productivity tools. Gamers would choose a laptop that offers high FPS at an affordable price, and so far, the MBP can't compete on that.

Except the majority of gamers go for performance over price considerations. That is a big reason why Lenovo and HP have invested so heavily on the gaming side, and why Dell simply bought Alienware to enter that side of the market. System integrators such as iBuyPower, CyberPowerPC, Falcon Northwest, etc. have locked up a sizeable portion of the gaming market as well. There is a reason gaming-focused laptops tend to go for around $1,500 and up, versus the plethora of sub-$1,000 laptops on the non-gaming side of things.
 
  • Like
Reactions: Xiao_Xi

dmccloud

macrumors 68040
Sep 7, 2009
3,138
1,899
Anchorage, AK
The market for digital still cameras has dropped to a little over 8 million units.

This is similar in size to the PC workstation market, at under 7.7 million units.

R&D effort chasing the "last 1% of performance" runs into the law of diminishing marginal returns.

Not really up there in terms of priorities.

The activity of gaming may change little, but the hardware people do it on will.

Gamers will be very unhappy, but APUs, SoCs, and Heterogeneous System Architecture are the only direction for the gaming PC going forward.

Reason being, Moore's Law.

An M2 Max desktop without PCIe slots is $2k.

Apple is better off making Animojis.

1. PC gaming is growing; digital cameras are shrinking. That's a proven fact.

2. APUs are used in consoles, NOT in gaming PCs. You will not find a PC seriously focused on gaming with an APU instead of a dedicated CPU and GPU. In fact, AMD has moved away from even integrating graphics into the bulk of its processors for this very reason, DESPITE building the APUs for the Xbox Series X/S and PS5. Two entirely different use cases with different expectations from the respective user bases.

3. The presence (or lack thereof) of PCIe slots is a red herring in this case, given the existing graphics performance of Apple Silicon. As Apple keeps iterating on its SoCs, that performance will only get better, despite using a fraction of the power that Intel and AMD systems consume.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
2. APUs are used in consoles, NOT in gaming PCs. You will not find a PC seriously focused on gaming with an APU instead of a dedicated CPU and GPU. In fact, AMD has moved away from even integrating graphics into the bulk of its processors for this very reason, DESPITE building the APUs for the Xbox Series X/S and PS5. Two entirely different use cases with different expectations from the respective user bases.
This actually made me chuckle.

Especially considering what Intel is doing with Meteor Lake and Arrow Lake architectures.
 

Longplays

Suspended
May 30, 2023
1,308
1,158
On the laptop side, I would have to agree that over the long run, it's probably better to use an SoC for gaming. Right now, Nvidia does not have a consumer CPU that can compete; they tried to buy ARM but failed. AMD does, but its GPU solution is subpar. Intel has both subpar CPUs and GPUs right now.

On the desktop side, I think modular is the right approach.
1. PC gaming is growing; digital cameras are shrinking. That's a proven fact.
Yes for laptops, but not for desktops.


With laptops' tight spaces and battery-size constraints, an SoC or even an APU makes sense.

The market for separate CPUs and separate dGPUs grows ever smaller annually. Eventually they will be a niche, like mainframes.

When ARM laptops are introduced at the end of the year, they will start eating into x86 laptop sales.
 

senttoschool

macrumors 68030
Original poster
Nov 2, 2017
2,626
5,482
Yes for laptops, but not for desktops.


With laptops' tight spaces and battery-size constraints, an SoC or even an APU makes sense.

The market for separate CPUs and separate dGPUs grows ever smaller annually. Eventually they will be a niche, like mainframes.

When ARM laptops are introduced at the end of the year, they will start eating into x86 laptop sales.
I was referring to the total gaming industry.

[chart: total gaming industry revenue]


Compared to the camera industry.

[chart: camera industry]


Based on this, do you think it's more likely that Apple makes a gaming console or a standalone camera?
 
Last edited: