leman

macrumors Core
Oct 14, 2008
19,521
19,673
The Qualcomm/ARM Android chips on 5nm/7nm/10nm are superior to any Intel/AMD chip on the basis of performance per watt.

Are you sure about that? One would need to look at the numbers, but from what I remember, the efficiency of the newer ARM cores was similar to Intel's Gracemont. I'm fairly sure Zen 3+/Zen 4 won't lose the efficiency battle either.
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
Are you sure about that? One would need to look at the numbers, but from what I remember, the efficiency of the newer ARM cores was similar to Intel's Gracemont. I'm fairly sure Zen 3+/Zen 4 won't lose the efficiency battle either.
Intel/AMD chips are largely laptop/desktop chips. They are not under the same system constraints as a smartphone.

With or without looking at the numbers, Android chips sit between Apple and AMD solely based on node.

Intel's Gracemont is on Intel 7 and will be released some time in Q1 2023. AMD has been on 5nm since last year, and Apple is on 4nm with the A16 Bionic and has been sampling 3nm chips since last year as well.

Apple leaving Intel suddenly spurred Intel into producing 7-10nm chips. Being a laptop/desktop chip monopoly from 2006-2020 gave Intel no incentive to move off 14nm from 2014-2020.
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
Intel/AMD chips are largely laptop/desktop chips. They are not under the same system constraints as a smartphone.

With or without looking at the numbers, Android chips sit between Apple and AMD solely based on node.
IMHO, the classification of a CPU/SoC as being for laptops/desktops or smartphones is silly.

I think Apple has achieved something incredible, where their SoC architecture can scale from powering their AirPods all the way to the Mac Studio, and soon the Mac Pro.

AMD is now on the same process node as the A15/M2, but I don't think their CPU/SoC architecture can scale the way Apple's SoCs scale across the gamut of devices they power.

IMHO Apple most definitely can produce something that can be king of the hill, but the question is whether Apple wants to, not whether Apple can.

I just SMH whenever I see folks saying that Apple Mx SoCs are phone chips.
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
IMHO, the classification of a CPU/SoC as being for laptops/desktops or smartphones is silly.

I think Apple has achieved something incredible, where their SoC architecture can scale from powering their AirPods all the way to the Mac Studio, and soon the Mac Pro.

AMD is now on the same process node as the A15/M2, but I don't think their CPU/SoC architecture can scale the way Apple's SoCs scale across the gamut of devices they power.

IMHO Apple most definitely can produce something that can be king of the hill, but the question is whether Apple wants to, not whether Apple can.

I just SMH whenever I see folks saying that Apple Mx SoCs are phone chips.
The point I am making is that the design targets of Apple's SoCs impose limits/restrictions that prioritize performance-per-watt efficiency, because smartphones have itsy-bitsy batteries.

Unlike, say, Intel/AMD, which just increase input power to increase performance rather than going the expensive route of paying for the leading-edge node. They can do that because laptop batteries max out at nearly 100 Wh and desktops are limited only by the wall outlet.

AMD only got to 5nm late last year, when the A16 Bionic was already on 4nm. And last month TSMC's 3nm node started its fab run.

AMD is essentially where Apple was in Jan 2021, while Apple is in Jan 2023.

Before anyone says it: yes, nodes are not the end-all, be-all. The design targets of the chip, the number of transistors, and other variables come into play. But to simplify things, with an emphasis on performance per watt, I just make comparisons based on die-shrink sizes.

It is like comparing a 2011 MBP 13" with a 45nm chip vs. a 2023 MBP 16" with a 5nm chip. The dozen-year age difference jumps out at you, compounded by the node of the chips used in each.
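To put rough numbers on that compounding, here is a naive Python sketch. It treats node names as literal feature sizes, which modern marketing nodes no longer are, so read the output as an idealized upper bound, not a real density figure:

# Idealized area scaling: density grows with the square of the linear shrink.
# Assumption: node names ("45nm", "5nm") are literal feature sizes, which
# hasn't been true for years, so real-world gains are smaller than this.
def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    return (old_nm / new_nm) ** 2

print(ideal_density_gain(45, 5))  # ~81x: 2011 MBP (45nm) vs 2023 MBP (5nm)
print(ideal_density_gain(7, 5))   # ~2x: a single "full node" shrink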
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
The point I am making is that the design targets of Apple's SoCs impose limits/restrictions that prioritize performance-per-watt efficiency, because smartphones have itsy-bitsy batteries.

Unlike, say, Intel/AMD, which just increase input power to increase performance rather than going the expensive route of paying for the leading-edge node. They can do that because laptop batteries max out at nearly 100 Wh and desktops are limited only by the wall outlet.

AMD only got to 5nm late last year, when the A16 Bionic was already on 4nm. And last month TSMC's 3nm node started its fab run.

AMD is essentially where Apple was in Jan 2021, while Apple is in Jan 2023.

Before anyone says it: yes, nodes are not the end-all, be-all. The design targets of the chip, the number of transistors, and other variables come into play. But to simplify things, with an emphasis on performance per watt, I just make comparisons based on die-shrink sizes.

It is like comparing a 2011 MBP 13" with a 45nm chip vs. a 2023 MBP 16" with a 5nm chip. The dozen-year age difference jumps out at you, compounded by the node of the chips used in each.
Not really. Apple has already been using AS SoCs in all their gadgets for years, not just starting with 5nm in 2020.

The issue is that AMD and Intel (and the entire industry) have always treated power efficiency as an afterthought (at least that's what I think), because consumers accepted this, until the AS Macs came out. You can engineer to either extreme, or you can try to strike a balance. Engineering is always about trade-offs.

Many have been saying that AMD's latest Zen 4 CPUs have caught up with AS in power efficiency. But many forget that the CPU/SoC is just one part of the entire package. If a Zen 4 CPU is paired with SODIMMs, it will be slower and consume more power. Then there's the dGPU needed to make the graphics good, and yet more power. At the end of the day, it is how the entire package is designed and built that counts, and that I think is what Apple excels at; I would say they are getting better at it. We'll have to wait for actual products using Zen 4 to see if the industry has really caught up to AS Macs in terms of power efficiency.
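To illustrate the whole-package point, here is a minimal sketch in Python. Every wattage below is a made-up but plausible assumption for illustration, not a measurement of any real product:

# Hypothetical sustained power budgets in watts (assumed numbers only).
x86_laptop = {"cpu": 28, "sodimm_dram": 4, "dgpu": 35}
integrated_soc = {"soc_with_gpu_and_on_package_dram": 30}

print(sum(x86_laptop.values()))      # 67 W for the discrete parts combined
print(sum(integrated_soc.values()))  # 30 W for the integrated package

The point of the sketch is only that package-level efficiency is the sum of its parts; the individual CPU number can look competitive while the system total does not.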
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
Not really. Apple has already been using AS SoCs in all their gadgets for years, not just starting with 5nm in 2020.

The issue is that AMD and Intel (and the entire industry) have always treated power efficiency as an afterthought (at least that's what I think), because consumers accepted this, until the AS Macs came out. You can engineer to either extreme, or you can try to strike a balance. Engineering is always about trade-offs.

Many have been saying that AMD's latest Zen 4 CPUs have caught up with AS in power efficiency. But many forget that the CPU/SoC is just one part of the entire package. If a Zen 4 CPU is paired with SODIMMs, it will be slower and consume more power. Then there's the dGPU needed to make the graphics good, and yet more power. At the end of the day, it is how the entire package is designed and built that counts, and that I think is what Apple excels at; I would say they are getting better at it. We'll have to wait for actual products using Zen 4 to see if the industry has really caught up to AS Macs in terms of power efficiency.
Intel/AMD sell parts for general applications, hence their iGPUs being so anemic, there mostly so they can be listed on a spec sheet.

Apple's SoCs are spec'd solely to Apple's present & future needs. When the M1 came out in Nov 2020, its iGPU had the highest performance of any iGPU.

When the M1 Ultra came out, Apple claimed its iGPU had performance equivalent to an RTX 3090 on specific benchmarks.

Some Mac users want RAM to be decoupled, but as you pointed out, that brings additional overhead in power consumption and even latency. Unlike Unified Memory, SODIMMs are not designed to be shared by all of the other cores of the SoC.
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
Intel/AMD sell parts for general applications, hence their iGPUs being so anemic, there mostly so they can be listed on a spec sheet.

Apple's SoCs are spec'd solely to Apple's present & future needs. When the M1 came out in Nov 2020, its iGPU had the highest performance of any iGPU.

When the M1 Ultra came out, Apple claimed its iGPU had performance equivalent to an RTX 3090 on specific benchmarks.

Some Mac users want RAM to be decoupled, but as you pointed out, that brings additional overhead in power consumption and even latency. Unlike Unified Memory, SODIMMs are not designed to be shared by all of the other cores of the SoC.
Again, I think the iGPU classification is also silly. A GPU is a GPU; whether it is integrated on a die or lives separately, its main job is to compute. There is no rule that says a separate-die GPU must always be more performant than a GPU integrated into an SoC.

I believe Apple's engineers know what they are doing. I don't think Apple was lying when they compared the M1 Ultra GPU to the 3090, though maybe they were a little cheeky. They all do the same, like AMD's show-and-tell during the Zen 4 launch.

Sure, there are many different types of Mac users with many different types of demands. That doesn't mean they will all get what they want. If the current technology limits what can be done, an engineering decision has to be made. In this situation, Apple has a target performance envelope they want to hit, and if that cannot be achieved using the conventional design approach, they have to pivot. And that's what they did with their Mac product lines.

It'll be interesting to see how the Mac Pro turns out.
 

AirpodsNow

macrumors regular
Aug 15, 2017
224
145
The Qualcomm/ARM Android chips on 5nm/7nm/10nm are superior to any Intel/AMD Windows chip on the basis of performance per watt.

They just aren't given the proper marketing, supply chain, distribution, and software support that Apple gave its Macs.

While we could agree that Qualcomm/ARM might have chips powerful enough to drive Windows PCs, it is the business model that is missing, and that is crucial. Apple Silicon emerged subsidised by (investments from) the iPhone business, but it needs to be able to stand on its own as a business model, basically on the sales of Macs.

The 'chips' business seems to require a 'binning' system, where you differentiate performance within the same chip architecture, and Apple seems to have figured out how to 'sell' and 'advertise' the bins in their products (Mx Mac & iPad, Pro, Max, Ultra). Their story of an SoC with CPU/GPU and unified memory also seems to work. There are a lot of 'questions (complaints?)' about the non-changeable RAM; I believe that Apple introducing non-replaceable RAM in their Macs earlier made this transition smoother. And their business model seems able to earn a premium on the better-binned silicon, which is why I think it is well placed to support it.

However, I do not think that Qualcomm at this moment has a Snapdragon so versatile that it can be offered to PC laptop manufacturers with the same type of business model. They might be able to produce a 'chip' as powerful as (or more powerful than?) the A or M series, but how are they going to sell it? Will it go into premium(-priced?) laptops or the basic ones? Why would consumers choose a 'limited' ARM chip over an Intel/AMD chip? Power efficiency? Are the remaining components supporting this silicon as power efficient as well? Also, isn't ARM Windows at the moment a 'lesser' version of x86 Windows, so why would they choose it? If AMD also needs 'unified memory', does that mean the consumer can no longer upgrade their RAM, which this market seems used to? At the moment it feels very much like the iPad vs. Android tablets: the Android hardware is there, but there is no attractive business model to support more and better of these products for manufacturers and users.

If someone needs to prove it works, I would think it should be Microsoft with their Surface line, but they have quite a bad history of how they have executed their ARM business. So it will be interesting to see whether Qualcomm can crack this code or whether Intel/AMD can innovate in this space. Currently, Apple Silicon seems to stand for full (high?) performance on the go and long battery life. I'm not sure how either of those can be achieved by the rest at the moment.
 

Zest28

macrumors 68030
Jul 11, 2022
2,581
3,933
Intel/AMD sell parts for general applications, hence their iGPUs being so anemic, there mostly so they can be listed on a spec sheet.

Apple's SoCs are spec'd solely to Apple's present & future needs. When the M1 came out in Nov 2020, its iGPU had the highest performance of any iGPU.

When the M1 Ultra came out, Apple claimed its iGPU had performance equivalent to an RTX 3090 on specific benchmarks.

Some Mac users want RAM to be decoupled, but as you pointed out, that brings additional overhead in power consumption and even latency. Unlike Unified Memory, SODIMMs are not designed to be shared by all of the other cores of the SoC.

AMD iGPUs anemic? The AMD iGPU inside the two-year-old Xbox Series X is more powerful than the M1 Max iGPU. Only recently, with the M1 Ultra, did Apple claim the most powerful iGPU.

And don't forget, the Xbox Series X with an AMD iGPU costs only $499, while an M1 Ultra costs $4,000.
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
AMD iGPUs anemic? The AMD iGPU inside the two-year-old Xbox Series X is more powerful than the M1 Max iGPU. Only recently, with the M1 Ultra, did Apple claim the most powerful iGPU.

And don't forget, the Xbox Series X with an AMD iGPU costs only $499, while an M1 Ultra costs $4,000.
I can edit 4K videos, mix audio and edit RAW images on an Xbox & PS5.
 

Zest28

macrumors 68030
Jul 11, 2022
2,581
3,933
I can edit 4K videos, mix audio and edit RAW images on an Xbox & PS5.

This "anemic" AMD iGPU inside the Xbox Series X runs Microsoft Flight Simulator, a game that brings super-expensive gaming PCs to their knees, no problem. It is able to run the game with almost the same visuals as a desktop RTX 3090.

Just admit you don't know what you are talking about when you call AMD iGPUs "anemic".

And video editing, audio mixing, and photo editing you can do on your smartphone these days. I have Cubasis 3 on my iPhone, for example, and it is a full-fledged DAW.
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
This "anemic" AMD iGPU inside the Xbox Series X runs Microsoft Flight Simulator, a game that brings super-expensive gaming PCs to their knees, no problem. It is able to run the game with almost the same visuals as a desktop RTX 3090.

Just admit you don't know what you are talking about when you call AMD iGPUs "anemic".
I think you missed the point of my reply.

There's no copy of Microsoft Flight Simulator that runs on macOS. So I never pushed the narrative that a Mac with Apple Silicon is a gaming machine.
 

Zest28

macrumors 68030
Jul 11, 2022
2,581
3,933
I think you missed the point of my reply.

There's no copy of Microsoft Flight Simulator that runs on macOS. So I never pushed the narrative that a Mac with Apple Silicon is a gaming machine.

It's not about gaming. It's about power, the actual hardware. The AMD iGPU has about 20% higher TFLOPS than the M1 Max iGPU, and there are real-world examples (such as Microsoft Flight Simulator) showing that the AMD iGPU is not "anemic" as you claim it is.

Just admit you are wrong to call AMD iGPUs "anemic".
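For reference, the TFLOPS figure both of us keep citing comes from one formula: ALU count x 2 ops per clock (a fused multiply-add counts as two) x clock speed. A rough Python sketch with commonly cited specs; the ALU counts and clocks are approximate public figures, so treat the gap as ballpark:

# Peak FP32 throughput = ALUs * 2 (FMA = 2 ops per clock) * clock in GHz.
def tflops(alus: int, clock_ghz: float) -> float:
    return alus * 2 * clock_ghz / 1000  # ALUs * GHz gives GFLOPS; /1000 -> TFLOPS

xbox_series_x = tflops(3328, 1.825)  # 52 CUs * 64 ALUs -> ~12.1 TFLOPS
m1_max = tflops(4096, 1.296)         # 32 cores * 128 ALUs -> ~10.6 TFLOPS
print(xbox_series_x / m1_max)        # ~1.15, a 15-20% gap depending on clocks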
 
  • Like
Reactions: seek3r

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
It's not about gaming. It's about power, the actual hardware. The AMD iGPU has about 20% higher TFLOPS than the M1 Max iGPU, and there are real-world examples (such as Microsoft Flight Simulator) showing that the AMD iGPU is not "anemic" as you claim it is.

Just admit you are wrong to call AMD iGPUs "anemic".
Dude, we're not having the same conversation, so I am ending it here so as not to waste your time.
 

Zest28

macrumors 68030
Jul 11, 2022
2,581
3,933
Dude, we're not having the same conversation, so I am ending it here so as not to waste your time.

I'm not having a conversation; I'm stating facts. You are simply spreading misinformation by calling AMD iGPUs anemic.

And I will stick to the facts no matter what you say.
 
  • Haha
Reactions: sam_dean

leman

macrumors Core
Oct 14, 2008
19,521
19,673
Intel's Gracemont is on Intel 7 and will be released some time in Q1 2023. AMD has been on 5nm since last year, and Apple is on 4nm with the A16 Bionic and has been sampling 3nm chips since last year as well.

I mean Intel's E-cores, which have been shipping for a while now. You were talking about the power efficiency of ARM chips vs. x86 at the same node. It would be interesting to see comparisons of the latest chips. Unfortunately, Andrei Frumusanu was pretty much the only person who cared about accurate measurement of power consumption, and he doesn't do tech journalism anymore.

AMD is now on the same process node as the A15/M2, but I don't think their CPU/SoC architecture can scale the way Apple's SoCs scale across the gamut of devices they power.

I'm sure it could (see consoles, for example); it's just that there is no economic sense in AMD producing these products. And of course, power consumption would be an issue.

Many have been saying that AMD's latest Zen 4 CPUs have caught up with AS in power efficiency.

One can say many things if one only looks at biased comparisons and ignores details.

Unlike Unified Memory, SODIMMs are not designed to be shared by all of the other cores of the SoC.

Why not? You can make the memory interface as wide as you want; the only problem is cost. And of course there's the DIMM connector itself, which presents significant challenges and reliability concerns.
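The width-versus-cost trade-off falls straight out of the bandwidth arithmetic: peak bandwidth is bus width in bytes times transfer rate. A quick Python sketch with representative configurations; the figures are approximate:

# Peak bandwidth = (bus width in bytes) * (transfer rate in GT/s) -> GB/s.
def peak_gb_per_s(bus_bits: int, gigatransfers_per_s: float) -> float:
    return bus_bits / 8 * gigatransfers_per_s

print(peak_gb_per_s(128, 4.8))  # ~76.8 GB/s: dual-channel DDR5-4800 SODIMMs
print(peak_gb_per_s(512, 6.4))  # ~409.6 GB/s: M1 Max-class on-package LPDDR5

A 512-bit DIMM-based interface would need four times the sockets and traces, which is exactly the cost and reliability problem above.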
 
  • Like
Reactions: sam_dean

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
I'm sure it could (see consoles, for example); it's just that there is no economic sense in AMD producing these products. And of course, power consumption would be an issue.
Isn't the AMD Instinct MI300 proof that AMD can scale its CPUs/APUs as much as it wants with chiplet technology?

Many have been saying that AMD's latest Zen 4 CPUs have caught up with AS in power efficiency.
We have to wait. At the moment, there is only "marketing" benchmarking, no third-party benchmarking yet.
 

kasakka

macrumors 68020
Oct 25, 2008
2,389
1,074
Not really. Apple has already been using AS SoCs in all their gadgets for years, not just starting with 5nm in 2020.

The issue is that AMD and Intel (and the entire industry) have always treated power efficiency as an afterthought (at least that's what I think), because consumers accepted this, until the AS Macs came out. You can engineer to either extreme, or you can try to strike a balance. Engineering is always about trade-offs.

Many have been saying that AMD's latest Zen 4 CPUs have caught up with AS in power efficiency. But many forget that the CPU/SoC is just one part of the entire package. If a Zen 4 CPU is paired with SODIMMs, it will be slower and consume more power. Then there's the dGPU needed to make the graphics good, and yet more power. At the end of the day, it is how the entire package is designed and built that counts, and that I think is what Apple excels at; I would say they are getting better at it. We'll have to wait for actual products using Zen 4 to see if the industry has really caught up to AS Macs in terms of power efficiency.
I'd add that Apple has the integration benefit. They make the OS, the drivers, and the hardware, so they have a better chance of making them work well together than AMD, which designs chips that go into all sorts of systems running Linux or Windows.

We are entering a stage where die shrinks won't be as easily achieved, and manufacturers will have to focus more on things like performance per watt and architecture improvements to get yearly increases in performance.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
I don’t think anything from AMD can be used to run a device like the AirPods. That’s my point.
AirPods don't use a mini version of the iPhone chip; they use a specially designed chip. What makes you think AMD can't do that?
 

MBAir2010

macrumors 604
May 30, 2018
6,975
6,354
I would type up the nice "PRO" M1 system experience I am enjoying here,
but
I am too overwhelmed with the thrill of using Apple Silicon!
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
AirPods don't use a mini version of the iPhone chip, they use a specially designed chip. What makes you think that AMD can't do that?
I'm sure AMD can do it. Any company can, with sufficient investment. The fact is Apple did it; nobody else did. Isn't this something that should be celebrated instead of ridiculed? Do you think Apple can produce something that beats AMD or Intel or Nvidia? I think Apple can. The question is whether Apple wants to.
 

MayaUser

macrumors 68040
Nov 22, 2021
3,177
7,196
I've observed that entertainment, whether it be games, social networks/media, streaming video, etc., eats into time that would have been used for:

- sleeping earlier and longer
- making whole food plant-based meals that are macro & micro nutritionally complete
- increased physical activity
- mindfulness
- quality in-person time with people within the household and other people that actually mean something to you
- sun exposure to generate Vitamin D, though not to the extent of skin tanning
And yet, you are still here, spending time away from all that replying to a social forum... oh, the hypocrisy...
This topic has been discussed here over and over, so many times...
I'm glad that I'm not stuck on one platform, so I can choose what's best... for now, Mac it is... 5 years from now, who knows. I hate that Windows is stuck on its legacy and finds it very hard to move on...
Have a good day
 

xraydoc

Contributor
Oct 9, 2005
11,019
5,484
192.168.1.1
While I've not read this entire 7-page thread, I'll simply address the main title of the thread.

The thrill of Apple Silicon for me means I've got the performance of a modern 16+ thread Core i7 desktop PC in a 1.5kg 14" laptop that runs ~15 hours on a battery charge.

That's pretty damn good.

Is it the best-performing workstation that money can buy at any price? No, but it doesn't need to be, nor was it meant to be.
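Those numbers imply a remarkably low average draw. A one-line sanity check in Python; the ~70 Wh figure is the published 14" MacBook Pro battery capacity, and the runtime is my estimate above:

battery_wh = 70   # approx. 14" MacBook Pro battery capacity (69.6 Wh published)
runtime_h = 15    # my rough real-world runtime estimate
print(battery_wh / runtime_h)  # ~4.7 W average draw for the entire machine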
 

sam_dean

Suspended
Original poster
Sep 9, 2022
1,262
1,091
And yet, you are still here, spending time away from all that replying to a social forum... oh, the hypocrisy...
This topic has been discussed here over and over, so many times...
I'm glad that I'm not stuck on one platform, so I can choose what's best... for now, Mac it is... 5 years from now, who knows. I hate that Windows is stuck on its legacy and finds it very hard to move on...
Have a good day
I've done most of it. And addictions are hard to break.

I see this as healthier than Facebook.
 
  • Like
Reactions: George Dawes