
deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Google still has not made a Google Chrome ARM app for Windows yet, but has for macOS.

That is not as deep a technical or market issue as you are making it out to be. What is technically missing is Google's profile manager and the browser features outside the Chromium core. To an extremely large extent, "Chrome" is already there on ARM64.

First, Microsoft's Edge browser is based on Chromium, so the core browser is there and is being actively worked on.
Indeed, the thread below shows nightly ARM64 builds being done.

https://www.reddit.com/r/surfaceprox/comments/tckerm

What you are trying to poke at is Google holding back their specific, relatively narrow proprietary bits from Arm. That really isn't "a major effect" but more likely a negotiating tactic (or protest), with a little market segmentation thrown in for ChromeOS on Arm.

Second, Windows 11 defaults to Edge now, which largely acts like Chrome with a slightly different theme on top. Same extensions. Microsoft has gotten rid of the IE 'distraction'. The ARM systems going forward are all going to be Win11, so lots of folks are going to be tracked/herded into Edge.


Firefox has been on Arm64 since 2019


So Edge and Firefox are on ARM64. If Google wants to shoot themselves in the foot and lose market share, they can sit on the sidelines as long as they want. Google Chrome has a very dominant market share. Not quite Windows:Mac-ratio dominant, but very high. That means Google can afford to be lazy.

On macOS, Apple herds folks into Safari. On macOS, Chrome's browser dominance is already significantly lower. Open a bigger door to Safari and they'd lose larger chunks faster (especially the hyper Apple fans who have Safari on their phone/iPad/Mac).

Microsoft has run a much more muddled competitive path. Default to IE, but Edge was an option. Then Edge default, but IE still hanging around. Now finally tossing IE in new OS images going forward (kind of). Microsoft is having to pay the freight for the heavy lifting of moving the Chromium core forward on Windows on Arm. Google is just looking and watching for now. After the major work is done, they can jump in and reap the major benefits (Google has more pressing ecosystems to compete in).



Apple's move to ARM is something that Qualcomm's CEO said had a major effect on the industry taking ARM seriously.

If Qualcomm had continued to deliver rebadged smartphone chips, they wouldn't have. At least in the Windows space, they wouldn't. Qualcomm's CEO is trying to get on the hype train. The hype train and folks actually taking stuff seriously are two different things. Qualcomm paid $1.4B for a chip company that never shipped a single chip; it didn't even finish designing one completely. The CEO needs that hype train to keep "analysts" from thinking too long and too hard about that. It may pay off, but it also may not. He needs "analysts" and stockholders focused only on the upside.




Microsoft's dev tools like Visual Studio did not go ARM native till this year, 2022. MS waited 5 years to port VS to ARM. Really bad outlook.

They had 5 years to dump 32-bit too and "should have" done that earlier also. As long as Visual Studio was riding on dated 32-bit infrastructure, it really didn't make much sense to waste money on porting it to ARM on Windows. Apps were being ported just fine. A 32-bit VS would just be yet another 32-bit boat anchor coupled to Windows on Arm, and it has too much of that legacy bloat on it now. [One reason why all the focus was on 32-bit Intel emulation. That whole maximum-backwards focus poured molasses all over this "next gen" Windows effort.]


I could tell that the first WoA devices were utter trash compared to their x86 counterparts. Apple did a way better launch with the M1 than Microsoft did with its 2017 ARM devices. Being first does not mean anything; it's who does it best.

"best" is a clever way of moving the goal posts. The post I was responding to was implying that Microsoft was going to do Windows on Arm. Depends upon whose metrics of "best". For Windows there are a significant number of folks who moan when their 32-bit , 10 year Word macro doesn't work right. Microsoft put lots of effort into making that work right on the 2017 version of Window 10 on Arm. Apple threw all 32-bit code in the trash can. Along with 'raw iron' booting other operating systems. Putting all kernel extensions drivers on 'deprecated' notice.

Because Windows carries around a larger group of backward-looking folks, the lead times for progress are very likely going to be longer. It isn't a measure of "best" if they're not engaged in the same activity. "Pelé is better than Walter Payton as a footballer" is not really a good "best" comparison.




That SoC is not good when compared to Intel 12th gen, AMD's Zen 3+, and the M2. Qualcomm chips won't be good until the Nuvia-designed chips come out.

Depends upon where Qualcomm prices it and which system it is put in. Priced along the i3-i5 range, with priority on battery lifetime (as opposed to hot-rodding through tech-porn benchmarks), it could do well. Yeah, you can pay more and soak up more battery power... but how fast is a battery-dead laptop when unplugged?

[Qualcomm technically will price them in the context of a CPU+modem bill-of-materials bundle. That is another part of the issue where it is 'apples to oranges' versus what Apple is doing. Apple's Mac solutions have no cellular radio connectivity at all.]


Do they have "desktop replacement" SoCs? No. Is there any good reason for Qualcomm to build one of those? Nope (especially as long as they have an active cellular radio modem component to them).



Apple still has to add many things to their chips: more cores, ray tracing, AV1, WiFi 6E, etc. Apple chips are also not perfect.

Apple also spent a billion on something that has shipped a whole lot of nothing in the interim. They have much bigger "missing features" than AV1 and ray tracing.
 
  • Like
Reactions: Gerdi

Gerdi

macrumors 6502
Apr 25, 2020
449
301
That SoC is not good when compared to Intel 12th gen, AMD's Zen 3+, and the M2. Qualcomm chips won't be good until the Nuvia-designed chips come out.

Excuse my ignorance, but I have yet to see a 9W chip for fanless devices from 12th gen Intel or the Zen 3+ series that outperforms an 8cx Gen 3.
In fact, 11th gen Intel CPUs like Lakefield, which went into fanless laptops, do not even have half the performance of the 8cx Gen 3.
 
  • Like
Reactions: pshufd

exoticSpice

Suspended
Jan 9, 2022
1,242
1,952
Excuse my ignorance, but I have yet to see a 9W chip for fanless devices from 12th gen Intel or the Zen 3+ series that outperforms an 8cx Gen 3.
Agreed. Unlike the M2, which in low power mode uses only 7.5 watts max and is still faster than the 8cx Gen 3. AMD and Intel don't have anything that can compete in the fanless 9W area.
 
  • Like
Reactions: pshufd

exoticSpice

Suspended
Jan 9, 2022
1,242
1,952
Apple threw all 32-bit code in the trash can, along with 'raw iron' booting of other operating systems, and put all kernel extension drivers on 'deprecated' notice.
Because ARMv9 deletes 32-bit support...


Apple was preparing for the future of ARM.

You can boot native Linux now. Still not fully ready, but perfect for non-GPU tasks for now.
 

grizzlified

macrumors newbie
Oct 4, 2021
13
6
Philippines
I don't get why people care so much about benchmarks. I mean, at the end of the day you can have the beefiest processor in the world, but if the "lower scoring" one still feels and runs better at everyday tasks, then what is the point? It's not like everyone who buys their compute equipment actually needs a 1 zillion single-core score while eating 1 billion watts, you know.
 
  • Like
Reactions: Colstan

Unregistered 4U

macrumors G4
Jul 22, 2002
10,610
8,628
You can boot native Linux now. Still not fully ready, but perfect for non-GPU tasks for now.
I think I'd say it's perfect for "non-Nvidia" tasks for now. Well, and that it'll never be good at Nvidia tasks, to be clear. :) It's got a capable GPU, but if one gives it Nvidia-type instructions, they're not going to get good performance.
 

chkay

Suspended
May 27, 2022
79
177
People might find this hard to accept, but Apple Silicon absolutely will be destroyed by Intel and AMD eventually. They're playing catch-up, but once they get to a 3nm or 4nm process, things will change.

The M1 obviously blew everything out of the water when it came out, but the rather small improvements in the M2 made it pretty clear that there is a ceiling to the annual improvements Apple can make. Not because they aren't capable, but because of Moore's Law: every chip has been falling further and further behind the expected 2x annual increase.

The big hurdles the competition has are all power-draw related, which is a solvable thing; once solved, the playing field will level and we'll be back to boring, minor, fairly insignificant performance increases each year.
 

R!TTER

macrumors member
Jun 7, 2022
58
44
I mean, at the end of the day you can have the beefiest processor in the world, but if the "lower scoring" one still feels and runs better at everyday tasks, then what is the point?
For feels you have PCIe 5.0 SSDs, not coming anytime soon to a Mac near you.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
People might find this hard to accept, but Apple Silicon absolutely will be destroyed by Intel and AMD eventually. They're playing catch-up, but once they get to a 3nm or 4nm process, things will change.

The M1 obviously blew everything out of the water when it came out, but the rather small improvements in the M2 made it pretty clear that there is a ceiling to the annual improvements Apple can make. Not because they aren't capable, but because of Moore's Law: every chip has been falling further and further behind the expected 2x annual increase.

The big hurdles the competition has are all power-draw related, which is a solvable thing; once solved, the playing field will level and we'll be back to boring, minor, fairly insignificant performance increases each year.

Except there is nothing preventing Apple from relaxing their strict efficiency requirements. Current Apple Silicon models are around 3x-4x more efficient at peak performance compared to x86 CPUs, and that's a huge gap for Intel and AMD to close. Intel in particular gets ahead in raw performance by massively increasing power consumption on its performance cores as well as by using many throughput-oriented cores, while AMD simply uses many cores, which allows it to play with different points on the efficiency curve. They would both need radically new microarchitectures to even hope to compete with Apple in per-core efficiency, which is easier said than done when you have less talent and fewer resources than Apple.

But what happens if Apple decides to build a core that prioritizes performance over efficiency instead of the current "maximize performance for a predefined power target" approach? Right now their cores peak at 5-6W. They can easily afford to go up to 10-12W for their prosumer laptops and even higher for desktops. And they don't care whether their chips are competitively priced; they can afford to spend two or even three times more per chip than Intel does and still save money over buying x86 hardware on the market. Apple will always be able to afford more cache, more memory controllers, etc.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
Where are you getting those 3-4x more efficient than x86 numbers from?

I am strictly talking about efficiency at peak performance. M1 P-cores hit their maximal performance at 5W, mobile Zen is between 10W and 15W depending on the model, and Intel is at 20-25W (they can draw more, but only for very short periods of time). At the same time, the peak performance of an M1 P-core is better than Zen3 and slightly lower than Intel.

At lower wattages the numbers are obviously different, since lowering the power improves efficiency. Zen3+, for example, should be within 20% of the M2 at 10W, since it has twice as many performance cores and can dramatically underclock them during multicore operation, reaching a more advantageous point on the efficiency curve.
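
To show where the 3x-4x ballpark comes from, here is a quick sketch plugging in the figures above. The performance values are normalized guesses on my part (M1 P-core = 1.0, Zen3 slightly below, Intel slightly above), not benchmark results:

```python
# Rough perf-per-watt comparison using the peak figures quoted above.
# Performance is normalized (M1 P-core = 1.0); the Zen3 and Intel values
# are assumptions standing in for "better than Zen3, slightly lower than
# Intel", not measured scores.
cores = {
    "M1 P-core": (1.00, 5.0),    # (relative peak perf, watts at peak)
    "Zen3":      (0.95, 12.5),   # midpoint of the 10-15W range
    "Intel":     (1.05, 22.5),   # midpoint of the 20-25W range
}

m1_eff = cores["M1 P-core"][0] / cores["M1 P-core"][1]
for name, (perf, watts) in cores.items():
    eff = perf / watts
    print(f"{name:10s}: {eff:.3f} perf/W, M1 advantage {m1_eff / eff:.1f}x")
```

With those assumed numbers, the per-core efficiency gap comes out at roughly 2.6x against Zen3 and 4.3x against Intel, which is the band the 3x-4x claim sits in.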
 

R!TTER

macrumors member
Jun 7, 2022
58
44
I am strictly talking about efficiency at peak performance. M1 P-cores hit their maximal performance at 5W, mobile Zen is between 10W and 15W depending on the model, and Intel is at 20-25W (they can draw more, but only for very short periods of time). At the same time, the peak performance of an M1 P-core is better than Zen3 and slightly lower than Intel.

At lower wattages the numbers are obviously different, since lowering the power improves efficiency. Zen3+, for example, should be within 20% of the M2 at 10W, since it has twice as many performance cores and can dramatically underclock them during multicore operation, reaching a more advantageous point on the efficiency curve.
I assume you have actual (benchmark) numbers to prove this? AMD or Intel could just as well artificially limit their TDP to much lower levels and get higher efficiency in multiple tasks, not to mention the M1 also has superior, more efficient LPDDRx memory. Just as an example, you can also do this on GPUs to make them more efficient.
 

GuruZac

macrumors 68040
Sep 9, 2015
3,748
11,733
⛰️🏕️🏔️
What makes you think that? Intel has already bought a lot of capacity at TSMC. Furthermore, it has been reported that Apple, NVIDIA, and AMD want to cancel their orders at TSMC due to lack of demand. So if Intel wants even more capacity at TSMC, they can buy more.

Intel at 3nm, produced by TSMC, is coming, which would solve a lot of problems for Intel.
For sure, but Apple will still have the advantage of the entire integration of hardware and software, and those optimizations should not be underestimated.
 

Gerdi

macrumors 6502
Apr 25, 2020
449
301
At lower wattages the numbers are obviously different, since lowering the power improves efficiency. Zen3+, for example, should be within 20% of the M2 at 10W, since it has twice as many performance cores and can dramatically underclock them during multicore operation, reaching a more advantageous point on the efficiency curve.

This does not make much sense. We have seen that the 6800U loses 45% of its performance when going from 30W to 12W, and it is probably closer to 50% when going below 10W. That would already make it 25% slower than the 8cx Gen 3. So how come the difference is only 20% compared to the M2?

PS: I hope you do not use Cinebench as a reference for efficiency. ARM CPUs have to do significantly more work when running Cinebench compared to x64 CPUs.
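
Spelling the arithmetic out (the 1.5x ratio at 30W is an assumption implied by those percentages, not a cited benchmark):

```python
# Working backwards through the percentages above. The 30W baseline
# (6800U = 1.5x the performance of an 8cx Gen 3) is an implied
# assumption, not a cited benchmark result.
perf_6800u_30w = 1.5      # relative to 8cx Gen 3 = 1.0
loss_below_10w = 0.50     # "probably closer to 50% when going below 10W"

perf_6800u_low = perf_6800u_30w * (1 - loss_below_10w)  # = 0.75
deficit = 1.0 - perf_6800u_low                          # = 0.25, i.e. 25% slower
print(f"6800U below 10W: {perf_6800u_low:.2f}x the 8cx Gen 3 ({deficit:.0%} slower)")
```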
 
Last edited:

dgdosen

macrumors 68030
Dec 13, 2003
2,817
1,463
Seattle
Apple clearly won the PR game with the M1, in both the Apple and PC worlds. It cast a big shadow over Intel and AMD. Even Intel's CEO praised them. The arrival of the M2, on the other hand, has been very underwhelming. People in the media and tech enthusiasts aren't gushing over Apple Silicon like they used to. People in the PC world aren't respecting the M2 like the M1. Apple has left themselves vulnerable, and the upcoming Intel and AMD chips will force people to reevaluate Apple's chips.

Apple better be scared as nothing can stop Intel winning every other game. Just look at Intel crush those quarterly earnings!

Never mind.

I have more shock than awe for Intel right now. If they wrongly think consumers only see computers with long battery life and efficient processors as a "nice-to-have" compared to raw performance wins... wait until the cost-conscious server market kicks them to the curb as well.

In the name of competition, I hope Intel is in as good a position as the OP claims.
 

Abazigal

Contributor
Jul 18, 2011
20,392
23,893
Singapore
Apple better be scared as nothing can stop Intel winning every other game. Just look at Intel crush those quarterly earnings!

Never mind.

I have more shock than awe for Intel right now. If they wrongly think consumers only see computers with long battery life and efficient processors as a "nice-to-have" compared to raw performance wins... wait until the cost-conscious server market kicks them to the curb as well.

In the name of competition, I hope Intel is in as good a position as the OP claims.

I don’t think Intel feels that way. They only pretend to overlook it because they know they can’t match Apple when it comes to performance per watt. Intel will find that technically winning on raw performance benchmarks will amount to little in the greater scheme of things.
 
  • Like
Reactions: eltoslightfoot

leman

macrumors Core
Oct 14, 2008
19,521
19,677
This does not make much sense. We have seen that the 6800U loses 45% of its performance when going from 30W to 12W, and it is probably closer to 50% when going below 10W. That would already make it 25% slower than the 8cx Gen 3. So how come the difference is only 20% compared to the M2?


PS: I hope you do not use Cinebench as a reference for efficiency. ARM CPUs have to do significantly more work when running Cinebench compared to x64 CPUs.

Well, CB results are unfortunately the only ones available to us. CB represents the best possible case for x86 and the worst possible case for Apple, so any comparison using it will severely underestimate Apple Silicon, but it's still useful as a proxy. In particular, it can show us how performance changes with power consumption.

So what I've written in my post are ballpark estimates. I'd expect a 4+4-core M2 to be at least 20% faster across a range of sustained workloads than an 8-core Zen3+ running at the same 10W limit (note that Zen3 only uses 1.25W per core here while Apple uses over 2W, giving Zen a more advantageous placement on the efficiency curve). A hypothetical 8P Apple Silicon chip at 10W would probably be at least 40% faster than Zen3+. And a 4-core Zen3+ would probably be just over half the speed of 4+4 Apple Silicon, since AMD would need to boost the cores and lose efficiency.

It's really difficult to reason about this stuff since power/performance is not linear. I'm just trying to make the best guess using the curves we know about (they are all of fairly similar exponential shape anyway).
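
To make that concrete, here is a minimal toy model of the curve argument. The power-law form and the exponent are illustrative assumptions, not fitted data:

```python
# Toy model: per-core performance grows as (per-core watts) ** ALPHA,
# mimicking the diminishing returns of real frequency/voltage curves.
ALPHA = 0.4  # illustrative assumption, not a fitted value

def chip_perf(n_cores: int, budget_w: float) -> float:
    """Total throughput when budget_w is split evenly across n_cores."""
    per_core_w = budget_w / n_cores
    return n_cores * per_core_w ** ALPHA

BUDGET_W = 10.0  # the shared power limit discussed above
for cores in (4, 8):
    print(f"{cores} cores @ {BUDGET_W / cores:.2f} W/core -> "
          f"relative perf {chip_perf(cores, BUDGET_W):.2f}")
```

Under any curve of this shape, spreading the same budget across more, slower cores wins on throughput, which is why an 8-core Zen3+ can stay within striking distance of a 4+4 M2 at 10W despite less efficient cores.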
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
small improvements in the M2 made it pretty clear that there is a ceiling to the annual improvements Apple can make. Not because they aren't capable, but because of Moore's Law: every chip has been falling further and further behind the expected 2x annual increase.
Hennessy believes that the future of hardware is domain-specific architecture, and Apple has a head start over AMD/Intel.
 
  • Like
Reactions: eltoslightfoot

pshufd

macrumors G4
Oct 24, 2013
10,149
14,574
New Hampshire
I don't get why people care so much about benchmarks. I mean, at the end of the day you can have the beefiest processor in the world, but if the "lower scoring" one still feels and runs better at everyday tasks, then what is the point? It's not like everyone who buys their compute equipment actually needs a 1 zillion single-core score while eating 1 billion watts, you know.

I care about benchmarks because I want to know how something will run my workload. I find that Geekbench 5 provides an accurate reflection of my workload performance (assuming enough RAM, fast storage and enough display channels).
 
  • Like
Reactions: eltoslightfoot

leman

macrumors Core
Oct 14, 2008
19,521
19,677
Hennessy believes that the future of hardware is domain-specific architecture, and Apple has a head start over AMD/Intel.

Do they, though? I mean, they have a slight head start in low-power inference hardware, but that's about it. One might mention Apple's AMX, but Intel has had AVX-512 filling that niche for a while, and they are releasing their own matrix accelerators (albeit for limited precision only) with the upcoming Xeon series. And I know from a little bird that AMD is working on integrating a vector processor, although it's not quite clear when that will happen. At any rate, AMD has plenty of experience with vector stuff, being a GPU company and all.
 

pshufd

macrumors G4
Oct 24, 2013
10,149
14,574
New Hampshire
I don’t think Intel feels that way. They only pretend to overlook it because they know they can’t match Apple when it comes to performance per watt. Intel will find that technically winning on raw performance benchmarks will amount to little in the greater scheme of things.

Did you see Intel's "earnings" yesterday afternoon? I'd guess that Pat didn't like letting the bad news out.
 