And again, my premise is that just because a CPU draws less power and runs at a lower clock speed does not simply mean it uses less energy to accomplish the same task. Which also implies that running the same task on the same CPU (like the low-power CPUs in an Apple laptop) at a reduced clock does not save the world. It only saves the battery.
A CPU using less power can accomplish the same task as another CPU with less energy, equal energy or more energy...

Your example has a hugely flawed assumption that CPUs get the same amount of work done per unit of energy.
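To put numbers on that point, here's a minimal Python sketch with made-up figures: what determines the energy for a fixed task is work done per joule, so a lower-power CPU can land below, at, or above another CPU's total energy.

```python
# Energy to finish a fixed task depends on work done per joule,
# not on power draw alone. All numbers are hypothetical.

TASK_WORK = 1000.0  # arbitrary units of work in the task

# name: (power draw in watts, efficiency in work units per joule)
cpus = {
    "low-power, efficient":   (10.0, 2.0),
    "high-power, efficient":  (50.0, 2.0),
    "low-power, inefficient": (10.0, 0.5),
}

for name, (watts, work_per_joule) in cpus.items():
    energy = TASK_WORK / work_per_joule  # joules needed for the task
    seconds = energy / watts             # time = energy / power
    print(f"{name}: {energy:.0f} J over {seconds:.0f} s")

# low-power, efficient: 500 J over 50 s
# high-power, efficient: 500 J over 10 s
# low-power, inefficient: 2000 J over 200 s
```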
 
ARM-based chips going mainstream on Windows is good for Mac gaming. The x86-to-ARM emulation layer in the Game Porting Toolkit/CrossOver can be eliminated.
 
Surface laptops have Apple-level build quality and none of the bloatware or garbage that many other manufacturers add... and if they can get the battery life up there with the MacBook Air... I think they're a solid option.

Most people are familiar with Windows... and it's more versatile from a software-compatibility standpoint.

If I had to pick up just one of my devices and was told I could only use that device... it would probably be my Surface Laptop Go 2... it can do everything... it's a jack of all trades.
Thanks, that's good to know. I'm a college instructor, and about half of my students have Windows laptops. For my data analysis and portfolio analysis courses, the software we use works on both platforms, but a bit differently. Sometimes I can't figure things out when my Windows students get stuck on something. So I've been thinking about getting a Windows machine and learning how to do everything on Windows that I do in macOS, so I can better support my students with Windows computers.

Based on your recommendation, I'll take a close look at the Surface laptop.
 
They’re definitely not bad, and compared to some of the low-end junk out there, they have good build quality.

I have 3 executives at my company running on the Surface Pro 9 (I believe?) and they love them.

Note that’s the tablet, not the full-fledged laptop.

These are people who do mostly presentations and spreadsheet type work, so next refresh I may consider piloting an ARM version with one of them to compare.
 
In the end, I still wonder how well they will port Windows so it takes full advantage of the CPU registers on the Snapdragon X Elite SoC. If they can make the Windows APIs work closely with the CPU registers, then we could potentially see a big leap forward in Windows performance.
 
Let me do the math for you: 1 sec * 200 joules/sec (remember that 1 watt = 1 joule/sec) = 200 joules. That is energy; it is not a rate at all. Yep, my original post used the wrong units for the result (watts), which you correctly pointed out should have been joules. Chalk that up to too much other stuff going on.

Second example: 2 sec * 100 joules/sec (again, 1 watt = 1 joule/sec) = 200 joules.

Both tasks consumed the same amount of energy from the world. None was saved. No rate was used. And again, my premise is that just because a CPU draws less power and runs at a lower clock speed does not simply mean it uses less energy to accomplish the same task. Which also implies that running the same task on the same CPU (like the low-power CPUs in an Apple laptop) at a reduced clock does not save the world. It only saves the battery.

Now you might say: but we were talking about different CPUs (Apple ARM vs. Snapdragon ARM). You cannot compare different instruction sets using overall CPU energy consumption, because the instruction set affects both the task duration and the energy consumption. Most of this talk about saving energy is marketing BS in a lot of cases. Simple uses that don't tax the full CPU aren't saving anything substantial. If all a person does is email, then they are not saving the world with Apple devices, even though Apple marketing makes them feel that way.
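Here's that arithmetic as a minimal Python sketch; the wattages and durations are the hypothetical figures from the example above, not measurements of any real CPU:

```python
# Energy = power x time (1 watt = 1 joule/sec). Two hypothetical CPUs
# finishing the same task: one fast and power-hungry, one slow and
# frugal. Illustrative numbers only.

def energy_joules(power_watts: float, duration_s: float) -> float:
    """Total energy consumed by a task: E = P * t."""
    return power_watts * duration_s

fast = energy_joules(power_watts=200.0, duration_s=1.0)  # 200 J
slow = energy_joules(power_watts=100.0, duration_s=2.0)  # 200 J

print(fast, slow)  # 200.0 200.0 -- same energy despite the lower clock
```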
Your math is, as I pointed out earlier, imaginary, because it could only be true for imaginary machines. It wouldn't even be true if you used two identical Intel CPUs! That's because you fall victim to the power/efficiency curve - running an Intel chip at 2x the power doesn't even come close to getting you 2x the speed. Of course, you pay a certain amount of power to run the laptop no matter what the speed (screen, RAM, SSD, IO, etc.). So an Intel laptop running at 45W might get a job done in less than or more than 2x the time the SAME laptop would take running at 90W. You don't know. (But chances are, it would be less than 2x the time.)

In the case of the MacBooks, it's not even close. Running a job on a MacBook Air Mx may take more or less time than on a modern Intel laptop of otherwise equivalent specs (RAM/screen brightness/etc.), depending on the chip and power settings, but it will ALWAYS take less total energy. It *will* conserve joules, a LOT.

That applies even to the person just running email, since Intel systems even idle at higher power than MacBooks, though the difference is probably not that great (I don't recall offhand). But it definitely applies to anyone doing tasks that consume 100% CPU.
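Here's a toy Python model of that curve; the square-root power-to-performance relationship and the 10 W platform floor are illustrative assumptions, not measurements of any real laptop:

```python
# Toy model of the power/efficiency curve. Assumptions (illustrative
# only): CPU performance scales with the square root of CPU power,
# and the rest of the laptop (screen, RAM, SSD, IO) draws a fixed 10 W.

PLATFORM_W = 10.0   # fixed draw regardless of CPU speed
WORK_UNITS = 100.0  # arbitrary size of the job

def runtime_s(total_watts: float) -> float:
    cpu_watts = total_watts - PLATFORM_W
    speed = cpu_watts ** 0.5  # concave curve: 2x power != 2x speed
    return WORK_UNITS / speed

for total_w in (45.0, 90.0):
    t = runtime_s(total_w)
    energy = total_w * t  # E = P * t
    print(f"{total_w:.0f} W laptop: {t:.1f} s, {energy:.0f} J")

# 45 W laptop: 16.9 s, 761 J
# 90 W laptop: 11.2 s, 1006 J
```

Any concave curve gives the same qualitative result: the 45 W run takes well under 2x the time of the 90 W run, and uses noticeably less total energy.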
 
You wrongly assume that the power/performance curve is linear.
 
QC just officially "introduced" the SXE (now you know why it's not the "Snapdragon Elite X"!). They're already backing off their recent claims about performance a bit. The M3, never mind the M4, will hold a comfortable lead over the SXE. Even the M2 may.

But there's bigger news. QC is accused of lying about the benchmarks. There are not enough details yet to understand this clearly, but the idea appears to be that Windows on Arm is a disaster. It will be very interesting to see how this plays out! It will also be interesting to see if the chip runs closer to the claimed numbers under Linux. Early benches showed meaningful improvements there. See https://www.semiaccurate.com/2024/0...g-on-their-snapdragon-x-elite-pro-benchmarks/.
 
The two-core boost suggests “actual” single-threaded performance is even lower than what’s been benchmarked.
Correct; expect a ~2.5% drop from the numbers published for 4.3 GHz. It's not clear what the multicore numbers will look like, since they will be heavily dependent on power/heat budgets.
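A quick back-of-the-envelope in Python, assuming single-core performance scales roughly linearly with clock; the 4.2 GHz two-core boost figure is my assumption for illustration, not a confirmed spec:

```python
# Rough scaling of a single-core score with clock, assuming performance
# is ~linear in frequency over this small range. 4.3 GHz is the
# originally published boost; 4.2 GHz is an assumed two-core boost.

published_ghz = 4.3
assumed_boost_ghz = 4.2

drop = 1 - assumed_boost_ghz / published_ghz
print(f"expected single-core drop: ~{drop:.1%}")  # ~2.3%, the same ballpark as the ~2.5% above
```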

I'm very curious to know what SemiAccurate is talking about.
 
Very informative thread, thanks to the many posters.

ARM on Windows is coming and will be a big thing.

Didn't Apple have a few growing pains optimizing the M1 when it first came out? Wasn't there some update that gave a nice performance boost? So maybe WinARM will need some optimization and then we can really compare.

I know this thread is very Qualcomm-focused. Is the general opinion that QC will own the WinARM (vs. WinTel) chipset market, or will they have a lot of competition? NVIDIA comes to mind, as they must have an insane war chest of cash. It will be interesting to watch this settle out over the next 5 years.
 
Didn't Apple have a few growing pains optimizing the M1 when it first came out? Wasn't there some update that gave a nice performance boost?

Early versions of macOS on ARM had several memory leaks, although that arguably doesn't directly affect performance.

Is the general opinion that QC will own the WinARM (vs. WinTel) chipset market, or will they have a lot of competition? NVIDIA comes to mind, as they must have an insane war chest of cash. It will be interesting to watch this settle out over the next 5 years.

NVIDIA's interest in SoCs seems to come and go.
 
[...]Didn't Apple have a few growing pains optimizing the M1 when it first came out? Wasn't there some update that gave a nice performance boost? So maybe WinARM will need some optimization and then we can really compare.

I know this thread is very Qualcomm-focused. Is the general opinion that QC will own the WinARM (vs. WinTel) chipset market, or will they have a lot of competition? NVIDIA comes to mind, as they must have an insane war chest of cash. It will be interesting to watch this settle out over the next 5 years.
To the best of my recollection, there were no significant performance issues with M1 that required software optimizations after release.

WoA will *definitely* be needing it though. The reason Apple didn't need to do that much work is that iOS and macOS are in most ways the same, so they had well over a decade of experience with their OS family running on ARM even before they released the M1.

As for QC, things aren't looking good in terms of bragging rights - the M4 will, holy cow, actually be shipping in products before the SXE. Even I didn't expect that, and I've been pretty optimistic about the Mx chips.

Bragging rights aren't the same as market dominance, though. Will QC rule? Hard to say, and even harder to know what fraction of the PC laptop market ARM will take from x86. But you know what could give them a terribly hard time? Boot Camp on Apple Silicon. Will Apple do that? No idea. I don't see a lot of downside though. In fact, the arguments in favor of it are pretty similar to the arguments in favor of x86 Boot Camp 17 years ago (or whenever that shipped for the first time).
 
A bunch of QC SXE-based laptops were announced today.

Pricing is a little better than I expected. Dell's premium XPS 13 is priced the same as an MBA 13 with the same RAM and SSD ($1500). It's competitive in hardware features at first glance, and in some ways superior - 4 USB4 ports instead of 2, for example. However, other vendors (ASUS, for example) are making a stronger value play.

Performance is still not fully characterized, but so far everything is playing out as expected. It's crushed by the M3 in single-core, and competitive or superior to the M3 in multicore, sometimes by a lot, depending on the benchmark, with "embarrassingly parallel" workloads like Cinebench giving it the strongest lead. Obviously the M4 changes the picture, closing up some/all of the MC gap while pulling further ahead in SC - but you can't get the M4 in a laptop yet. The M3 Pro is a different story - MC loads are even or at least much closer. BUT - and it's a big but - M3 Pro laptops cost more.

So this is all so far working out as I said it would. The big open questions are:
- How well will these work with legacy (x86) software?
- Is there anything to the rumors of systemic WoA problems causing major performance issues?
- How much better will they work on Linux?

And, of course, what if anything will Apple do in response? I still think we're going to see more M4s at WWDC. And I strongly doubt we'll see a 3-P-core M4 anywhere in the laptop lineup, then or in the future.

I think QC and MS are playing their hands very well so far, given the failure of the SXE to even approach Apple in single-core performance. Pricing is reasonable, which is the single biggest thing they could have screwed up.
 
What is the news here? Was anyone expecting Microsoft to make an announcement claiming the Mac is still better? This is Microsoft we are talking about.
There are several things that qualify as news.

First, the pricing. QC seems to be going all-in on buying market share, which is (I think) a smart strategy, but it was not at all a sure bet, as they've priced previous chips aimed at Windows way too high.

Dell getting into the game is a big deal; it (along with a few other vendors of varying levels of importance) means that a lot of the PC crowd think that SXE is going to have a significant impact on corporate markets. That's a big vote of confidence, and also not a sure thing ahead of time.

Finally, my take on SXE performance (not that I'm the only one who thought this way) has been further validated, though we *still* don't have independent numbers. Most interestingly, it's still beaten badly in single-core by the M3 and even worse by the M4. It's *utterly crushed* in the web-browsing benchmarks used by MS' chosen "competitive analysis" vendor, though that will probably improve a little (not a lot) as Google figures out WoA a little more.

I don't think Apple will feel a ton of pressure from this, yet, but hopefully they'll feel some. I think what they're doing in silicon is great and this won't have any impact there, but perhaps it'll push forward slightly Apple's plans to increase RAM/SSD in base models, and/or decrease the prices of additional tiers. I doubt it'll happen soon, but I think it will happen sooner than it would have otherwise, assuming these things have some success in the market... which is looking likely.
 
I was commenting specifically on the article headline.
 
Reviews are decidedly mixed. It's really quite interesting, though not at all surprising - many of the reviews are intensely biased, in both directions.

For example, one reviewer (at tomsguide.com) claims that the new chip "toasts" the M3. This is based entirely on MC results, mostly a GB6 result that's ~18% faster than the base M3, and a handbrake encode that's ~7% faster. He prints, but doesn't much discuss, the fact that the M3 SC result is ~22% faster than the Elite, which is much more relevant for most people.

On the flip side, I've also seen reviews that completely panned the chip (and the containing laptops) because of compatibility issues with some games, which I think is foolish, or based on GPU performance, which is somewhat more reasonable (and something that I think will remain an open question for a while).

I'm happy to say that I called this exactly right. QC really embarrassed themselves with their claims to beat Apple, overclocked their chips a bit for their early demos, and failed to deliver... but not by that much. They can't compete with the M3, much less the M4, in a like-for-like comparison, on either performance or power, even taking into account process differences.

HOWEVER... in terms of market relevance, that means very little. On price, they *are* competing with the base M3 (in the Air or base Pro 14"). And looked at that way, they are quite reasonable, and a real option for the relatively few people who are OS-agnostic. They seem to be an excellent buy for Windows buyers not doing serious gaming, though that's of less interest to me (and presumably to a Mac-focused group like this forum).

Apple may not care enough to do this, but they should redo Boot Camp, at least for their M4 MBA and MBP when they ship. The base M4s slaughter the Oryon on single-core, and will comfortably beat it on most multicore (though some extremely parallel codes like Handbrake and Cinebench might still give a small advantage to the Oryon... time will tell). If the WoA market takes off, Apple will be strongly competitive in the mid-range and completely own the top end. Nothing could come even close to competing with the M4 Pro, much less Max, assuming relative positioning in the M4 line is similar to M3.

I'm going to make a follow-up prediction now. QC will NOT close up the single-core gap with the M3 in their next generation. They won't have a clear win against the M3 even in the THIRD generation, though I hope they achieve parity, at least roughly. M5 will ship before their second generation, and it will slaughter QC's high-end chip on SC and beat it on MC even with the base version. BUT... QC will do well over the next year because, despite their lame attempts to look better than Apple, what they really need to do is be better than Intel and AMD, and they probably will be. New Intel and AMD chips will be improvements over their current offerings but won't close up the gap entirely. The generation after that... will be interesting.
 