
Icelus

macrumors 6502
Nov 3, 2018
421
574
Qualcomm Snapdragon X Elite Powered ASUS Vivobook S15 Laptop Seeing Linux Patches
For those interested in laptops powered by Qualcomm's Snapdragon X Elite SoC, it's looking like the ASUS Vivobook S15 model could be one of the first devices with decent Linux support. There are patches under review for upstreaming ASUS Vivobook S 15 DeviceTree support; with them, much of the basic functionality works under Linux, though various features are known to be broken.
 
Last edited:

komuh

macrumors regular
May 13, 2023
126
113
It looks like a game changer in the PC world. An Intel laptop drains about 1 Wh per hour on standby, while a Qualcomm laptop drains 0.7 Wh over 12 hours.
If I understood correctly, Qualcomm and the new "Intel" chips are trying to work around bad Windows standby by using a new version of it plus some hardware tricks? (The Core Ultra 5 125H seems to use about 2% of battery over 9h of standby in other test videos, so maybe this test is an outlier, or the older Intel laptop's manufacturer just didn't fully implement Modern Standby?)
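A quick back-of-the-envelope check on those drain figures (sketch only; the ~70 Wh battery capacity is my assumption, not from the thread):

```python
# Convert the quoted standby-drain claims into %-of-battery per hour,
# assuming a typical ~70 Wh laptop battery (assumption, not from the thread).
BATTERY_WH = 70

def pct_per_hour(wh_drained: float, hours: float) -> float:
    """Energy drained over a period -> percent of battery per hour."""
    return 100 * wh_drained / hours / BATTERY_WH

print(pct_per_hour(1.0, 1))    # Intel claim: 1 Wh per hour   -> ~1.43 %/h
print(pct_per_hour(0.7, 12))   # Qualcomm claim: 0.7 Wh/12 h  -> ~0.08 %/h
print(2 / 9)                   # Core Ultra 5 125H test: 2% in 9 h -> ~0.22 %/h
```

Taken at face value, the Intel figure sits well above the 125H test result, which supports the outlier/older-firmware reading.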
 

varezhka

macrumors member
Jun 10, 2022
73
55
It looks like a game changer in the PC world. An Intel laptop drains about 1 Wh per hour on standby, while a Qualcomm laptop drains 0.7 Wh over 12 hours.

Wow, that really does change everything. And it's probably worth the reduced software compatibility of ARM, at least for my work usage.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
Against AMD’s Phoenix and Intel’s Meteor Lake iGPUs, Adreno X1 has competitive compute throughput for basic 32-bit and 16-bit floating point math operations. The Snapdragon X Elite has better DRAM bandwidth than the competition, thanks to a very fast LPDDR5X controller. Qualcomm also deserves credit for flexibly using their GMEM block as local memory, render cache, or a tiled rendering buffer depending on what the situation calls for.
But Adreno X1’s cache bandwidth is low and latency is mediocre. Register file capacity isn’t high enough considering Adreno’s very wide wave sizes. GMEM flexibility is great, but Adreno still feels like a GPU optimized for the DirectX 11 era where pixel shader work dominates. On the compute side, performance with 64-bit integers is poor, and FP64 support is absent. Drivers and supporting software are in rough shape.
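For anyone wanting to verify claims like the missing FP64 support on their own hardware, here's a minimal pyopencl sketch (assuming an OpenCL runtime and driver are installed; nothing in it is Adreno-specific):

```python
# Enumerate GPUs visible to OpenCL and report FP64/FP16 extension support
# plus local memory size. Device names are whatever your system reports.
import pyopencl as cl

for platform in cl.get_platforms():
    for dev in platform.get_devices(device_type=cl.device_type.GPU):
        has_fp64 = "cl_khr_fp64" in dev.extensions
        has_fp16 = "cl_khr_fp16" in dev.extensions
        print(f"{dev.name}: FP64={'yes' if has_fp64 else 'no'}, "
              f"FP16={'yes' if has_fp16 else 'no'}, "
              f"local mem={dev.local_mem_size // 1024} KiB")
```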
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
Against AMD’s Phoenix and Intel’s Meteor Lake iGPUs, Adreno X1 has competitive compute throughput for basic 32-bit and 16-bit floating point math operations.

High throughput doesn't matter much if you can't actually leverage it. I always wondered why Adreno performs so well on simple graphics benchmarks and so badly in real-world software (especially compute), and these deep dives explain it very well. Adreno is built to excel at mobile-level graphics with large triangles and simple shaders. It is not a general-purpose GPU architecture. What's more, if Qualcomm wants to improve their GPUs' ability on more complex workloads, they will need to make decisions that reduce the peak compute — or invest much more die area in the GPU. There is no magic bullet here.
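A rough illustration of this point, as a pyopencl sketch (assuming an OpenCL runtime; kernel sizes and constants are arbitrary): the first kernel is pure FMA work that tracks a GPU's paper throughput, while the second is dominated by scattered memory reads, the kind of behavior that exposes weak caches and register files regardless of peak FLOPs.

```python
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(
    ctx, properties=cl.command_queue_properties.PROFILING_ENABLE)

src = """
__kernel void fma_bound(__global float *out) {
    float a = get_global_id(0) * 1e-6f, b = 1.0001f, c = 0.9999f;
    for (int i = 0; i < 4096; i++) a = fma(a, b, c);   // ALU-limited loop
    out[get_global_id(0)] = a;
}
__kernel void mem_bound(__global const float *in, __global float *out, int n) {
    int gid = get_global_id(0);
    int idx = (gid * 9973) % n;   // strided/pseudo-random reads defeat caching
    out[gid] = in[idx] + in[(idx + 4099) % n];
}
"""
prg = cl.Program(ctx, src).build()

n = 1 << 22
mf = cl.mem_flags
inp = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR,
                hostbuf=np.random.rand(n).astype(np.float32))
out = cl.Buffer(ctx, mf.WRITE_ONLY, n * 4)

for kernel, args in (("fma_bound", (out,)),
                     ("mem_bound", (inp, out, np.int32(n)))):
    evt = getattr(prg, kernel)(queue, (n,), None, *args)
    evt.wait()
    ms = (evt.profile.end - evt.profile.start) * 1e-6
    print(f"{kernel}: {ms:.2f} ms")
```

On an architecture tuned for simple shaders you'd expect the first number to look great and the second to fall off a cliff.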
 

Chuckeee

macrumors 68040
Aug 18, 2023
3,060
8,722
Southern California
High throughput doesn't matter much if you can't actually leverage it. I always wondered why Adreno performs so well on simple graphics benchmarks and so badly in real-world software (especially compute), and these deep dives explain it very well. Adreno is built to excel at mobile-level graphics with large triangles and simple shaders. It is not a general-purpose GPU architecture. What's more, if Qualcomm wants to improve their GPUs' ability on more complex workloads, they will need to make decisions that reduce the peak compute — or invest much more die area in the GPU. There is no magic bullet here.
So are you saying the Adreno GPUs are optimized for benchmarks?
 

crazy dave

macrumors 65816
Sep 9, 2010
1,450
1,221
So are you saying the Adreno GPUs are optimized for benchmarks?
In fairness, they are a bit better than that, but only a bit. Architecturally, the Adreno GPU should perform decently in many games (well, those where the driver doesn't crash or produce too many artifacts), and indeed chipsandcheese (and other reviewers) have shown that it does so in practice (again, for games the drivers don't choke on). But yes, once you need to do anything complicated or compute-oriented, the GPU's hardware becomes, to put it kindly, suboptimal, and the driver/software situation gets even worse. Basically, the Adreno GPU is everything naysayers (incorrectly) claimed about Apple Silicon GPUs when Apple first made the transition to the M1: very mobile-games focused and struggling to expand its capabilities beyond that. Intel's and especially Qualcomm's struggles with drivers and software stand in stark contrast with how comparatively smoothly Apple's effort to enlarge its GPUs went, and highlight what a monumental undertaking that must have been internally. Granted, Apple had a smaller library of applications and games to worry about, and Apple's software/drivers are far from perfect (no software that complicated will be anywhere close), but the fact that the transition to bigger GPUs that could handle the greater demands of a PC workload went as well as it did shows how carefully Apple planned it.

Future generations of hardware, drivers, and software may improve things for Adreno, but Qualcomm has its work cut out for it here, much more so than on the CPU side. If Qualcomm's GPU (and the other SOC accelerators beyond the NPU) are as subpar as they seem to be, then AMD and Intel remain the more compelling Windows/Linux option, offering "good enough" CPU performance plus default software compatibility and accelerators that are as good if not much better. With the rise of SOCs, it's about the whole package, pun intended.

A tangent for a small, amusing history note: the Adreno GPU traces its design lineage back to ATI/AMD, who licensed and then sold their mobile handheld graphics division to Qualcomm in 2009. That was a long, long time ago, obviously, so I doubt it has much practical relevance to anything today; it's just interesting to note. I bet if someone were to deep dive on modern Adreno chips they'd probably still find "Imageon" IP blocks - similarly, for Apple CPUs and GPUs you can find the ghosts of IP past if you dig into them (stuff derived from PA Semi and PowerVR, and even where those companies had gotten their IP).
 
Last edited:

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
Qualcomm-based notebooks seem to have been well received.
A fifth of computers sold during launch week were AI PCs, according to data provided by market researcher Circana.
Avi Greengart, an industry analyst at Techsponential who helped host a Qualcomm promotional event, said in an interview that battery life rather than AI is the main selling point of the laptops at this point.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
No way I'm picking one of these at the moment.
For those interested in a Snapdragon X-based laptop on Linux, a Phoronix user has written up his first impression of his new Dell XPS 13 Snapdragon X.
First impression is that the Dell XPS is very nicely designed. It's quite small compared to my Macbook M2 Pro. A bit too small, I think. Today I mostly wrestled with getting a fresh Windows install without any Dell bloat. Got that working. In the coming weeks I'm going to experiment with Linux. Things I noticed: it can get quite loud and also a bit warm. My Macbook Pro M2, for comparison, always feels cool and does not make any noise. So that is definitely a negative. Also, the top row of keys is a capacitive part of the upper body. Who designs these things?! No physical escape key is really bad, especially when developing and just generally running around your OS. Really crappy. But I do like that it's a small, nicely designed machine. So overall, after half a day, I am quite positive. But yeah, it's not my main driver, and the loudness and temperature compared to a Macbook would put me off using it for main dev work. I just can't go back to a real jet engine next to me. But for a general daily tinker thing, it's amazing.

Time will tell if these laptops live up to expectations. I hope more and more people will write their impressions about Snapdragon X-based notebooks.
 

Confused-User

macrumors 6502a
Oct 14, 2014
850
984
For those interested in a Snapdragon X-based laptop on Linux, a Phoronix user has written up his first impression of his new Dell XPS 13 Snapdragon X.


Time will tell if these laptops live up to expectations. I hope more and more people will write their impressions about Snapdragon X-based notebooks.
I'm finding it all very disappointing.

I knew that from a tech perspective, the SXE was not even in the same ballpark as Apple's M3/M4 chips. But I had hopes that they would be close enough to give Apple a taste of competition for the MBA's niche. That doesn't appear to be happening. (They might give the M3 Pro a little bit of a push, due to 6 P instead of E cores in the SXE, but I doubt it, while the Max is utterly unassailable. M4s will be a blowout. But maaaaybe there will be just the tiniest bit of pricing pressure on the MBPs, if QC can restrain their greed. It looks like that might be happening.)

Where are the fanless designs? Just how bad is the SXE when run at low enough power to run fanless? It shouldn't be all that horrible, based on the early benchmarks we saw.

Fortunately, so far Apple seems to be pushing itself, regardless of the lack of real competition. But I don't think that can last forever. Sooner or later, they could succumb to the Intel Disease. Or even in a better scenario, they could let their attention drift to other silicon (AI, radio, etc.) because their CPU position is so far ahead of the field. So for that reason alone I hope QC can do better next time around... and that someone does the best that can be done with the current SXE in a fanless laptop.

Of course this says nothing about the Adreno GPU. With no indication that they're stepping up their game there, that appears hopeless in the short term.

With the SXE ripe for demolition, I think the chance that Apple will rebuild bootcamp is going up a bit. In one sense it's a distraction, but on the other hand, it's a way for Apple to take a giant victory lap, as bootcamp M3/M4s will be the best Windows laptops in the world by a large margin, by many metrics (assuming WoA continues to improve).
 
  • Like
Reactions: Homy

Confused-User

macrumors 6502a
Oct 14, 2014
850
984
Oops... in case "6 P instead of E cores in the SXE" wasn't clear: The M3 Pro is 6P + 6E, the SXE is 12P. So by that one metric the difference between the two is that 6 cores are P instead of E.
 

varezhka

macrumors member
Jun 10, 2022
73
55
For those interested in a Snapdragon X-based laptop on Linux, a Phoronix user has written up his first impression of his new Dell XPS 13 Snapdragon X.


Time will tell if these laptops live up to expectations. I hope more and more people will write their impressions about Snapdragon X-based notebooks.

That's unfortunate, though not too surprising coming from a Dell XPS. That line has always been a bit lacking in terms of chassis thermals. I've seen enough coworkers burned (both literally and figuratively) by the previous iterations.
 
  • Like
Reactions: MiniApple

crazy dave

macrumors 65816
Sep 9, 2010
1,450
1,221
I'm finding it all very disappointing.

I knew that from a tech perspective, the SXE was not even in the same ballpark as Apple's M3/M4 chips. But I had hopes that they would be close enough to give Apple a taste of competition for the MBA's niche. That doesn't appear to be happening. (They might give the M3 Pro a little bit of a push, due to 6 P instead of E cores in the SXE, but I doubt it, while the Max is utterly unassailable. M4s will be a blowout. But maaaaybe there will be just the tiniest bit of pricing pressure on the MBPs, if QC can restrain their greed. It looks like that might be happening.)

Where are the fanless designs? Just how bad is the SXE when run at low enough power to run fanless? It shouldn't be all that horrible, based on the early benchmarks we saw.

Fortunately, so far Apple seems to be pushing itself, regardless of the lack of real competition. But I don't think that can last forever. Sooner or later, they could succumb to the Intel Disease. Or even in a better scenario, they could let their attention drift to other silicon (AI, radio, etc.) because their CPU position is so far ahead of the field. So for that reason alone I hope QC can do better next time around... and that someone does the best that can be done with the current SXE in a fanless laptop.

Of course this says nothing about the Adreno GPU. With no indication that they're stepping up their game there, that appears hopeless in the short term.
Here's to hoping version two is better next year, but yeah, it is a little disappointing. It's important to remember that their primary competitors are Intel and AMD, and their CPU architecture looks good - very good, even, depending on how the upcoming Ryzen/Lake chips fare in the wild. It's just that they need that advantage, or an even bigger one, just to convince people to try them rather than x86. But the CPU is not the part that worries me. The rest of the SOC is a drag (okay, the NPU is fine, if they ever write Windows drivers for it) rather than a boon, and the support/surrounding software is dismal. So yes, lots of things need improvement just to compete against their fellow Windows machines running x86, never mind Apple.
With the SXE ripe for demolition, I think the chance that Apple will rebuild bootcamp is going up a bit. In one sense it's a distraction, but on the other hand, it's a way for Apple to take a giant victory lap, as bootcamp M3/M4s will be the best Windows laptops in the world by a large margin, by many metrics (assuming WoA continues to improve).
With respect to bootcamp... I dunno, I don't think the chances go up very much. All the issues are still there. Apple and MS would have to cooperate to make the necessary changes to the Windows kernel and write all the Windows drivers for the various accelerators, especially for DirectX, which, if you've been following Alyssa's blog, wouldn't necessarily be easy. It can be done, but it's not trivial: DirectX makes assumptions that Apple GPUs just don't follow. You can work around them, but you do have to do workarounds, and though to be fair every GPU requires some level of that, Apple GPUs would simply require more. None of that is necessarily a dealbreaker, though we've seen how difficult drivers are with Windows on ARM for Qualcomm, and that was after years of close cooperation. The real problem is the one Qualcomm and MS are facing in the video that @Xiao_Xi posted: support. Apple and MS would have to take responsibility for when things inevitably go wrong or have bugs or need optimizations, and that's the most expensive part long-term.

Add to that, Apple may not see the benefit of such a deal. The only immediate benefit is potentially quashing Qualcomm's PC aspirations, but as much as Apple dislikes Qualcomm, it isn't a direct competitor the way the MS Windows and Google Android ecosystems are, and bootcamp wouldn't do much to quash Qualcomm's smartphone chips. From Apple's perspective, right now they have the best hardware available, and long-term they want as many applications and games to be as native as possible (or, with GPTK and VMs, at least not to physically leave macOS). Bootcamp could, in theory, slow things down if they are succeeding in convincing devs to port. However, I could see the calculations changing if their own native/porting efforts stall completely. Given the amount of effort and resources it would take, Apple especially would really have to want it, and MS would have to show a level of competence they haven't so far. Not impossible, still not likely though.
 
Last edited:
  • Like
Reactions: Chuckeee

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
It seems that notebook manufacturers are not very happy with Microsoft and Qualcomm.


I only got about halfway through this (I skipped through subsections of the second half to see if the theme changed; it didn't), but the characterization above is off.

This video spends most of its time talking to a game developer, not a notebook manufacturer. Next up in "moaning and groaning" time are things that are wrong with Windows (11) that are almost completely independent of x86 vs. Arm. The quirk here is that the whole discussion is also heavily grounded in a gaming context. (Not too surprising that this guy also shows up as a guest on "Moore's Law Is Dead".)

The numerous references to "Windows RT" are lacking context. Nuvia's cores were not Qualcomm's until they bought them; trying to spin this as "version 3" is a big leap. The Arm CPU cores are new. (The "Snapdragon" prefix is being reused, but the "X Elite" and "X Plus" suffixes are entirely new.) It is kind of hard to ship an "early development kit" on the new CPU cores when there are no previous versions of those cores. (Nuvia's cores were not even done when Qualcomm bought them.)

The only "old-ish" piece on a linear path from Windows RT would perhaps be Qualcomm's GPU, and I think he grossly misses the plot there. Attacking Nvidia in the WinPC space was not at the top of the priority list. When Intel tried to attract Windows games to its dGPUs, there were lots of messy driver issues. Pulling game devs out of the old comfort zone of legacy iGPUs they half paid attention to, and away from Nvidia/AMD dGPUs and all of the assumptions they stuffed into their apps... sure, stuff breaks. It really shouldn't be all that surprising.


Windows 11 Start menu moaning... not anything substantive to do with "Arm". Windows 11 ads... nope. Marketing stickers on laptops... nope. So the "and Qualcomm" is a bit of a stretch for most of the material here.

Qualcomm doesn't have the wrap-around, close hand-holding system-vendor infrastructure that Intel has; neither does AMD. Apple's ramp toward being more "game app" friendly took years. He is cherry-picking how rapidly Apple deployed their software/driver stack on the "new" Arm/custom-GPU direction.
 
  • Like
Reactions: MiniApple

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
With the SXE ripe for demolition, I think the chance that Apple will rebuild bootcamp is going up a bit. In one sense it's a distraction, but on the other hand, it's a way for Apple to take a giant victory lap, as bootcamp M3/M4s will be the best Windows laptops in the world by a large margin, by many metrics (assuming WoA continues to improve).

That is likely exactly causally backwards. The larger the "speed" advantage Apple has, the more acceptable Windows in a VM becomes. The probability of Bootcamp returning would more likely go down, not up.

Apple isn't going to drag in UEFI- and Pluton-spec-compliant abilities. Again, that probability goes "backwards" and "down" if Apple gains an advantage by not spending time, effort, and money on those things and ends up with a substantive performance/security edge.

It's delusional to think Apple jumps out of bed in the morning with the primary objective of making the best Windows laptops. Windows on Arm isn't going to lead to a "Wintel" situation (where Windows is 90% just one CPU vendor); the primary point of WoA is so that Microsoft does not end up in a single-vendor monopoly on the hardware side.
 

crazy dave

macrumors 65816
Sep 9, 2010
1,450
1,221
I only got about halfway through this (I skipped through subsections of the second half to see if the theme changed; it didn't), but the characterization above is off.

This video spends most of its time talking to a game developer, not a notebook manufacturer. Next up in "moaning and groaning" time are things that are wrong with Windows (11) that are almost completely independent of x86 vs. Arm. The quirk here is that the whole discussion is also heavily grounded in a gaming context. (Not too surprising that this guy also shows up as a guest on "Moore's Law Is Dead".)

The numerous references to "Windows RT" are lacking context. Nuvia's cores were not Qualcomm's until they bought them; trying to spin this as "version 3" is a big leap. The Arm CPU cores are new. (The "Snapdragon" prefix is being reused, but the "X Elite" and "X Plus" suffixes are entirely new.) It is kind of hard to ship an "early development kit" on the new CPU cores when there are no previous versions of those cores. (Nuvia's cores were not even done when Qualcomm bought them.)

The only "old-ish" piece on a linear path from Windows RT would perhaps be Qualcomm's GPU, and I think he grossly misses the plot there. Attacking Nvidia in the WinPC space was not at the top of the priority list. When Intel tried to attract Windows games to its dGPUs, there were lots of messy driver issues. Pulling game devs out of the old comfort zone of legacy iGPUs they half paid attention to, and away from Nvidia/AMD dGPUs and all of the assumptions they stuffed into their apps... sure, stuff breaks. It really shouldn't be all that surprising.

Windows 11 Start menu moaning... not anything substantive to do with "Arm". Windows 11 ads... nope. Marketing stickers on laptops... nope. So the "and Qualcomm" is a bit of a stretch for most of the material here.

Qualcomm doesn't have the wrap-around, close hand-holding system-vendor infrastructure that Intel has; neither does AMD. Apple's ramp toward being more "game app" friendly took years. He is cherry-picking how rapidly Apple deployed their software/driver stack on the "new" Arm/custom-GPU direction.
I agree with some of your criticisms, but I will also say the video makes some good points: remarking that this is their third go-around, the dev-kit problems have to do with everything around the new CPU cores rather than the CPU cores themselves. You acknowledge that to some extent, but it's an important point. Windows on ARM, the software stack, the drivers - all of that should've been building off previous iterations, and it did to some extent, just not very well. And they still haven't released a dev kit that is actually a dev kit for the platform. Yes, Intel also had issues when it scaled its iGPUs up to dGPUs, but that's another factor the video addresses: the lack of a definable support structure at Qualcomm/Microsoft for addressing problems. You mention Intel as doing well here, but in the video they say AMD and Nvidia do too. In a year, Qualcomm and Windows on ARM may be in a much better place - hopefully for their sakes they will be - but the lackluster execution on everything around the CPU is worth noting.

That said, yes, a lot of the criticism in the video is actually leveled at MS and has little to do with ARM per se. Though that's largely the point of the video: moving to ARM doesn't magically fix all the other problems Windows has. It just gives Windows ARM machines better power efficiency relative to the current crop of x86 machines; all the other poor decisions are still there. And that's really what he's harping on. The issues MS is having getting this particular transition right, despite years and years of trying to make Windows on ARM a thing, including many years with Qualcomm as a partner, are, to him, emblematic of MS's struggles in general.

I would also add that I think he undersells the Qualcomm CPU cores. Sure, they maybe aren't quite as good as they could've been, and they're definitely later than they should've been (although given how unprepared MS/Qualcomm seem to have been for the launch anyway, maybe that was for the best), but they are still quite good cores. His repeated framing of them as underwhelming is, I think, a disservice - though I have my own issues with how Qualcomm approached the overall CPU design, especially for the primary device target, which I laid out in previous posts.
 
Last edited:

crazy dave

macrumors 65816
Sep 9, 2010
1,450
1,221
I re-visualized the Cinebench R24 Notebookcheck data into bubble chart form:

[Attached image: bubble chart of the Cinebench R24 Notebookcheck data, performance vs. power]


Inside the bubbles for single-threaded efficiency I list the points per watt (the size of the bubble corresponds to the points/watt value); for multi-threaded efficiency I list both the score and the points per watt inside the bubbles (or around them when it got too crowded), as it gets a little trickier to track there - I have a point about that later. Here are the standout observations that I see:

1) The Qualcomm core doesn't quite match Avalanche's (the M2 P-core) performance/efficiency in CB R24.

2) The Qualcomm Elite 78 is in an Asus laptop, while the 64 and 80 are in Microsoft Copilot+ devices (tablet hybrids, I believe). My working hypothesis is that the Asus has much worse power-delivery efficiency to the chip under load. It *should* be better-binned silicon than the 64, but it is obviously worse than both the 64 and the 80, which either achieve better ST performance at the same wattage or lower wattage at the same ST performance. Unfortunately, for MT Notebookcheck only had performance curves for the Asus, but again we can see that the 80 and 64 are superior (the 64 is technically slightly worse, but given that it has 2 fewer cores it should be much, much worse, and isn't - roughly 5% less efficient at about the same power/performance). This is where having software measurement of core power, to compare its estimates against the hardware measurement, would've been really beneficial.

3) The Ryzen 7 does not come off well here at all. The Intel chip, not pictured, is worse, but in single-core the AMD Ryzen pulls almost as much wattage as the entire 64/78/80. It is nearly 3x less efficient than the M2 Pro and 2-2.6x less efficient than the Oryons, for much less ST performance - in other words, if they boosted it further to match that performance, its efficiency would get even worse. In multicore, its best showing is around the 56W mark, where it closes the efficiency gap, but once again, when it actually tries to match the Snapdragon's performance at that wattage, it has to draw over 80W and still doesn't manage it. And as I said, the 80-class chip would've been even better: it achieves at 39.6W nearly the same level of performance (within 2%) as the AMD at 82.6W - again, almost 2x the efficiency at that performance level. This is why I wanted to emphasize the score along with the efficiency in the multi-threaded test. Having said all that, at 56W the AMD processor gets close (within 12-20% efficiency) to the Snapdragon/M2 Pro, and I suspect the 30-60W range is where the AMD chip is best suited. Between a particularly inefficient OEM implementation of the Snapdragon chip and a particularly good implementation of the AMD chip (the AMD is in a German Schenker VIA; I don't know its reputation), yeah, they could absolutely line up. It's also not clear what the Snapdragon perf/W curve looks like below 35W; if it steepens (likely), it could naturally match the AMD processor there. But as bad as the perf/W of the Asus Oryon is relative to the MS Oryon here, its perf/W curve is still clearly above AMD's for the tested values.

4) The two MS Snapdragons kind of support the idea that the 12-core Snapdragon chip is hamstrung. At roughly the same power, with two more cores, the 80-class only musters ~10% greater efficiency than the 64-class processor. That's not *bad*, but I *think* it really should be better, closer to 20%.

Caveats: Qualcomm's CB R24 scores relative to Apple don't look as good as one would think they should, and in GB6's short ray-tracing subtest the Oryon cores improve significantly relative to Apple's M2 Pro, coming out closer to what one would expect from the design. However, CB R24 is even worse for the AMD processor, and in GB6 it catches up a little (in performance) to both Nuvia and Apple (power untested, but probably still bad, especially in single-core). I might try to recapitulate the Geekbench ISO graphs @leman and I created a while back, but include one of these Snapdragons. Thus, CB R24 may represent a "worst-case scenario" for the AMD processor here, or perhaps more accurately a best case for Apple (a remarkable turnaround from CB R23, which was technically Apple Silicon native but very obviously unoptimized - which might be the case for Qualcomm too). You can see this in the comparisons between single-threaded GB 6.2 and CB R24; multithreaded gets more complicated, and again we lack power measurements for the various processors in GB 6.2.
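For anyone who wants to reproduce this kind of chart, here's a minimal matplotlib sketch of the bubble format (the 39.6W and 82.6W figures are from the post above; the scores are placeholders I made up, since the post only says they're within 2% of each other):

```python
import matplotlib.pyplot as plt

# (MT score, package watts): wattages from the post, scores are placeholders.
chips = {
    "Snapdragon X (80-class)": (1150, 39.6),
    "Ryzen 7 (boosted)":       (1170, 82.6),
}

fig, ax = plt.subplots()
for name, (score, watts) in chips.items():
    ppw = score / watts  # points per watt drives the bubble size
    ax.scatter(watts, score, s=ppw * 40, alpha=0.5)
    ax.annotate(f"{name}\n{ppw:.1f} pts/W", (watts, score),
                ha="center", va="bottom")
ax.set_xlabel("Package power (W)")
ax.set_ylabel("Cinebench R24 MT score")
ax.set_title("Performance vs. power (bubble size = points/W)")
plt.show()
```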
 
Last edited:
  • Like
Reactions: Confused-User