deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
If Microsoft bets on ARM, it would launch Windows 12 in time for the first laptops with Qualcomm SoCs.

Windows 12 isn't going to 'bet the farm' on Arm SoCs. Intel's Core Ultra 100 (Meteor Lake) and AMD's 8000-series laptop APUs both bring shifts (low-power E-cores and compact Zen 'c' cores, respectively) that the scheduler needs adjustments for as well.

The variant that is chasing Chromebooks probably is not myopically fixated on Qualcomm SoCs either.


The center of gravity of Windows 12 doesn't revolve around Qualcomm's SoCs. Not even close.


Anyway, I imagine Qualcomm will have paid Microsoft, not only for exclusivity, but also for a timely release of the next Windows.

Windows 12 is not going to be Arm-only. There is an x86_64 component to it as well. The tail isn't going to wag the dog here. There are a huge number of permutations and combinations that Microsoft has to work through on the legacy systems they'll also still support. It isn't a 'hurry up and release' context.



It would make sense for Microsoft to unveil its next Windows at Microsoft Build at the end of May.

For the beta release... yeah. (e.g., WWDC sees a beta release of macOS that doesn't ship for another several months.) You can't launch a product on a beta, though. The Windows 11 beta was released in June; it didn't formally launch until October.
 

Gnattu

macrumors 65816
Sep 18, 2020
1,105
1,666
Also, if it's slow, then it's slow. Simple. Performance per watt is just a joke when you seriously need power for things like games, 3D, and more.
Can we stop arguing about the M2 series now? We now have the M3 Max with 12P+4E cores, roughly 50% faster in multicore performance than the M2 Max, and it has already reversed the competition if your favorite workload is Cinebench; the M3 Max also ships half a year earlier than the Qualcomm X Elite. Don't get me wrong, the X Elite is probably the most interesting thing we have seen in the CPU industry since the Apple M1, but the timing does not align with the competition in Qualcomm's slide. As a 2024 CPU, it will face 2024 CPUs, not the 2023 (or even 2022 M2) CPUs in that slide. Today Qualcomm may claim they are best in class, the next day Apple may say no, it is us, then next week Intel may say now it's us, then next month AMD may say it's my turn. Everyone is making their best chips, and that is called competition.
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,835
1,706
Can we stop arguing about the M2 series now? We now have the M3 Max with 12P+4E cores, roughly 50% faster in multicore performance than the M2 Max, and it has already reversed the competition if your favorite workload is Cinebench; the M3 Max also ships half a year earlier than the Qualcomm X Elite. Don't get me wrong, the X Elite is probably the most interesting thing we have seen in the CPU industry since the Apple M1, but the timing does not align with the competition in Qualcomm's slide. As a 2024 CPU, it will face 2024 CPUs, not the 2023 (or even 2022 M2) CPUs in that slide. Today Qualcomm may claim they are best in class, the next day Apple may say no, it is us, then next week Intel may say now it's us, then next month AMD may say it's my turn. Everyone is making their best chips, and that is called competition.
Because the X Elite is closer to the M2 than to the M3. 3nm isn't 4nm, and TSMC's 4nm is itself a derivative of its 5nm. Totally different.
 

Gnattu

macrumors 65816
Sep 18, 2020
1,105
1,666
Because the X Elite is closer to the M2 than to the M3. 3nm isn't 4nm, and TSMC's 4nm is itself a derivative of its 5nm. Totally different.
But they are shipping in 2024, right? Right?
You just found an excellent excuse for Intel. Intel's 10nm is closer to TSMC 7nm than to TSMC 5nm, so by that standard Intel should compare itself with the 7nm chips that shipped one or two years earlier, and beat most of them, instead of with the 5nm chips shipping at the same time. By your logic, any comparison of Zen 4 to Intel Alder Lake is prohibited because Intel uses a subpar process; Alder Lake clearly beats 7nm Zen 3 by a large margin, so Intel beats AMD.
 

AeroHydra

macrumors newbie
Oct 30, 2023
11
13
I'll stress again that we have no idea which market the chip will be targeting. No doubt the competition will be one of the M3 chips, but the price will dictate the market. Consumers don't care whether a chip is 3nm vs 4nm, or about die size, core count, etc., at least not in a vacuum. When Intel got stuck on 14nm we didn't say that Ryzen comparisons were unfair, since both Intel and Ryzen chips could be bought by consumers at similar prices. The price of laptops with this chip will ultimately determine whether this is an M3, M3 Pro, or M3 Max competitor.

If this is in $3k+ laptops similar to the M3 Max, then I agree, the chip will likely be underwhelming. But if you can get an X Elite in a $1k machine, then all of a sudden the chip becomes very competitive. I doubt this is meant to compete with the M3 Max, in either performance or price, and that's okay. The vast majority of Macs sold are the ones with the base M1/M2/soon-to-be M3 chips, because most consumers do not need the power of the Max-level chips.

As for performance per watt: for some people it is very important, for others not so much. But I believe that for an ARM chip performance per watt is going to be somewhat more important than it is for x86, since the usual benefit of ARM is lower power consumption and most people buying ARM laptops are looking for good battery life. Power scaling is also a thing: just because a chip can draw a lot of power doesn't mean it has to do so all the time, or that doing so is optimal.
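For a concrete sense of what the metric captures, here is a minimal Python sketch with invented scores and power figures (not measurements of any real chip):

    # Illustrative only: invented benchmark scores and package power,
    # not measured figures for any real chip.
    chips = {
        "hypothetical ARM SoC": {"score": 14000, "watts": 35},
        "hypothetical x86 CPU": {"score": 15000, "watts": 90},
    }
    for name, c in chips.items():
        print(f"{name}: {c['score'] / c['watts']:.0f} points/W")
    # The x86 part "wins" the raw score, but the ARM part does far more
    # work per joule, which matters on battery and barely at the wall.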

I also believe that Qualcomm does not need to best Apple in every regard for this chip to be a success. There are many people who prefer or need Windows, or want a feature that other OEMs offer and Apple does not: things like the Surface 2-in-1 form factor, touch support, game compatibility, multi-screen laptops, OLED screens, etc. There are also people who begrudgingly moved to Mac for the battery life but would still prefer Windows, and would switch back even for a laptop that had, say, 80% of the efficiency and battery life of a Mac.
 

MayaUser

macrumors 68040
Nov 22, 2021
3,177
7,196
Because it was never meant to compete with the M2 Max in terms of GPU; only the CPU does. But I wouldn't worry too much, since Qualcomm has already outperformed the A16 and A17 Pro in GPU performance, and the X Elite's GPU is quite obviously an older version.
It doesn't compare even with the M2 Pro regarding the GPU... place it into a 14" Windows laptop and the scores will be even lower.
 

MayaUser

macrumors 68040
Nov 22, 2021
3,177
7,196
Because the X Elite is closer to the M2 than to the M3. 3nm isn't 4nm, and TSMC's 4nm is itself a derivative of its 5nm. Totally different.
The M3 is already shipping next week, so the gap is even wider... why should we compare the X Elite in June 2024 with products that won't even be sold anymore? That's desperation... you are like Apple comparing the M3 Mac with an Intel Mac and claiming it is 11x faster.
Let's put it this way: in my Maya projects this M3 Max will be 11x faster than a Qualcomm X Elite laptop.
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,835
1,706
But they are shipping in 2024, right? Right?
You just found an excellent excuse for Intel. Intel's 10nm is closer to TSMC 7nm than to TSMC 5nm, so by that standard Intel should compare itself with the 7nm chips that shipped one or two years earlier, and beat most of them, instead of with the 5nm chips shipping at the same time. By your logic, any comparison of Zen 4 to Intel Alder Lake is prohibited because Intel uses a subpar process; Alder Lake clearly beats 7nm Zen 3 by a large margin, so Intel beats AMD.
So? The semiconductor process difference is there, and you cannot ignore it. Besides, only Apple takes advantage of it, which you are clearly ignoring.
 

sunny5

macrumors 68000
Original poster
Jun 11, 2021
1,835
1,706
It doesn't compare even with the M2 Pro regarding the GPU... place it into a 14" Windows laptop and the scores will be even lower.
Tell me, which CPU has a great GPU that is comparable to an external GPU? None.

The M3 is already shipping next week, so the gap is even wider... why should we compare the X Elite in June 2024 with products that won't even be sold anymore? That's desperation... you are like Apple comparing the M3 Mac with an Intel Mac and claiming it is 11x faster.
Let's put it this way: in my Maya projects this M3 Max will be 11x faster than a Qualcomm X Elite laptop.
So what? 4nm vs 3nm? Besides, Apple is the only one taking advantage of the latest semiconductor technology, so I can just as well say Apple HAS to use 3nm in order to compete with 5nm, which is a joke. People hate to admit that Apple is the only one using 3nm, which by itself gives it performance advantages.
 

bcortens

macrumors 65816
Aug 16, 2007
1,324
1,796
Canada
But they are shipping in 2024, right? Right?
You just found an excellent excuse for Intel. Intel's 10nm is closer to TSMC 7nm than to TSMC 5nm, so by that standard Intel should compare itself with the 7nm chips that shipped one or two years earlier, and beat most of them, instead of with the 5nm chips shipping at the same time. By your logic, any comparison of Zen 4 to Intel Alder Lake is prohibited because Intel uses a subpar process; Alder Lake clearly beats 7nm Zen 3 by a large margin, so Intel beats AMD.
I recommend ignoring sunny5. They tend to ignore all evidence that contradicts their position. For example, sunny5 posts that there is no M-series GPU comparable to an external (I assume discrete?) GPU, but we know that just isn't true in a whole host of benchmarks: gaming benchmarks show the M2 Max hanging with mid-range desktop NVIDIA cards and keeping pace on GPU compute with much more power-hungry NVIDIA desktop parts. I've posted links in other threads and don't feel like doing so here again, because they'll just be ignored again.
 

eltoslightfoot

macrumors 68030
Feb 25, 2011
2,545
3,092
I recommend ignoring sunny5. They tend to ignore all evidence that contradicts their position. For example, sunny5 posts that there is no M-series GPU comparable to an external (I assume discrete?) GPU, but we know that just isn't true in a whole host of benchmarks: gaming benchmarks show the M2 Max hanging with mid-range desktop NVIDIA cards and keeping pace on GPU compute with much more power-hungry NVIDIA desktop parts. I've posted links in other threads and don't feel like doing so here again, because they'll just be ignored again.
No one is forcing you to post here. I am interested in this thread. I see two main issues. The pro-M1/M2/M3 side can't admit that there is finally a real challenge here. Let me say it again for the pro-Apple crowd: this doesn't need to compete with the M3 Max with 128GB of "unified RAM"--whatever that is. No, we already have i9 + RTX 4090 machines for that.

This is simply for the mid-range M2 or M2 Pro (now M3) crowd. I would LOVE to have a Windows thin-and-light that can get 15-hour battery life and still be plenty powerful.

Now, the pro-Snapdragon side can't admit that at the end of the day the M3 line will likely still trounce the Snapdragon line. That is okay. I, for one, can admit that just fine. Especially when you add in the GPUs.

I mean, let's think about this. My i7-13700HX with an RTX 4060 8GB GPU (32 GB RAM and a 2 TB SSD I upgraded myself) competes with (and in some ways kicks the crap out of) even an M2 Max. End of story. I paid $999 refurbished for it. It is simply great.

Now, this laptop gets about 4 hours of battery life, where the M3 gets 22. LOL. So it achieves that performance by brute force, not efficiency. Now let's take my Surface Pro 9. It has an i5 and 8 GB RAM (but a 512 GB SSD I could put in there myself). It also gets about 6-7 hours of battery life. This is my pain point and where Windows is falling behind. Enter Snapdragon. What if I could now get a laptop/Surface device that has even more power, but no heat and awesome battery life?
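For a back-of-envelope check (a sketch in Python; all figures are assumptions for illustration, not measurements), battery life is roughly pack energy divided by average draw:

    # Rough model: hours = battery watt-hours / average platform draw.
    # All figures below are assumed for illustration only.
    pack_wh = 70.0              # assumed thin-and-light battery capacity
    x86_gaming_draw_w = 17.5    # assumed average mixed-use draw (W)
    efficient_arm_draw_w = 3.5  # assumed average mixed-use draw (W)
    print(pack_wh / x86_gaming_draw_w)    # -> 4.0 hours
    print(pack_wh / efficient_arm_draw_w) # -> 20.0 hours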

This Snapdragon device will never compete with the high-end i7s and i9s. It doesn't need to.
 

MRMSFC

macrumors 6502
Jul 6, 2023
371
381
No one is forcing you to post here. I am interested in this thread. I see two main issues. The pro-M1/M2/M3 side can't admit that there is finally a real challenge here. Let me say it again for the pro-Apple crowd: this doesn't need to compete with the M3 Max with 128GB of "unified RAM"--whatever that is. No, we already have i9 + RTX 4090 machines for that.

This is simply for the mid-range M2 or M2 Pro (now M3) crowd. I would LOVE to have a Windows thin-and-light that can get 15-hour battery life and still be plenty powerful.

Now, the pro-Snapdragon side can't admit that at the end of the day the M3 line will likely still trounce the Snapdragon line. That is okay. I, for one, can admit that just fine. Especially when you add in the GPUs.

I mean, let's think about this. My i7-13700HX with an RTX 4060 8GB GPU (32 GB RAM and a 2 TB SSD I upgraded myself) competes with (and in some ways kicks the crap out of) even an M2 Max. End of story. I paid $999 refurbished for it. It is simply great.

Now, this laptop gets about 4 hours of battery life, where the M3 gets 22. LOL. So it achieves that performance by brute force, not efficiency. Now let's take my Surface Pro 9. It has an i5 and 8 GB RAM (but a 512 GB SSD I could put in there myself). It also gets about 6-7 hours of battery life. This is my pain point and where Windows is falling behind. Enter Snapdragon. What if I could now get a laptop/Surface device that has even more power, but no heat and awesome battery life?

This Snapdragon device will never compete with the high-end i7s and i9s. It doesn't need to.
No one is arguing that the Snapdragon won’t allow thin and light Windows laptops.

The post you’re referencing is specifically arguing (against someone arguing in bad faith) about performance metrics though.
 

eltoslightfoot

macrumors 68030
Feb 25, 2011
2,545
3,092
No one is arguing that the Snapdragon won’t allow thin and light Windows laptops.

The post you’re referencing is specifically arguing (against someone arguing in bad faith) about performance metrics though.
Right, but both sides are doing it while ignoring the logic I keep presenting about stratification in the Windows market. It isn't apples to apples, which your side of the argument refuses to acknowledge.

I completely concede that when GPUs are included, as stated, the M2 Max beats up the Snapdragon. Now can your side of the argument concede that the Windows world isn't only ARM processors, so the Snapdragon doesn't need to compete against the M2/M3 Max to completely dominate? It only needs to hold its own against the M2/M3 and M2/M3 Pro, which it can.

It is redefining what a Windows thin-and-light can be, and that is all it needs to do.

Edited to add: Do I also need to point out that the M1 started as only the M1--not the M1 Pro and the M1 Max/M1 Ultra? For the first several months it was only the M1. So think of this initial foray as only the base M1-M3, but for Windows. It's just the starting point.
 

dgdosen

macrumors 68030
Dec 13, 2003
2,817
1,463
Seattle
I googled a bit and haven't easily seen answers to a couple of questions:
- Are laptop makers going to stick this chip in Linux machines?
- Are there any mainstream manufacturers lined up to use these chips (Dell, Lenovo, etc.) with a Linux variant?
 

bcortens

macrumors 65816
Aug 16, 2007
1,324
1,796
Canada
Right, but both sides are doing it while ignoring the logic I keep presenting about stratification in the Windows market. It isn't apples to apples, which your side of the argument refuses to acknowledge.

I completely concede that when GPUs are included, as stated, the M2 Max beats up the Snapdragon. Now can your side of the argument concede that the Windows world isn't only ARM processors, so the Snapdragon doesn't need to compete against the M2/M3 Max to completely dominate? It only needs to hold its own against the M2/M3 and M2/M3 Pro, which it can.

It is redefining what a Windows thin-and-light can be, and that is all it needs to do.

Edited to add: Do I also need to point out that the M1 started as only the M1--not the M1 Pro and the M1 Max/M1 Ultra? For the first several months it was only the M1. So think of this initial foray as only the base M1-M3, but for Windows. It's just the starting point.

My post that you replied to was about sunny5, who repeatedly argues in these forums that Apple silicon is terrible because it doesn't match a discrete Windows GPU or CPU in some arbitrary benchmark, and then ignores it when people post counter-benchmarks showing how good it is elsewhere. That is bad-faith argumentation. I wasn't making any specific claim about this new Qualcomm chip.
 

MayaUser

macrumors 68040
Nov 22, 2021
3,177
7,196
Tell me, which CPU has a great GPU that is comparable to an external GPU? None.


So what? 4nm vs 3nm? Besides, Apple is the only one taking advantage of the latest semiconductor technology, so I can just as well say Apple HAS to use 3nm in order to compete with 5nm, which is a joke. People hate to admit that Apple is the only one using 3nm, which by itself gives it performance advantages.
I don't know if any OEM will include an external GPU in the box... And if you have to buy a laptop plus a good enough external GPU, you might as well buy a desktop PC with a monitor; the whole point of a laptop that costs less than Apple's and is portable enough comes to an end.
Again, there are tens of external GPUs that are less performant than the M3 Max... so for you to say "none" shows how tech-savvy you are, or at least that you don't know the Windows world or the external-GPU world. Even a mediocre external GPU can't run at full power over TB3/TB4, where PCIe data tops out around 32 Gb/s; they need TB5... so are you joking? I really hope you are trolling.
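The rough numbers behind that bandwidth point, as a sketch using published link rates (simplified; TB3's guaranteed PCIe share is lower still):

    # Thunderbolt 4 guarantees about 32 Gb/s of PCIe tunneling out of its
    # 40 Gb/s link; a desktop PCIe 4.0 x16 slot carries roughly 256 Gb/s.
    tb4_pcie_gbps = 32
    pcie4_x16_gbps = 16 * 16   # ~16 Gb/s usable per lane x 16 lanes
    print(pcie4_x16_gbps / tb4_pcie_gbps)  # -> 8.0x less bandwidth for an eGPU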
But I guess almost everybody here knows that you just make funny comments and bring absolutely nothing to the table.
 

MayaUser

macrumors 68040
Nov 22, 2021
3,177
7,196
I googled a bit and haven't easily seen answers to a couple of questions:
- Are laptop makers going to stick this chip in Linux machines?
- Are there any mainstream manufacturers lined up to use these chips (Dell, Lenovo, etc.) with a Linux variant?
Since some of these "scores" were produced under Linux, I hope some machines will come with Linux... otherwise you can install Linux yourself, since it was built to run natively on the M family and on this SoC; otherwise they couldn't have reached these good numbers.
 

AeroHydra

macrumors newbie
Oct 30, 2023
11
13
I googled a bit and haven't easily seen answers to a couple of questions:
- Are laptop makers going to stick this chip in Linux machines?
- Are there any mainstream manufacturers lined up to use these chips (Dell, Lenovo, etc.) with a Linux variant?
I don't think we have definitive answers on this, but I believe that, like other traditional ARM chips, you can install any ARM Linux distro on them. Apple's M-series is more proprietary and doesn't allow traditional Linux distros to be installed, hence the need for the Asahi Linux team to essentially start from scratch.

I don't know if any OEM will include an external GPU in the box... And if you have to buy a laptop plus a good enough external GPU, you might as well buy a desktop PC with a monitor; the whole point of a laptop that costs less than Apple's and is portable enough comes to an end.
Again, there are tens of external GPUs that are less performant than the M3 Max... so for you to say "none" shows how tech-savvy you are, or at least that you don't know the Windows world or the external-GPU world. Even a mediocre external GPU can't run at full power over TB3/TB4, where PCIe data tops out around 32 Gb/s; they need TB5... so are you joking? I really hope you are trolling.
I think OP meant a discrete GPU (not external). You're right that there are definitely dGPUs with worse performance than the M3 Max. But some users have made a good point that for price-to-performance, Nvidia dGPUs in laptops are definitely a lot better than Apple silicon, especially for 3D rendering and gaming. For context, the 4090 absolutely beats the M2 Max, and I guarantee it will also beat the M3 Max in these categories, and 4090 laptops are similarly priced if not cheaper than Max MacBooks. Are there compromises in terms of efficiency? Absolutely.

Like I said, there are both advantages and disadvantages to going fully integrated à la Apple vs designing the chip for an optional dGPU, as I believe Qualcomm is doing. By not building a fully integrated SoC, Qualcomm gives OEMs more freedom to configure and price laptops accordingly (e.g., a cheaper, non-GPU-focused machine for business/professional use) and also offloads the burden of having to support and update software drivers to Nvidia, which has done exactly that for decades, rather than trying to do it themselves. Even Apple has trouble extracting maximal performance in things like games, machine learning, or Blender, even with much tighter vertical integration than Qualcomm will have.
 

MayaUser

macrumors 68040
Nov 22, 2021
3,177
7,196
I don't think we have definitive answers on this, but I believe that, like other traditional ARM chips, you can install any ARM Linux distro on them. Apple's M-series is more proprietary and doesn't allow traditional Linux distros to be installed, hence the need for the Asahi Linux team to essentially start from scratch.


I think OP meant a discrete GPU (not external). You're right that there are definitely dGPUs with worse performance than the M3 Max. But some users have made a good point that for price-to-performance, Nvidia dGPUs in laptops are definitely a lot better than Apple silicon, especially for 3D rendering and gaming. For context, the 4090 absolutely beats the M2 Max, and I guarantee it will also beat the M3 Max in these categories, and 4090 laptops are similarly priced if not cheaper than Max MacBooks. Are there compromises in terms of efficiency? Absolutely.

Like I said, there are both advantages and disadvantages to going fully integrated à la Apple vs designing the chip for an optional dGPU, as I believe Qualcomm is doing. By not building a fully integrated SoC, Qualcomm gives OEMs more freedom to configure and price laptops accordingly (e.g., a cheaper, non-GPU-focused machine for business/professional use) and also offloads the burden of having to support and update software drivers to Nvidia, which has done exactly that for decades, rather than trying to do it themselves. Even Apple has trouble extracting maximal performance in things like games, machine learning, or Blender, even with much tighter vertical integration than Qualcomm will have.
The 4090 should never be compared to the Max... maybe to the Ultra... even Apple did that, come on.
Again, if you need CUDA, you go with Windows x86 + an Nvidia GPU... not with this Qualcomm chip. If you need ray tracing... well, how many good dGPUs or external GPUs have hardware ray tracing plus mesh shading that can beat the M3 Max? I think not even he knows what he's talking about. External GPUs have more "open space" for improvements, but they need TB5... and since we are talking about Qualcomm Windows on ARM... I don't think we will have dGPUs, just support for external GPUs (am I right? Or has Qualcomm said otherwise?)
 

MayaUser

macrumors 68040
Nov 22, 2021
3,177
7,196
Even Apple has trouble extracting maximal performance in things like games, machine learning, or Blender, even with much tighter vertical integration than Qualcomm will have.
Exactly... exactly my point, and I totally agree... Qualcomm will fail even harder at this.
And as for having ray tracing and mesh shading with that 128GB of unified memory... I bet none of those childish Qualcomm X Elite laptops will come even close for my Maya projects.
 

MayaUser

macrumors 68040
Nov 22, 2021
3,177
7,196
Even Apple has trouble extracting maximal performance in things like games
Here Apple demonstrated that it comes down to developers to do that... Baldur's Gate 3 is an example that games can work beautifully, but unfortunately Apple needs to pay billions to developers to make it happen, since developers know the Mac platform is not profitable, so it's up to Apple to invest some billions, IMO.
 

eltoslightfoot

macrumors 68030
Feb 25, 2011
2,545
3,092
The 4090 should never be compared to the Max... maybe to the Ultra... even Apple did that, come on.
Again, if you need CUDA, you go with Windows x86 + an Nvidia GPU... not with this Qualcomm chip. If you need ray tracing... well, how many good dGPUs or external GPUs have hardware ray tracing plus mesh shading that can beat the M3 Max? I think not even he knows what he's talking about. External GPUs have more "open space" for improvements, but they need TB5... and since we are talking about Qualcomm Windows on ARM... I don't think we will have dGPUs, just support for external GPUs (am I right? Or has Qualcomm said otherwise?)
We can go lower if you want. 3080 Ti? 3090? Probably even down to the 3060 Ti or so. They are all better. Of course they use something like 10x the power to do it, but they are better for thousands less. It's the price Apple has to pay for having only the one type of chip.
Exactly... exactly my point, and I totally agree... Qualcomm will fail even harder at this.
And as for having ray tracing and mesh shading with that 128GB of unified memory... I bet none of those childish Qualcomm X Elite laptops will come even close for my Maya projects.
Yes, but Qualcomm doesn't even have to do that part--hence the Nuvia acquisition. They will leverage Nvidia (even if it is just onboard rather than discrete). Nvidia will do all of that part.
Here Apple demonstrated that it comes down to developers to do that... Baldur's Gate 3 is an example that games can work beautifully, but unfortunately Apple needs to pay billions to developers to make it happen, since developers know the Mac platform is not profitable, so it's up to Apple to invest some billions, IMO.
And we all know how that is going to go. They will point to several games that work--like BG3--and all the game dev companies will continue to use anti-cheat technology that necessitates Windows only (and maybe Proton). And Apple will go on not really caring. Ray tracing was never really the issue either way. Nvidia and AMD have had ray tracing for how long?

I wish it was different and I didn't need a gaming PC, but I always will, so......
 

MRMSFC

macrumors 6502
Jul 6, 2023
371
381
Right, but both sides are doing it while ignoring the logic I keep presenting about stratification in the Windows market. It isn't apples to apples, which your side of the argument refuses to acknowledge.

I completely concede that when GPUs are included, as stated, the M2 Max beats up the Snapdragon. Now can your side of the argument concede that the Windows world isn't only ARM processors, so the Snapdragon doesn't need to compete against the M2/M3 Max to completely dominate? It only needs to hold its own against the M2/M3 and M2/M3 Pro, which it can.

It is redefining what a Windows thin-and-light can be, and that is all it needs to do.

Edited to add: Do I also need to point out that the M1 started as only the M1--not the M1 Pro and the M1 Max/M1 Ultra? For the first several months it was only the M1. So think of this initial foray as only the base M1-M3, but for Windows. It's just the starting point.
I don't know why you're arguing when there doesn't seem to be any pushback on your main point of contention, that the Snapdragon will allow for thin-and-light Windows laptops with better battery life.

That’s frankly a given.

What this thread was started for, specifically, was to compare performance between the M series and the new Snapdragon. So to say that they shouldn't be compared kind of goes against the point of this thread.

Likewise, Qualcomm directly compared their processor to Apple Silicon. So they are aiming to take on Apple.
 

eltoslightfoot

macrumors 68030
Feb 25, 2011
2,545
3,092
I don't know why you're arguing when there doesn't seem to be any pushback on your main point of contention, that the Snapdragon will allow for thin-and-light Windows laptops with better battery life.

That's frankly a given.

What this thread was started for, specifically, was to compare performance between the M series and the new Snapdragon. So to say that they shouldn't be compared kind of goes against the point of this thread.

Likewise, Qualcomm directly compared their processor to Apple Silicon. So they are aiming to take on Apple.
Apple silicon I have no problem with. It's just that people want to focus only on the particular Apple silicon chip that matches their side of the argument. Again, can we first compare this Snapdragon to just the base M2 or M3? Then we will move up. 😂 I think it might be instructive.
 

MRMSFC

macrumors 6502
Jul 6, 2023
371
381
I don't think we have definitive answers on this, but I believe that, like other traditional ARM chips, you can install any ARM Linux distro on them. Apple's M-series is more proprietary and doesn't allow traditional Linux distros to be installed, hence the need for the Asahi Linux team to essentially start from scratch.
Did they start from scratch? I was under the assumption that the standard ARM instructions worked normally and that they're just having to reverse-engineer the Apple-specific parts.
I think OP meant a discrete GPU (not external). You're right that there are definitely dGPUs with worse performance than the M3 Max. But some users have made a good point that for price-to-performance, Nvidia dGPUs in laptops are definitely a lot better than Apple silicon, especially for 3D rendering and gaming. For context, the 4090 absolutely beats the M2 Max, and I guarantee it will also beat the M3 Max in these categories, and 4090 laptops are similarly priced if not cheaper than Max MacBooks. Are there compromises in terms of efficiency? Absolutely.
There are other compromises too, in features and quality, I'd argue. Likewise, they cannot sustain their speeds unplugged, which limits their usefulness IMO.

Looking at the situation, I think Apple is closing the gap. The incumbents are constantly increasing power consumption to stay ahead, and there's going to be a point where that's no longer a viable option.
support and update software drivers to Nvidia, which has done exactly that for decades, rather than trying to do it themselves.
This is something I have concerns about. If Nvidia is also entering the market, as they've stated they will, what would prevent them from simply not working on drivers for CPUs outside their own (and x86, of course)?

They've shown that they're very much interested in creating their own ecosystem. And being the dominant player, they have substantial leverage over the industry.
Even Apple has trouble extracting maximal performance in things like games, machine learning, or Blender, even with much tighter vertical integration than Qualcomm will have.
That's not to say it would be equally difficult, or any easier, without tight integration, though.

It’s an interesting time in any case, and I honestly hope there’s a lot of success in the ARM space on non-Apple devices. But in my opinion, Qualcomm has one hell of an uphill battle.

Getting OEM support for their chips is going to be tough. Apple did it because they forced the issue, but there's no guarantee that other OEMs will be on board; look at AMD with Zen, for example. They had difficulty getting OEMs to use their CPUs initially, even with the same architecture.
And even if the OEMs offer Snapdragon-powered machines, there's no guarantee end users will buy them, and they will be sold side by side with x86 products.
 