Considering the market stratification for which this would apply, I agree with your assessment. Your average 13th-gen i9 with an RTX 4090 already rocks past the M2 Max... So again, this is to compete with lesser beings where M1 and M2 have had the run to themselves. Now they will hopefully have some actual competition.

This is in 8-10 months, before even M3 and M4. Sure, it supports dGPUs, but not until 2025: "There is little probability of such configurations at least in the first wave of devices in 2024".
Apple Silicon is indeed in danger...
View attachment 2304466
> This is in 8-10 months before even M3 and M4. Sure, it supports dGPU but not until 2025: "There is little probability of such configurations at least in the first wave of devices in 2024".
> Apple Silicon is indeed in danger...
There is about zero 'danger' for Apple Silicon.
> If Microsoft bets on ARM, it would launch Windows 12 in time for the first laptops with Qualcomm SoCs. Anyway, I imagine Qualcomm will have paid Microsoft, not only for exclusivity, but also for a timely release of the next Windows.

If Windows 12 comes 3 years after W11, that would put 12 in the Q4 2024 timeframe. That would be quite bad for the Elite X. Even June 2024 is in the 'not good' category. Some systems likely will ship with W11, and some that ship in Q4 2024 might ship with W12, so it is fungible as to what version of Windows they might get.
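As a sanity check on the timeline: the Q4 2024 figure just comes from adding a hypothetical three-year cadence to Windows 11's October 2021 release. A throwaway sketch (the three-year cadence is pure speculation from the post above, not anything Microsoft has announced):

```python
from datetime import date

# Windows 11 reached general availability on October 5, 2021.
w11_release = date(2021, 10, 5)

# Hypothetical three-year cadence, as speculated above.
w12_guess = w11_release.replace(year=w11_release.year + 3)

quarter = (w12_guess.month - 1) // 3 + 1
print(f"Speculative Windows 12 window: Q{quarter} {w12_guess.year}")
```

Which lands exactly on the Q4 2024 timeframe mentioned above, assuming the cadence holds at all.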
I'm sure they rushed to release this before the M3 shows up next week and beats it before it's even in a product.
> Qualcomm chip is indeed great.
Still more than 3 times slower than M2 Max.
View attachment 2304511
Using more power but slower than both M2 and M2 Max.
View attachment 2304512
Still about 3 times slower than M2 Max, despite the game running through AGPTK or CrossOver (not clear which).
View attachment 2304513 View attachment 2304514
> Yes, when anyone else underperforms it's by design. Very good.

Because it was never meant to compete with M2 Max in terms of GPU. Only the CPU does.
> Because it was never meant to compete with M2 Max in terms of GPU. Only the CPU does.

What exactly is the reason you're on this forum? From the short time I've been here, I only see you spreading misinformation or negative stuff about anything Apple-related. It's good to be critical, and Apple can surely do better in some cases, but what you're doing is something different.
> …that they could only use approved software on

Btw, all results were ACTUALLY tested with actual devices. Multiple people confirmed this.
> I see, so absolute performance is what matters?

To sum up:
Single core: Qualcomm is better even at 23W. At 80W, it's the highest.
Multi core: Not bad, but despite having 12 performance cores and 80W, not that great. If you ignore the power consumption, though, Qualcomm has the higher multi-core performance.
> Say what?

GPU: Quite great. Besides, it beats M2 Max in performance per watt.
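For context, "performance per watt" claims like the one above are just a benchmark score divided by the power drawn during the run. A minimal sketch with made-up placeholder numbers (these are not real scores or wattages for any chip):

```python
def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark points delivered per watt drawn; higher is better."""
    return score / watts

# Placeholder figures for illustration only, not measured results.
chips = {
    "Chip A (hypothetical, low power)": (10_000, 23.0),
    "Chip B (hypothetical, high power)": (12_000, 80.0),
}

for name, (score, watts) in chips.items():
    print(f"{name}: {perf_per_watt(score, watts):.0f} points/W")
```

Note how the lower-power chip can win on efficiency while losing on absolute score, which is exactly the distinction being argued over in this thread.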
> What exactly is the reason you're on this forum? …

Bringing info is my fault? Wow.
Anyway, I do hope this Qualcomm chip finds its way to consumers soon. It seems like a nice ARM alternative for Windows, and maybe Microsoft and other vendors will now make more ARM stuff.
Lol, you really need to get your narrative straight. When Apple does it, it's underwhelming. When Qualcomm uses pretty much the same core as M2, it's great.
> Eh again, that isn't the point of this tech. All that stuff needs to be able to compete against the AAA gaming libraries of the Intel/AMD and Nvidia/AMD behemoths that use 10x the power but deliver more performance. And have for years.

It will be really interesting to compare the SD Elite X with M3, M3 Pro and M3 Max, with on-chip GPU memory and hardware-accelerated mesh shading and ray tracing! 😄
> Still more than 3 times slower than M2 Max.
> Using more power but slower than both M2 and M2 Max.
> Still about 3 times slower than M2 Max, despite the game running through AGPTK or CrossOver (not clear which).

I think you're being too harsh on OP for saying that Qualcomm didn't focus on GPU performance. Apple designs their silicon to have GPU performance on par with dGPUs because there's no alternative on the Mac side. If you want a top-performing GPU in an Apple product, the only option is a beefy integrated graphics solution. On the other hand, Qualcomm can partner with Nvidia or AMD for better graphics performance if need be.

Of course, there are drawbacks and benefits to either approach: dGPUs will never achieve the same efficiency, but they can also cut the cost of the device, since the chip can be smaller, and Nvidia with CUDA has specific optimizations and years of driver support that Apple doesn't have.

Also, I'd like to point out that we don't know the pricing of laptops with this chip. If it's coming out in laptops with sub-$2k price points, then they aren't even supposed to compete with the M2 Max. I'd suspect this to be either a base M3 competitor or maybe an M3 Pro competitor. Qualcomm may not desire to compete with the M3 Max at all.

> I think you're being too harsh on OP for saying that Qualcomm didn't focus on GPU performance. …

Thank you for saying this so well. I agree, I don't think it would even make sense to compete on the high end of things in the Windows world. Why compete with an i9 and a 4090?
> I think you're being too harsh on OP for saying that Qualcomm didn't focus on GPU performance. …
I see you just joined the forum yesterday and this is your first post, so you haven't had much experience interacting with forum members. It's not too harsh at all. People here have been mocking the performance of Apple Silicon ever since M1 by comparing it to power-hungry monster CPUs and GPUs from Intel and Nvidia. The same people complained again about Apple Silicon's lack of power efficiency when the A17 Pro was released and Apple prioritized performance over efficiency for once. It's a common double bind used frequently in this forum by people with no real interest in Apple or its products.
After a while you will notice that some people never have anything positive to say about Apple or its products, no matter what they present. They just create one thread or post after another to spread provocative, negative opinions by twisting and cherry-picking facts. In fact, if you search the word "sucks" you can find many old posts by the same people, going back to the introduction of Apple Silicon, telling everyone that everything Apple makes sucks.
They say things like ”What happened to Apple in terms of chip development? A17 Pro consume up to 14W of power. More power = higher performance. Then what happened to the power by watt? That's extremely too high and that's what M1 consume. It proves that Apple Silicon chips GPU performance sucks. It just proves that Apple is not doing well with Apple Silicon development. Imagine if Qualcomm starts making their own M series chips for PC.”
Now that the SD Elite X does the same and uses up to 80W, like M2 Ultra, they praise it for being fast. Now that Qualcomm has made "their own M series chips for PC" and its GPU underperforms compared to even M2 in some benchmarks while using more power, it suddenly doesn't "suck"; the excuse is merely that M2 is too old, even though the Qualcomm part is a brand new, unreleased chip. You can't use common sense with such people. Only by using the same kind of rhetoric can you point out the flaws in their arguments.
The same people are now writing in other threads "Well, only 20% from M2 Max to M3 Max, which is very disappointing", while Intel 14th-gen CPUs like the i9-14900K are only a few percent better than 13th-gen parts like the i9-13900K, according to Puget Systems among others.