Which slides in particular?
> All CPU related slides. Watch the keynote.
Sheesh. What's going on, buddy? Take a chill pill. You're attacking almost everyone who isn't all in on your speculation. Take a deep breath and go for a walk. No one is trying to say you're wrong or that you're an idiot.
> All CPU related slides. Watch the keynote.
Would you explain for the non-specialists here?
> I am simply trying to squish misinformation before it spreads. People are already starting to say that Dynamic Caching is a RAM partitioning scheme or that the M3 uses A16 cores. If one doesn't shush these kinds of nonsensical statements immediately, they end up on Wikipedia, and then good luck removing them from there. I already tried with the M1 GPUs.
We appreciate the insights that you bring. But you've often moved towards the terrorizing side rather than the helpful side. I'm just giving you this feedback from my perspective. You can do what you want with it, including nothing at all.
> Sorry, yes, 24P. You really think, based on the M3 family, that Apple will go for 48P, 512 GB of RAM, and 160 GPU cores on the Ultra? Or are you saying that would be for an "Extreme M3"?
No, not for the Ultra. There's no reason (yet) to think the Ultra will be anything but a 24P + 8E dual-chip, just like the M1 and M2 generations were duals.
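For anyone following the arithmetic, the 24P + 8E figure is just two top-bin M3 Max dies (12P + 4E each) added together, the same pattern as the M1 and M2 Ultras. A minimal sketch, assuming the Ultra stays a dual-Max design (the per-die counts are Apple's published M3 Max configuration; the doubling is exactly the speculation being discussed):

```python
# Back-of-the-envelope only: assumes an "M3 Ultra" would again be two Max dies
# fused together, as the M1 Ultra and M2 Ultra were. Per-die counts are the
# top-bin M3 Max configuration; the doubling itself is just speculation.
m3_max_die = {"P_cores": 12, "E_cores": 4, "GPU_cores": 40}

speculative_m3_ultra = {name: 2 * count for name, count in m3_max_die.items()}
print(speculative_m3_ultra)  # {'P_cores': 24, 'E_cores': 8, 'GPU_cores': 80}
```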
> We appreciate the insights that you bring. But you've often moved towards the terrorizing side rather than the helpful side. […]
Or you could be a bit less sensitive.
> We appreciate the insights that you bring. But you've often moved towards the terrorizing side rather than the helpful side. […]
Here is my perspective on your posts: you want to toss out uninformed, inaccurate nonsense and pretend that it deserves respect? Be prepared for pushback instead, and don't get all bent out of shape when it happens.
> But you've often moved towards the terrorizing side rather than the helpful side.
Huh? I think you are mistaking leman for somebody else. 😅
> No, no, I can be very unpleasant. Part of my charm I suppose.
Lol. I'd like to see some of these people get into an argument with Linus Torvalds. Tears would really flow!
> No, no, I can be very unpleasant. Part of my charm I suppose.
At least with me you've been pretty respectful and polite (so far), and I tried to let you know about it in a post where we interacted. I honestly think that's a better attitude for contributing to a healthy forum environment, where we all learn from each other (some more than others) without ridiculing people who may have less technical knowledge. But I am aware that not everyone shares the same personal values. So I'll keep doing what I do: engaging in constructive conversation with those who treat me with a minimum of respect (or who are simply neutral, like you), and not engaging with those who, despite providing useful knowledge, use a passive-aggressive or outright belligerent tone.
> So is the general consensus (absent proper benchmarks) that:
> - M3 Max looks to be a good upgrade over M2 Max for both CPU (50% more P-cores than M2 Max) and GPU (40 ray-tracing-capable cores vs. 38 cores without ray tracing), but only if you pony up for the top model.
> - M3 Pro regresses on CPU core count and memory bandwidth and is likely to have little to offer over M2 Pro unless you can take advantage of the new GPU features. The question is whether it will be slower than M2 Pro in some situations, and how much of an edge case those are.
> - M3 is an incremental upgrade over M2, keeping the same CPU and GPU cores, so it leans heavily on the better GPU and a small boost to CPU performance.
That's my initial perception as well, yes.
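To put rough numbers behind that consensus list, here is a minimal sketch of the generation-over-generation deltas. The core counts and bandwidth figures below are my reading of Apple's published top-bin spec pages rather than anything established in this thread, so treat them as assumptions:

```python
# Rough generation-over-generation deltas, using top-bin configurations as listed
# on Apple's public spec pages (my reading of them -- treat the figures as assumptions).
m2_family = {
    "M2":     {"P": 4,  "GPU": 10, "bandwidth_GBps": 100},
    "M2 Pro": {"P": 8,  "GPU": 19, "bandwidth_GBps": 200},
    "M2 Max": {"P": 8,  "GPU": 38, "bandwidth_GBps": 400},
}
m3_family = {
    "M3":     {"P": 4,  "GPU": 10, "bandwidth_GBps": 100},
    "M3 Pro": {"P": 6,  "GPU": 18, "bandwidth_GBps": 150},
    "M3 Max": {"P": 12, "GPU": 40, "bandwidth_GBps": 400},
}

for (old_name, old), (new_name, new) in zip(m2_family.items(), m3_family.items()):
    p_delta = 100 * (new["P"] - old["P"]) / old["P"]
    bw_delta = 100 * (new["bandwidth_GBps"] - old["bandwidth_GBps"]) / old["bandwidth_GBps"]
    print(f"{old_name} -> {new_name}: "
          f"P-cores {old['P']} -> {new['P']} ({p_delta:+.0f}%), "
          f"GPU cores {old['GPU']} -> {new['GPU']}, "
          f"bandwidth {old['bandwidth_GBps']} -> {new['bandwidth_GBps']} GB/s ({bw_delta:+.0f}%)")
```

If those figures are right, it lines up with the list: the Max gains the most on paper (and picks up hardware ray tracing), the Pro actually loses P-cores and memory bandwidth, and the base M3 keeps the M2's core counts.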
> M3 is an incremental upgrade over M2, keeping the same CPU and GPU cores
The CPU and GPU cores are not the same between the M2 and M3.
> To comment on the discussion: I don't think Apple expects many to upgrade from M2 to M3. They assume upgrade times of 3 or more years, which is clearly reflected in their marketing materials. Hence the focus on comparison with Intel Macs.
Yep, they said several times during the presentation that "this upgrade is ideal for people with Intel Macs" and that it's "11 times faster". As if they were implicitly saying, "just throw that burning garbage away already and jump to the Apple Silicon generation." 😆
> The CPU and GPU cores are not the same between the M2 and M3.
I assumed they meant core count...
> Yep, they said several times during the presentation that "this upgrade is ideal for people with Intel Macs" and that it's "11 times faster". […]
From a product perspective, I also thought the push away from Intel was interesting, especially considering that the 27" replacement option (if you are willing to give up an AIO) is so much more expensive.
They also compared the M3 with the M1 and the M2 on some slides, but they always emphasized the comparison with the M1 rather than the M2.
I wonder if this emphasis on jumping from Intel Macs to Apple Silicon means support for Intel Macs will be dropped from macOS at a steeper pace… we'll see.