
ArkSingularity

macrumors 6502a
Mar 5, 2022
928
1,130
Talk is cheap, and Intel has been doing a lot of talking with no substance to back it up. There have been rumblings about their plans to release a vanilla M1 competitor... years from now. I'll believe it when I see it, and by then the M1 and all its variants will seem primitive compared to the M3 or M4 or whatever else.

If I were a professional short seller or something like that, I'd be looking pretty hard at Intel. They're really starting to behave like RIM or Microsoft, where they think their user base is too loyal to ever consider moving to anything else. Eventually a competitor is going to disrupt you, and if you weren't already planning for it, you get caught flat-footed.

The reason things like this don't often happen to Apple is that they bet big on technologies they know are going to be great, even if it's YEARS too early to make them mainstream. In other words, the time to start working on an M1 competitor was 10 years ago, not 2022.

I think a lot of that has to do with the fact that chip designs take a few years from start to finish. The M1 caught them largely by surprise (and it shouldn't have, they've been watching the iPhone for years and knew the rumors about it coming to Mac), but it takes quite a while to actually get them from design to fab. Anything they're designing today will take a few years to actually bring to the market.

Assuming they play their hand right, Intel will have a chance to take big leaps forward in the next couple of years once Intel 4 starts going into production (assuming it doesn't get delayed again). Intel has been forced to find ways of bringing performance improvements without the advantage of leading edge fabs, and those optimizations won't go away once they move to new fabs. They're still a LONG way away from what Apple has done, but Intel isn't throwing in the towel yet. It'll take a few years to see how it all plays out long term.
 

Macative

Suspended
Mar 7, 2022
834
1,319
Actually I'm more interested in all the comments from the people who have spent 2 years saying Apple can't compete with discrete GPUs.

M1 Ultra's GPU is on par with the Nvidia GeForce RTX 3090, the literal best graphics card available, while consuming 200 fewer watts.

Where are all the know-it-alls today?
 

ian87w

macrumors G3
Feb 22, 2020
8,704
12,638
Indonesia
Intel will probably release something in the near future to challenge the M1 Ultra, just like they did with the M1 Max (but at a much higher power consumption, of course).

So instead of a 14-core Intel CPU (that beat the M1 Max), they will then come out with a 28-core version, I suppose, to try to beat the M1 Ultra.
Intel should focus on shipping Alder Lake for laptops first.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Actually I'm more interested in all the comments from the people who have spent 2 years saying Apple can't compete with discrete GPUs.

M1 Ultra's GPU is on par with the Nvidia GeForce RTX 3090, the literal best graphics card available, while consuming 200 fewer watts.

Where are all the know-it-alls today?
I’m gonna wait until third party tests have come out, but otherwise I’m happy to eat my words.
 

ian87w

macrumors G3
Feb 22, 2020
8,704
12,638
Indonesia
I think a lot of that has to do with the fact that chip designs take a few years from start to finish. The M1 caught them largely by surprise (and it shouldn't have, they've been watching the iPhone for years and knew the rumors about it coming to Mac), but it takes quite a while to actually get them from design to fab. Anything they're designing today will take a few years to actually bring to the market.

Assuming they play their hand right, Intel will have a chance to take big leaps forward in the next couple of years once Intel 4 starts going into production (assuming it doesn't get delayed again). Intel has been forced to find ways of bringing performance improvements without the advantage of leading edge fabs, and those optimizations won't go away once they move to new fabs. They're still a LONG way away from what Apple has done, but Intel isn't throwing in the towel yet. It'll take a few years to see how it all plays out long term.
Intel has plenty of time, because Windows is practically stuck on x86; ARM and Qualcomm just cannot make an SoC for Windows that performs as well as the M1. Since Apple only makes Apple Silicon for its own machines, the one actually forcing Intel's hand, imo, is AMD, which is now strong in portables, the market where Intel used to rule.
 

ArkSingularity

macrumors 6502a
Mar 5, 2022
928
1,130
Actually I'm more interested in all the comments from the people who have spent 2 years saying Apple can't compete with discrete GPUs.

M1 Ultra's GPU is on par with the Nvidia GeForce RTX 3090, the literal best graphics card available, while consuming 200 fewer watts.

Where are all the know-it-alls today?

Apple's approach is genius too. It's incredible just how low the power draw actually is on these GPUs.

What I suspect Apple has done is set the clock speed lower for each execution unit/pipeline, but massively scale up the number of them to make up for it. This would be a genius design, because power consumption doesn't scale linearly with clock speed (double the clock and you more than double the power). They've created a GPU that can match top-of-the-line gaming GPUs (ignoring cards like the A100, which isn't even designed for gaming), and they've done so at only a fraction of the power consumption.
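The wide-and-slow intuition can be sketched with the textbook dynamic-power model, P ∝ units × V² × f, with voltage assumed to scale roughly with frequency. All numbers below are illustrative assumptions, not measured Apple figures:

```python
# Toy model of dynamic power: P ~ units * V^2 * f, with voltage scaling ~linearly
# with frequency. Illustrative only; not measured figures for any real GPU.

def relative_power(units: float, freq: float) -> float:
    """Power relative to a baseline of 1 unit at 1.0x clock (assumes V ~ f)."""
    voltage = freq  # assumption: voltage scales with frequency
    return units * voltage**2 * freq

def throughput(units: float, freq: float) -> float:
    """Work per unit time: units * clock."""
    return units * freq

# Baseline: 1x the units at full clock vs. wide-and-slow: 2x the units at half clock.
base_power = relative_power(1.0, 1.0)
wide_power = relative_power(2.0, 0.5)

assert throughput(1.0, 1.0) == throughput(2.0, 0.5)  # same throughput
print(wide_power / base_power)  # 0.25: same work at a quarter of the power
```

Under these toy assumptions, doubling the execution units while halving the clock delivers identical throughput at a quarter of the power, which is exactly the tradeoff being described.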

Everyone else is probably looking in sheer horror at the transistor counts in these chips, but Apple just managed to pull it off. And everyone else is scrambling to catch up.

Intel has plenty of time, because Windows is practically stuck on x86; ARM and Qualcomm just cannot make an SoC for Windows that performs as well as the M1. Since Apple only makes Apple Silicon for its own machines, the one actually forcing Intel's hand, imo, is AMD, which is now strong in portables, the market where Intel used to rule.

That is a good point. Intel isn't really competing with the M1, they're primarily competing with AMD. That being said, AMD is giving them quite a run for their money right now. Intel is definitely not going to want to lose the performance crown long term, and if they fall further behind, more professionals will start switching away from PCs and into the Mac ecosystem.
 

ArkSingularity

macrumors 6502a
Mar 5, 2022
928
1,130
“Benchmarks/Transistor count” is like debating “Horsepower/Liter” in car circles. Completely pointless benchracing circlejerk.
Oh I don't disagree, that's more or less my entire point. Apple made monstrous chips that the rest of the industry was afraid to make, and proved that they could still be power efficient.
 

Rigby

macrumors 603
Aug 5, 2008
6,257
10,215
San Jose, CA
Intel will probably release something in the near future to challenge the M1 Ultra, just like they did with the M1 Max (but at a much higher power consumption, of course).

So instead of a 14-core Intel CPU (that beat the M1 Max), they will then come out with a 28-core version, I suppose, to try to beat the M1 Ultra.
Their next generation "Raptor Lake" CPUs (due later this year) are rumored to have 24 physical cores.

In Geekbench, M1 Ultra is a little slower than Alder Lake in single-thread and about 33% faster in multi-thread performance. By upping the core count by 50%, Raptor Lake should easily beat the M1 Ultra again in terms of performance. Obviously in terms of power efficiency they will still lag behind.
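The back-of-the-envelope being made here, under the (generous) assumption that multi-thread scores scale linearly with core count:

```python
# Naive linear-scaling estimate. Real multi-core scaling is sublinear
# (shared caches, memory bandwidth, power limits), so treat this as an upper bound.
alder_lake_mt = 1.00                   # normalize Alder Lake's multi-thread score
m1_ultra_mt = alder_lake_mt * 1.33     # ~33% faster, per the Geekbench comparison above
raptor_lake_mt = alder_lake_mt * 1.50  # +50% cores, assuming perfect scaling

print(raptor_lake_mt > m1_ultra_mt)  # True under these assumptions
```

Even a fair bit of scaling loss still leaves headroom here: 1.50 vs. 1.33 means Raptor Lake only needs to realize about 89% of the ideal gain to pull ahead.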
 

MayaUser

macrumors 68040
Nov 22, 2021
3,178
7,200
I just pre-ordered the next Intel "M1 Ultra counter" Xeon, but I don't know if I can handle the power consumption.
My electrical panel will probably shut down when I drive that thing to the max.
Intel recommends a 2.4-kilowatt power supply. I guess when the lights in your city go out, you can curse me for turning on the Intel counter-Xeon.
 

Macative

Suspended
Mar 7, 2022
834
1,319
I’m gonna wait until third party tests have come out, but otherwise I’m happy to eat my words.
What could third party tests possibly prove otherwise? That it's almost as good as the best discrete graphics card in the world? Lol.
 

MysticCow

macrumors 68000
May 27, 2013
1,564
1,760
Intel will probably release something in the near future to challenge the M1 Ultra, just like they did with the M1 Max (but at a much higher power consumption, of course).

So instead of a 14-core Intel CPU (that beat the M1 Max), they will then come out with a 28-core version, I suppose, to try to beat the M1 Ultra.

We have this new Core 1MX processor that will do twice the power of an M1 Ultra! You need your own nuclear reactor to power this new chip, BUT IT'S FAST!!!!
 

Fragment Shader

macrumors member
Oct 8, 2014
33
37
Toronto
What could third party tests possibly prove otherwise? That it's almost as good as the best discrete graphics card in the world? Lol.

Because performance can vary significantly across workloads, no one would sensibly judge a product on a single benchmark. The poster is simply waiting for more data, but is obviously open to having his earlier assumptions proven wrong.
 

Fragment Shader

macrumors member
Oct 8, 2014
33
37
Toronto
Apple's approach is genius too. It's incredible just how low the power draw actually is on these GPUs.
That's the thing with these M1/Alder Lake comparisons. It's fine to use Cinebench to compare basic CPU performance, but an Alder Lake chip that competes with an M1 Ultra not only does so at a much higher TDP by itself; you also have to factor in the TDP and cost of the discrete GPU needed to compete in those tasks.

There just is nothing Intel can offer that's really comparable to the M1 as an overall package. Perhaps they will in the future; I actually want the PC industry to up its game with SoCs. I thought AMD would already be here: years ago, when HBM hit the market, I expected that even if it wouldn't be a truly unified design like the M1, we would at least have multi-chip/package designs with far more powerful GPUs and on-package dedicated memory by now.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
In Geekbench, M1 Ultra is a little slower than Alder Lake in single-thread and about 33% faster in multi-thread performance. By upping the core count by 50%, Raptor Lake should easily beat the M1 Ultra again in terms of performance. Obviously in terms of power efficiency they will still lag behind.
Sounds reasonable, but single-thread performance is equal across all M1-derived processors, since the cores are all the same.
 

Bug-Creator

macrumors 68000
May 30, 2011
1,783
4,717
Germany
But the real answer will come next year when Intel switches to 3nm.

Maybe I missed something, but AFAIK TSMC is the only fab even close to mass-producing 3nm, and that capacity has all been pre-booked by Apple.

Intel itself has just gone from 14nm to 10nm (which is kinda like TSMC's 7nm), so jumping 2-3 steps in one year seems a tad optimistic.
 

huge_apple_fangirl

macrumors 6502a
Aug 1, 2019
769
1,301
Actually I'm more interested in all the comments from the people who have spent 2 years saying Apple can't compete with discrete GPUs.

M1 Ultra's GPU is on par with the Nvidia GeForce RTX 3090, the literal best graphics card available, while consuming 200 fewer watts.

Where are all the know-it-alls today?
Apple claims it is... let's wait for benchmarks. They claimed the M1 Max would be close to a 3080, and outside of some specific accelerated tasks, it's just not. As for power, Apple is on TSMC 5nm and NVIDIA is on Samsung 8nm. That's two nodes apart! Even if NVIDIA's design were just as efficient as Apple's (it likely is not), it would still consume way more power. Plus, Apple is using many more transistors to get performance similar to NVIDIA's. Don't get me wrong, Apple's GPUs are great, and they've made tremendous progress considering they started designing them just a few years ago. But just as it took Apple years to become the undisputed leader in CPUs, they aren't the leader in GPUs yet.
 

Fragment Shader

macrumors member
Oct 8, 2014
33
37
Toronto
Apple claims it is... let's wait for benchmarks. They claimed the M1 Max would be close to a 3080, and outside of some specific accelerated tasks, it's just not.
Are there specific M1-native apps you're referring to where a 3080 dusts it, or are you referring to game benchmarks? There are just too many factors in judging GPU performance on Macs with games: different APIs, translation layers, etc.

I mean, if you want high-performance gaming, no one would deny a PC with a dedicated GPU is where it's at, but for the market Apple is actually targeting, I would prioritize overall app performance where the GPU is involved. Apps like Infinity Pro (especially ones that take advantage of the unified architecture), where an M1 Max outperformed a $6k W6900X, for example.
 

huge_apple_fangirl

macrumors 6502a
Aug 1, 2019
769
1,301
If I were Intel, I'd be focusing primarily on the efficiency cores for multithreaded performance. They are significantly more efficient, so they could stuff a huge number of them onto one chip without blowing the TDP into the 300-400+ watt range.
Intel's "efficiency" cores are misleadingly named. They are not like Apple's "LITTLE" cores, which are genuinely power-efficient; they are area-efficient: decent performance for much less area than a P-core. They exist so Alder Lake is manufacturable, not for power efficiency. Any efficiency gains in Alder Lake come from the new process node, not the E-cores. Apple's P-cores are much more efficient than Intel's E-cores.
A chip with 16 performance cores and 48 efficiency cores would be a screamer on multicore workloads, but the marketing team would have to find a way to sell a high-performance chip with more efficiency cores than performance cores.

They are already putting more E-cores than P-cores into laptop chips, for exactly the reasons you pointed out: multithreaded workloads (without having to make huge chips).
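The area-versus-power distinction can be made concrete with rough relative numbers. These ratios are purely hypothetical, chosen only to illustrate the argument, not measurements of any real Intel or Apple core:

```python
# Hypothetical relative figures to illustrate area efficiency vs. power efficiency.
# None of these are measured values for Intel's or Apple's cores.
cores = {
    #          (perf, area, power), all relative to the P-core
    "P-core": (1.0, 1.00, 1.0),
    "E-core": (0.5, 0.25, 0.5),
}

for name, (perf, area, power) in cores.items():
    print(f"{name}: perf/area = {perf / area:.2f}, perf/watt = {perf / power:.2f}")
# Under these assumptions the E-core doubles perf/area while perf/watt is unchanged:
# it packs more multithreaded throughput into the same die area, not the same watts.
```

That is the distinction being drawn above: a core can be a big win per square millimeter of silicon without being any better per watt.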
 

Fragment Shader

macrumors member
Oct 8, 2014
33
37
Toronto
Maybe I missed something, but AFAIK TSMC is the only fab even close to mass produce 3nm and that has all been pre booked by Apple.

Intel itself has just gone from 14nm to 10nm (which is kinda like 7nm for TSMC) so jumping 2-3 steps in 1 year seems a tad bit optimistic.
Intel's new fabs aren't expected to be fully operational until 2025 regardless.
 

huge_apple_fangirl

macrumors 6502a
Aug 1, 2019
769
1,301
Are there specific M1-native apps that you're referring to where a 3080 dusts it, or are you referring to game benchmarks? There are just too many factors in judging the GPU performance on Macs by using games, different API's, translation layers, etc.
Well, which apps are native to both M1 and NVIDIA cards? macOS doesn't support NVIDIA GPUs anymore, so you have to use cross-platform benchmarks, which are obviously flawed; in those, NVIDIA is generally ahead. You're right that game benchmarks are very biased against the M1: many games aren't optimized for macOS, Metal, or ARM, so they don't tell you much. But in the version of Blender optimized for M1 (even though it's a beta), it is way, way behind the version optimized for NVIDIA's CUDA. Also consider that Apple is using many, many more transistors to compete with NVIDIA's performance. Now, Apple can afford to do this. But still. That's not to say Apple's GPUs are bad; they're not. It's shocking how good they are considering how recently Apple got into the game, and Apple is improving rapidly. In a few years, and hopefully with more developers adopting Metal, Apple will no doubt claim the crown.
 