
jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
It’s getting a little silly watching you guys claim “apple had the best single core performance” and then, when someone says ”nope - Intel has been higher single core performance the whole time” your position suddenly is “most-efficient energy use”.
Except it is not true. The higher-end 11th-gen Core i9s came out months after the M1, so originally the M1 did have the best single-core performance, at least in Geekbench. I'm surprised no one has disputed the Geekbench 5 results. Benchmarks are mostly useless anyway.
 
  • Like
Reactions: eltoslightfoot

eltoslightfoot

macrumors 68030
Feb 25, 2011
2,547
3,099
It’s getting a little silly watching you guys claim “apple had the best single core performance” and then, when someone says ”nope - Intel has been higher single core performance the whole time” your position suddenly is “most-efficient energy use”.
Then you have nothing to worry about. You think Apple's M1 and M2 are bad compared to Intel. AMD is not a competitor either, so just go with that and be happy. Don't sweat Intel's current situation at all. They are clearly tip-top. :)

Quite honestly, it is getting a little tedious in this thread. If someone can't see what is going on with Apple's M line and Intel's 10-nanometer band-aid, not to mention AMD, then I have nothing for them.
 
  • Like
Reactions: jdb8167

pshufd

macrumors G4
Oct 24, 2013
10,146
14,573
New Hampshire
I bought an i7-10700 as I wanted to limit power consumption, and 65 watts was a reasonable compromise for 8 cores. Then I was monitoring it and saw power usage go up to 90 watts. And then I learned about PL2. The M1 mini running at 16 watts for its PL2 was an eye-opener.

I deliberately didn't get the 10700K or 10900 for power savings.

I also always run my 2021 MacBook Pro in Low Power Mode, and the performance cores are rarely used, which means I don't actually need really high Geekbench scores for my use. Single-core scores of 500 or higher are fine for what I do.
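For anyone who wants to watch PL1/PL2 behavior like this themselves, here is a minimal sketch that samples package power through the Linux intel-rapl powercap interface (it assumes a Linux box exposing /sys/class/powercap/intel-rapl:0 and root access; on macOS, Apple's powermetrics tool fills a similar role):

```python
# Minimal package-power sampler using the Linux intel-rapl powercap
# interface (assumed path below; needs root). Prints roughly one
# reading per second.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # CPU package 0 energy counter

def read_energy_uj():
    with open(RAPL) as f:
        return int(f.read())

prev = read_energy_uj()
while True:
    time.sleep(1.0)
    cur = read_energy_uj()
    delta_uj = cur - prev        # microjoules consumed in ~1 second
    prev = cur
    if delta_uj >= 0:            # skip samples where the counter wrapped
        print(f"package power: {delta_uj / 1e6:5.1f} W")
```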
 

Colstan

macrumors 6502
Jul 30, 2020
330
711
While I am reluctant to post anything from Max Tech, when they stay in their lane, they do a decent job. Vadim has laid out how badly Intel has performed with perf/watt, how Meteor Lake's delay is going to hurt the company, and that Apple is likely to initially be TSMC's only major 3nm customer because of that delay.


Also, he notes that it's possible, if not likely, that the M2 Pro/Max/Ultra/Extreme are going to be on 3nm, something backed up by Opteron architect Cliff Maier, who knows the engineers at Apple from his time at AMD and Exponential. When I asked him to further elaborate, he replied with:

"Based on whispers, I am beginning to think that as unlikely as it seems, the next M2 variants are 3nm. I don’t understand how, but that seems to be the buzz."
 
  • Like
Reactions: eltoslightfoot

jdb8167

macrumors 601
Nov 17, 2008
4,859
4,599
While I am reluctant to post anything from Max Tech, when they stay in their lane, they do a decent job. Vadim has laid out how badly Intel has performed with perf/watt, how Meteor Lake's delay is going to hurt the company, and that Apple is likely to initially be TSMC's only major 3nm customer because of that delay.


Also, he notes that it's possible, if not likely, that the M2 Pro/Max/Ultra/Extreme are going to be on 3nm, something backed up by Opteron architect Cliff Maier, who knows the engineers at Apple from his time at AMD and Exponential. When I asked him to further elaborate, he replied with:
I always thought that was a possibility though admittedly I expected Apple to call the TSMC N3 SoCs the M3. M2 for MacBook Air and iPads and then M3 Pro, M3 Max, and M3 Ultra for the rest. It makes sense based on the TSMC timetables for N5P and N3 as long as the plan is to have the Pro, Max, and Ultra lines out next spring.
 

pshufd

macrumors G4
Oct 24, 2013
10,146
14,573
New Hampshire
I always thought that was a possibility though admittedly I expected Apple to call the TSMC N3 SoCs the M3. M2 for MacBook Air and iPads and then M3 Pro, M3 Max, and M3 Ultra for the rest. It makes sense based on the TSMC timetables for N5P and N3 as long as the plan is to have the Pro, Max, and Ultra lines out next spring.

They could just jump to M3 from M2 Air and M2 Pro. It would make buyers feel like they would be getting something more - which, in this case, is true. I do not plan to upgrade my 2021 MacBook Pro but I could be teased if it had much better battery life.
 

TinyMito

macrumors 6502a
Nov 1, 2021
862
1,224
First time seeing this thread. I find it funny when people say "not looking pretty for Apple" lol, it's entirely an apples-and-oranges comparison.
 
  • Like
Reactions: Tagbert

pshufd

macrumors G4
Oct 24, 2013
10,146
14,573
New Hampshire
It looks like things are not looking so pretty at Intel.

Intel ARC is the strangest story.

It was rumored to be out what, two years ago? Intel is actually pretty good in integrated graphics and there was a huge demand at the low-end to the midrange during the crypto-boom. They could have made low-end cards and sold as many as they wanted to because those were getting priced at 2xMSRP in AMD and nVidia. I suspect that Intel wanted a slice of the midrange and high-end and I guess that they've run into a lot of problems. It would have been a lot easier to just scale their iGPUs to double and they would have had the low-end market to themselves.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
Intel ARC is the strangest story.

It was rumored to be out what, two years ago? Intel is actually pretty good in integrated graphics and there was a huge demand at the low-end to the midrange during the crypto-boom. They could have made low-end cards and sold as many as they wanted to because those were getting priced at 2xMSRP in AMD and nVidia. I suspect that Intel wanted a slice of the midrange and high-end and I guess that they've run into a lot of problems. It would have been a lot easier to just scale their iGPUs to double and they would have had the low-end market to themselves.
Assuming there isn't an actual problem with the scheduler hardware, Intel's biggest issue right now is poor drivers. With iGPUs they never really had to worry about getting games working, but they do with a dGPU (otherwise why bother, right?).
 

dmr727

macrumors G4
Dec 29, 2007
10,667
5,766
NYC
Intel ARC is the strangest story.

It really is - if not for Ars I wouldn't know much about it. It'd be cool to have a competitive alternative to AMD and nVidia, and I don't even necessarily mean on the high end. Just something with good midrange performance.
 
  • Like
Reactions: eltoslightfoot

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
Intel ARC is the strangest story.

It was rumored to be out what, two years ago?

No, not Arc ('DG2') two years ago.

You might be thinking of DG1.

https://www.theverge.com/2020/1/9/2...crete-gpu-graphics-card-announcement-ces-2020

DG1 was just the same iGPU as in Tiger Lake, slapped into a package so Intel would have a public test mule to gather feedback on. However, in 20/20 hindsight it should have been an indicator that drivers were going to be a problem. It only worked with special firmware, and the driver very much wanted the hardware to be treated like an iGPU (tight coupling with the CPU package). At the very least, there was the expectation that resizable BAR (ReBAR) was going to play a very big role.

However, Intel has been running the "we are seriously coming for the dGPU" hype train for more than two years. They hired Raja Koduri back in 2017-2018. There were projects in flight before Koduri arrived, but Intel kept pushing the visibility up each year from 2018 on (it was a nice offset to the increasing problems they were having on the CPU side).

DG2 was likely supposed to be out in Q4 2021 in pre-pandemic planning. (The pandemic likely shot giant holes into their coordinated, hyper-optimistic software efforts.) Intel has been running behind, so it will be almost a year late for the 'sexy high-midrange' stuff.



Intel is actually pretty good in integrated graphics and there was a huge demand at the low-end to the midrange during the crypto-boom.

That's a bit of a double-edged sword. Their driver stack has so many implicit iGPU assumptions piled into it that chasing down all of those assumptions (with iGPUs remaining the vast bulk of their 'GPU' business) makes it a liability as well as an asset.

Intel probably would have been better served by focusing on mobile GPUs first. Maybe some "Pro" GPU cards where they don't have to chase every quirky API option in dozens of different games and APIs, and where a bigger premium is put on stability. Leave the very large discrete cards to the Xe-HPC (Ponte Vecchio) scope, where there is pragmatically no video out to worry about (it is primarily about GPGPU computational workloads).

By including the midrange market, they scooped up a major obligation to cover an extremely wide set of gaming issues. It is a balkanized space: DX11, DX12, and Vulkan.

Intel's iGPU drivers were "good" in part because the scope they tried to cover wasn't overly broad (they were not particularly good in the sense of being highly optimized while extremely stable). Nor were they particularly early adopters of the Vulkan/DX12/Metal model of shifting lots of optimization decisions into the application (or, at best, into a shared game/render engine).

Instead, what Intel did on the software side was go after CUDA's library strengths (with oneAPI), prioritize effort on DX12 (where they had the thinnest expertise), and spend lots of time trying to couple the GPU to new Intel CPUs/iGPUs in a discrete-card context.


They could have made low-end cards and sold as many as they wanted to because those were getting priced at 2xMSRP in AMD and nVidia.

The margins on low-end cards are very thin. There is a pretty good chance their push to grab a larger share of the midrange margin was meant to raise the overall aggregate margin across the GPU product line. Getting into GPUs was going to be expensive: all the production is contracted out (more expensive than doing it internally), and they are using a bunch of externally developed EDA tools (not that the internal ones were better, but they were probably 'cheaper' if viewed through penny-pinching internal accounting lenses). There is also lots more software, which likely means lots more bodies and overhead to pay for.

If the start-up costs for the dGPU business are $700M, then targeting 4M GPUs at an average margin of $50 allows better amortization than targeting 2M GPUs at an average margin of $25. That would be true even before counting the giant debacle a huge software blunder could be (and is) costing them. 2M very, very small GPUs wouldn't give them much negotiating leverage with TSMC either, as it is a much smaller aggregate wafer order.
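As a rough illustration of that amortization point, here is a short sketch using the hypothetical figures from the paragraph above (the $700M, the volumes, and the margins are illustrative numbers from the post, not actual Intel financials):

```python
# Back-of-envelope amortization with the hypothetical figures quoted above.
startup_cost = 700e6  # $700M to stand up the dGPU business (illustrative)

scenarios = [
    ("midrange push", 4e6, 50),   # 4M GPUs at ~$50 average margin
    ("low-end only",  2e6, 25),   # 2M GPUs at ~$25 average margin
]

for name, units, margin in scenarios:
    contribution = units * margin
    print(f"{name}: ${contribution/1e6:.0f}M margin, "
          f"${startup_cost/units:.0f} of start-up cost per unit, "
          f"{startup_cost/contribution:.1f} 'years' to pay it off")
```

Either way the fixed cost dwarfs the per-unit contribution at low-end-only volumes, which is the argument being made.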





I suspect that Intel wanted a slice of the midrange and high-end and I guess that they've run into a lot of problems. It would have been a lot easier to just scale their iGPUs to double and they would have had the low-end market to themselves.

The very, very high end of computational data-center cards? Yes. The high-end gaming card market? No. Intel may have used "enthusiast" in a loose way to refer to 3060-3070/5600-5700 class performance, but for folks not wearing rose-colored glasses, they haven't been shooting for the upper-mid to high range at all.

The initial talk was that there would be a range of Xe products. In 2020, they had this chart:


[Image: Intel Xe product-range chart, Intel-4_25.jpg]



Earlier, before that, there was just Xe-LP, Xe-HP, and Xe-HPC (go back and look at the Xe product-range slide in the DG1 article linked earlier).

Xe-HP and Xe-HPC were more data-center-focused cards. HP was skewed toward video encode/decode and server-room 'display' hosting. HPC was very much focused on supercomputer compute.

That Xe-HPG thing crept in later, when the Xe-HP card ran into problems. Some folks point to the diagram and say Xe-HPG was supposed to cover both mid-range and enthusiast in the first generation. I don't think so. At best, that "enthusiast" tier was a long-term aspirational thing. There was little practical way they could do both Xe-HP and a whole set of high-end Xe-HPG at the same time. They might have hoped to push the server/workstation card down into the commercial market as a placeholder, but their plans were never about having a high-end Nvidia killer in generation one. When Xe-HP collapsed, that was almost certainly going to leave a short-term gap. They can't have whole GPU chips disappearing and not have gaps open up in the lineup, at the very least in the first generation.

Even in generation 2, I doubt they will cover the whole top end. They are falling a bit short of 3070 land this round, so just covering "3070-3080 land" would be an expansion. And there is tons of cruft to clean up at the Xe-HPC level... which would also take loads of resources (money, people, and time).

High-end gaming never really was on the roadmap until relatively recently. And even there it is squishy.
 
Last edited:

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
High-end gaming never really was on the roadmap until relatively recently. And even there it is squishy.
They need something to do with the dies that don't make the Xe-HPC cut...
 

deconstruct60

macrumors G5
Mar 10, 2009
12,493
4,053
They need something to do with the dies that don't make the Xe-HPC cut...

Those are different dies. They have a different mix of 64-bit units and matrix units, plus they are completely coupled to the exotic RAMBO caching subsystem.

A hefty portion of the design is shared, but there are deliberate gaps.

With Xe-HP, I think someone inside Intel may have been handwaving that, with different design constraints, they could do some overlap and "hand-me-downs". But that isn't what they scoped out several years ago.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
Those are different dies. They have a different mix of 64-bit units and matrix units, plus they are completely coupled to the exotic RAMBO caching subsystem.

A hefty portion of the design is shared, but there are deliberate gaps.

With Xe-HP, I think someone inside Intel may have been handwaving that, with different design constraints, they could do some overlap and "hand-me-downs". But that isn't what they scoped out several years ago.
Fair. I still feel like drivers are their biggest weakness; the hardware scores are mostly fine, but show-stopping bugs in games are not.
 

timothytripp

macrumors newbie
Feb 5, 2009
13
3
Source

The i9-13900K chip will be out later this year and we now have Geekbench results. Single core: 2133 and multi core: 23701.

In comparison, the M2 in the new MacBook Pro scored 1919 in single core and 8929 in multi core.

Sure, Apple is much better at performance per watt than Intel but it’s not a good look to fall behind in single core performance. Most day to day tasks are single core.

Apple upended the chip industry with the M1 but AMD and Intel came back swinging and it seems like Apple now needs to pull another rabbit out of the hat with the M3.
Personally I use laptops, and that really changes things. I happen to have 2 MacBook Pros side-by-side, and I use both of them throughout the day roughly equally. One is an i9 2.3GHz 8 core and the other is an M1 Pro. Performance feels roughly the same between these machines, with the exception that the M1 Pro is significantly faster at compiling code for mobile apps.

However, the BIG thing I notice between these two machines is that it's now 1pm and I started about 8am. My i9 has 35% battery left and has run hot all day (without any kind of iOS simulator running), whereas the M1 Pro has 87% battery left and has been cool to the touch all day (even with an iOS simulator running all day). This is pretty typical of my daily experience. I generally have to charge the i9 to get through the day, while the M1 Pro usually ends the day with 60% or more battery left and has never been hot to have on my lap.

Personally, I don't like Intel's approach to speed at any power/heat cost. I think the engineers at Intel are brilliant, talented people and if they would invest this brilliance in performance/watt they could steer the entire industry away from these CPU benchmarks that don't really matter anyway. My i9 sits at about 3-4% utilization and still sucks my battery dry and manages to stay pretty hot.

I know gaming is a key reason Intel and AMD strive for the best benchmarks, but gaming isn't really CPU bound the way it's GPU bound, and even video encoding, which certainly peaks out CPUs, is more of a GPU operation (at least it should be).

I'm anxious to see what the M2 Pro looks like, and I think the overall Geekbench scores for the M2 Ultra will be amazing but it's definitely not the deciding factor on where I spend my money any more. I want a laptop that will last at least a full day and preferably 2, without wasting heat and the Apple chips are just far ahead. Not that you can run a decent version of Windows on an ARM architecture yet, but I think that'll come and hopefully soon Intel will offer an APU with really good performance/watt.
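A straight-line projection from the battery numbers in that post (purely illustrative; real drain is workload-dependent and far from linear):

```python
# Straight-line projection from the battery percentages mentioned above.
# Assumes a constant drain rate over the day, which real use won't give you.
hours_elapsed = 5.0   # roughly 8am to 1pm

for name, pct_left in [("Intel i9 MacBook Pro", 35), ("M1 Pro MacBook Pro", 87)]:
    pct_used = 100 - pct_left
    drain_per_hour = pct_used / hours_elapsed
    print(f"{name}: ~{drain_per_hour:.1f} %/hr, "
          f"projected ~{100 / drain_per_hour:.1f} h on a full charge")
```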
 
  • Like
Reactions: Tagbert

JouniS

macrumors 6502a
Nov 22, 2020
638
399
I know gaming is a key reason Intel and AMD strive for the best benchmarks, but gaming isn't really CPU bound the way it's GPU bound, and even video encoding, which certainly peaks out CPUs, is more of a GPU operation (at least it should be).
Gaming has become more CPU bound than it used to be. As long as everyone had a quad-core CPU and single-core performance increased slowly, you could keep the same CPU for many GPU generations. But today, 6 physical cores is the most common option in the Steam hardware survey, and 8 cores is also popular, which gives game developers new resources to utilize. And because gamers buy high-refresh-rate monitors, 60 fps no longer means "perfect" performance. When people are looking for higher frame rates, CPU speed becomes more important.
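A tiny frame-time budget sketch makes the refresh-rate point concrete (illustrative only):

```python
# CPU frame-time budget at common refresh rates: game logic and draw-call
# submission must fit inside this window every frame, which is why higher
# target frame rates lean harder on single-core CPU speed.
for fps in (60, 120, 144, 240):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")
```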
 
  • Like
Reactions: pdoherty

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
Have you guys seen the "Extreme Performance mode" for the 13th-gen line? It allows the chips to pull 350W (word on the street is the 13900K can hit an all-P-core clock of 6.2 GHz in this mode). Sheesh.
 

pshufd

macrumors G4
Oct 24, 2013
10,146
14,573
New Hampshire
Have you guys seen the "Extreme Performance mode" for the 13th-gen line? It allows the chips to pull 350W (word on the street is the 13900K can hit an all-P-core clock of 6.2 GHz in this mode). Sheesh.

I am going to order a wall power measurement device and audit appliances in the house. I have a 2014 iMac 27 and an M1 Mac mini and I'm prepared to be shocked at the difference in power consumption.
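A minimal sketch of the kind of comparison a wall meter enables, with placeholder wattages and an assumed electricity rate rather than measured values:

```python
# Hypothetical annual electricity-cost comparison. The wattages and the
# $/kWh rate below are placeholders, not measurements.
rate_per_kwh = 0.20   # assumed electricity rate in $/kWh
hours_per_day = 8     # assumed daily usage

for name, avg_watts in [("2014 iMac 27 (guess)", 150), ("M1 Mac mini (guess)", 20)]:
    kwh_per_year = avg_watts / 1000 * hours_per_day * 365
    cost = kwh_per_year * rate_per_kwh
    print(f"{name}: ~{kwh_per_year:.0f} kWh/yr, ~${cost:.0f}/yr")
```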
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
I am going to order a wall power measurement device and audit appliances in the house. I have a 2014 iMac 27 and an M1 Mac mini and I'm prepared to be shocked at the difference in power consumption.
My coworker and I were talking about the 13th gen. He wants to upgrade, I figure it isn't worth it. Especially having to go water cooling. My current gaming rig pulls 550ish watts from the wall, and I am unexcited about the power requirements of the 7900 I am sure I will end up with this year.
 

pshufd

macrumors G4
Oct 24, 2013
10,146
14,573
New Hampshire
My coworker and I were talking about the 13th gen. He wants to upgrade, I figure it isn't worth it. Especially having to go water cooling. My current gaming rig pulls 550ish watts from the wall, and I am unexcited about the power requirements of the 7900 I am sure I will end up with this year.

I have a tenth gen and it gets the job done with a 550 Watt PSU. I figure my system uses about 300 watts total. This stuff with PL2 is just crazy.
 