
Unregistered 4U

macrumors G4
Jul 22, 2002
10,610
8,628
The issue is that they could be viewed as falling behind if they roll out an M2 Pro/Max/Ultra that can't compete with the latest offerings from Intel and AMD.
Even if they do fall behind, they’ll never NOT be making the fastest, most stable Macs that folks can legally buy. :)
 
  • Like
Reactions: KPOM

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
Adding to Xiao_Xi's insightful comment, leaked Geekbench scores put the i9-13900K in the same league as the M1 Ultra (around 24k multicore).

Given the likely 250+ W TDP of the i9-13900K, I'd say Apple Silicon is still way ahead of the competition.

That said, Intel is on a trajectory to close the performance-per-watt gap in the next 2-3 generations if they keep improving at this pace (and if Apple Silicon performance keeps improving by around 15-20% every year).
AMD and Intel sacrifice efficiency to chase the gaming crown. Both claim that their CPUs retain around 60-70% of peak performance at 1/4 of their normal power consumption.

[Attached image: intel.png]
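As a rough sanity check on that claim, here is a minimal sketch. The 1/4 power and 60-70% retained performance figures come from the claim quoted above; the 250 W baseline is an illustrative assumption, not a measurement.

```python
# Rough perf-per-watt implication of the "1/4 power, ~60-70% performance" claim.
full_power_w = 250                   # assumed unconstrained desktop power draw
full_perf = 1.00                     # normalized performance at full power

capped_power_w = full_power_w / 4    # the claimed quarter-power mode
capped_perf = 0.65                   # midpoint of the claimed 60-70% retention

ppw_full = full_perf / full_power_w
ppw_capped = capped_perf / capped_power_w

print(f"perf/W at full power: {ppw_full:.4f}")
print(f"perf/W at 1/4 power:  {ppw_capped:.4f}")
print(f"efficiency gain:      {ppw_capped / ppw_full:.1f}x")  # ~2.6x with these numbers
```

In other words, if the claim holds, the last ~35% of performance costs roughly three quarters of the power budget.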


By the way, that quote is from @mr_roboto. He and @leman have explained several times why Cinebench R23 is not a good cross-platform benchmark.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
AMD's Zen 4 would be insanely power efficient if they didn't have to push it to the limit because of the competition.
For example, the Ryzen 9 7950X achieves 80% of its peak performance at roughly 1/3 of the power (65 W vs 170 W).
It's not about what AMD has to do but what their customers want to do. If you buy a high-end desktop and someone tells you that you can make it 25% faster by using ~100 W more power (at most a ~25% increase in total system power), it's a very good trade-off.

When you have an application where performance matters, you should compare system-level performance to system-level power consumption. If a modest increase in power usage gives you a comparable increase in performance, it's probably a good deal, even if you are pushing individual components into the region of diminishing returns.
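A minimal sketch of that system-level framing, using the 65 W/170 W and 80%/100% figures quoted above; the non-CPU draw is an assumed number for a high-end desktop with a GPU under load, not a measurement.

```python
# System-level view of the eco-mode vs full-power trade-off described above.
rest_of_system_w = 350               # assumed GPU under load, RAM, SSD, fans, display

eco_cpu_w, eco_perf = 65, 0.80       # eco mode: 80% of peak performance
full_cpu_w, full_perf = 170, 1.00    # stock limit: full performance

eco_system_w = eco_cpu_w + rest_of_system_w     # 415 W
full_system_w = full_cpu_w + rest_of_system_w   # 520 W

power_increase = full_system_w / eco_system_w - 1   # ~25% more system power
perf_increase = full_perf / eco_perf - 1            # 25% more performance

print(f"system power increase: {power_increase:.0%}")
print(f"performance increase:  {perf_increase:.0%}")
```

With these assumed numbers, power and performance scale roughly in proportion at the system level; a lighter system (say, 150 W of non-CPU draw) would make the same CPU uplift look like a much worse deal.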
 

Technerd108

macrumors 68040
Oct 24, 2021
3,061
4,311
This whole idea of efficiency seems to be overblown in one use case: desktops. As long as the chip can be cooled, higher power use is not a primary concern for individuals; performance is. So if Intel can get more performance than Apple Silicon on the desktop by using more power, and it can be cooled, who really cares?

For mobile devices, Apple Silicon pretty much destroys AMD and Intel. Both have gotten better, and Intel has the most room to improve efficiency, but for now Apple Silicon has a pretty wide lead: it delivers great performance with even greater battery life, and the same performance on battery as when plugged in.

I think Intel will be in the lead once they can shrink their process node to 7 nm or less, because of the big/little architectural changes they have made to their CPUs. The new 12th-gen CPUs are pretty great and deliver better battery life, but they are still stuck on a less efficient process node. Once that changes, with the same big/little architecture, they may catch up quickly.

On laptops Apple is king for now. On desktops Intel and AMD are still better and the ability to use a more powerful GPU is also a plus. Interesting times for sure!
 

pshufd

macrumors G4
Oct 24, 2013
10,146
14,573
New Hampshire
This whole idea of efficiency seems to be overblown in one use case: desktops. As long as the chip can be cooled, higher power use is not a primary concern for individuals; performance is. So if Intel can get more performance than Apple Silicon on the desktop by using more power, and it can be cooled, who really cares?

What if it can't be cooled?

I've been spoiled by Apple Silicon. I can use Intel laptops but they don't have as good battery life and they get hot for doing just office stuff. I can do video production on a laptop and it stays cool. That's a revelation.

It's also nice to run cool, quiet and efficient in a M1 mini.

My office is unheated and uncooled. In the summertime, running Intel systems raises the temperature to where it's uncomfortable so I run more Apple Silicon and less Intel. It's cooler now so I am running more Intel. Not everyone has air conditioning and some people live in places where it's 80, 90, 100 degrees in the summer.
 
  • Like
Reactions: iPadified

Technerd108

macrumors 68040
Oct 24, 2021
3,061
4,311
What if it can't be cooled?

I've been spoiled by Apple Silicon. I can use Intel laptops but they don't have as good battery life and they get hot for doing just office stuff. I can do video production on a laptop and it stays cool. That's a revelation.

It's also nice to run cool, quiet and efficient in a M1 mini.

My office is unheated and uncooled. In the summertime, running Intel systems raises the temperature to where it's uncomfortable so I run more Apple Silicon and less Intel. It's cooler now so I am running more Intel. Not everyone has air conditioning and some people live in places where it's 80, 90, 100 degrees in the summer.

That is really not relevant to the discussion. Apple Silicon gets up to 100°C too. All of these modern CPUs run at the same peak temperature because of the thermal properties of silicon.

I was specifically talking about desktops. Obviously laptops are where Apple shines, as I said.

I would think that if you can afford Macs, you could afford some air conditioning. Any office running computers will do better in a cooled environment.
 
  • Like
Reactions: singhs.apps

pshufd

macrumors G4
Oct 24, 2013
10,146
14,573
New Hampshire
That is really not relevant to the discussion. Apple Silicon gets up to 100°C too. All of these modern CPUs run at the same peak temperature because of the thermal properties of silicon.

I was specifically talking about desktops. Obviously laptops are where Apple shines, as I said.

I would think that if you can afford Macs, you could afford some air conditioning. Any office running computers will do better in a cooled environment.

It would cost a lot more to put in air conditioning than consumer Macs cost. But it would be a waste of money because it would only be needed for three months out of the year. I generally try to keep my systems running at 30% load or less to keep them running cool.

None of my systems, Intel or Apple Silicon, run anywhere close to 100 degrees.
 

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
That is really not relevant to the discussion. Apple Silicon gets up to 100°C too. All of these modern CPUs run at the same peak temperature because of the thermal properties of silicon.

I was specifically talking about desktops. Obviously laptops are where Apple shines, as I said.

I would think that if you can afford Macs, you could afford some air conditioning. Any office running computers will do better in a cooled environment.
I agree. I think MacBooks shine with Apple Silicon. I was actually disappointed that Apple didn't push the Mac Studio more.

Perhaps a "High Power" mode could be added for the Studio and Mac Pro that could boost TDP well beyond the current limits.
 

Technerd108

macrumors 68040
Oct 24, 2021
3,061
4,311
It would cost a lot more to put in air conditioning than consumer Macs cost. But it would be a waste of money because it would only be needed for three months out of the year. I generally try to keep my systems running at 30% load or less to keep them running cool.

None of my systems, Intel or Apple Silicon, run anywhere close to 100 degrees.

I quoted peak temperature.

Ice pads and a fan and towel can be very helpful.

I understand air conditioning can be prohibitively expensive. They do have portable indoor air conditioners that can be placed near the devices as needed and don't cost more than a few hundred dollars.
 
  • Haha
Reactions: Argoduck

Unregistered 4U

macrumors G4
Jul 22, 2002
10,610
8,628
Perhaps a "High Power" mode could be added for the Studio and Mac Pro that could boost TDP well beyond the current limits.
That’s assuming it would even work. Well, we know it wouldn’t work for current chips as there would surely be a YouTube video, complete with grimacing face, talking all about it. But, it could be that the entire SoC is limited in ways that Intel wouldn’t be.

Apple Silicon is not like Intel where the CPU stands alone communicating with all the other pieces parts via careful alignment of dip switches. It could be that increasing the clock speed/power envelope far past the current threshold would cause some component, that on other motherboards is separate from the CPU, to not function properly. The speed and voltage of Apple Silicon chips could be more about “dialing in the right balance that can actually be produced in mass quantities” which simply may not lend itself to “juicing it and see what happens”.
 
  • Like
Reactions: mopatops

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
That’s assuming it would even work. Well, we know it wouldn’t work for current chips as there would surely be a YouTube video, complete with grimacing face, talking all about it. But, it could be that the entire SoC is limited in ways that Intel wouldn’t be.
Apple already does this today. The M CPU is overclocked compared to the A14/A15. It also adjusts clock speeds dynamically like any modern CPU.

Apple could, in theory, overclock the CPU even more - well past the efficiency sweet spot. However, cool and quiet has been the priority thus far. Perhaps Apple just hasn't gotten around to optimizing the power curve on the Studio. Maybe we will see the SoCs get pushed more when the Mac Pro comes out.
 

pshufd

macrumors G4
Oct 24, 2013
10,146
14,573
New Hampshire
I quoted peak temperature.

Ice pads and a fan and towel can be very helpful.

I understand air conditioning can be prohibitively expensive. They do have portable indoor air conditioners that can be placed near the devices as needed and don't cost more than a few hundred dollars.

I've never heard of anything more ridiculous. It's far easier to just get hardware that runs cool than to get equipment that runs hot and then use additional power to cool it off. It is nice to have a really quiet environment too.
 
  • Like
Reactions: Argoduck

Technerd108

macrumors 68040
Oct 24, 2021
3,061
4,311
I've never heard of anything more ridiculous. It's far easier to just get hardware that runs cool than to get equipment that runs hot and then use additional power to cool it off. It is nice to have a really quiet environment too.


Whatever. You are completely missing the point. But maybe that is the point.

Apple Silicon is great, but on desktop hardware Intel and AMD are still faster. You can talk fans and thermals all you want, but as long as the CPU can maintain a higher clock and be faster than Apple Silicon, the rest is just irrelevant.

Factor in faster GPUs with ray tracing and it's not much of a competition.

Again, I love Apple Silicon in general and on mobile specifically, but you have to call it as it is.
 

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
Whatever. You are completely missing the point. But maybe that is the point.

Apple Silicon is great, but on desktop hardware Intel and AMD are still faster. You can talk fans and thermals all you want, but as long as the CPU can maintain a higher clock and be faster than Apple Silicon, the rest is just irrelevant.

Factor in faster GPUs with ray tracing and it's not much of a competition.

Again, I love Apple Silicon in general and on mobile specifically, but you have to call it as it is.
I think your opinion is out of sync with the rest of the world regarding the use of electricity. Efficiency is the key driver today. Look at data centers and, for that matter, cryptocurrency. There are large economic and goodwill gains from efficiency in those setups.

So it is on the desktop, too: 500 W is far better than 1000 W. Considering, say, 1 million high-end PCs on the planet that are pushed to the limit, that is 0.5 GW of savings.
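For anyone checking the arithmetic, a minimal sketch; the 500 W per-machine saving and the 1 million machine count are the assumptions stated above.

```python
# Fleet-level savings implied by the 500 W vs 1000 W example above.
machines = 1_000_000                 # assumed number of high-end PCs pushed to the limit
saving_per_machine_w = 1000 - 500    # 500 W saved per machine

total_saving_w = machines * saving_per_machine_w
print(f"{total_saving_w / 1e9:.1f} GW")  # 0.5 GW, matching the figure above
```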

The relevant question is whether Apple can provide a lower power draw for the same high-end compute load. That is not obvious.

Edit TW>GW early in the morning here :)
 
Last edited:
  • Like
Reactions: Zorori and Conutz

russell_314

macrumors 604
Feb 10, 2019
6,664
10,264
USA
Even non-professionals should know the basic equation for battery life:

battery capacity in watt-hours / workload in watts = battery life in hours (not accounting for the LCD, etc.)

Even the M1 MacBook Pro has consequences when running heavier or full workloads unplugged, which leads to about 1.5 hours or less of battery life at full CPU+GPU load. Professionals tend to stay plugged in for full performance rather than accept short battery life. Stable Diffusion works the integrated GPU as hard as gaming does.


(Embedded video; forward to 5:20)
The thing is, even with the same workload and the same battery capacity, the M1 significantly outperforms anything from Intel. There's no universal formula where X amount of work uses Y amount of power regardless of chip. If you're doing X amount of work on an M1, it's going to use less power than doing X amount of work on Intel.

Of course it's going to use more power when it's under load; the savings come from how little power it uses when idle or under minimal load.
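Putting the quoted formula and the roughly 1.5-hour figure together in a minimal sketch; the ~100 Wh pack is the 16-inch MacBook Pro's nominal capacity, and the sustained draws are illustrative assumptions, not measurements.

```python
# battery life (h) = capacity (Wh) / sustained system draw (W), per the formula quoted above.
capacity_wh = 100                    # ~100 Wh pack (16-inch MacBook Pro)

def battery_hours(load_w: float) -> float:
    """Hours of runtime at a constant system draw; ignores display power and idle savings."""
    return capacity_wh / load_w

print(f"full CPU+GPU load (~65 W assumed): {battery_hours(65):.1f} h")  # ~1.5 h, as quoted
print(f"light office load  (~8 W assumed): {battery_hours(8):.1f} h")   # why idle efficiency matters
```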
 
  • Like
Reactions: Argoduck

Pressure

macrumors 603
May 30, 2006
5,179
1,544
Denmark
Apple already does this today. The M CPU is overclocked compared to the A14/A15. It also adjusts clock speeds dynamically like any modern CPU.

Apple could, in theory, overclock the CPU even more - well past the efficiency sweet spot. However, cool and quiet has been the priority thus far. Perhaps Apple just hasn't gotten around to optimizing the power curve on the Studio. Maybe we will see the SoCs get pushed more when the Mac Pro comes out.
If anything, you should say that they are underclocked, but then again it doesn't really matter, as the M1/M2 and A14/A15 are totally different SoCs targeting dissimilar form factors and TDPs. Remember that the System Level Cache (SLC) isn't the same across the A-series and M-series SoCs.
 
Last edited:

senttoschool

macrumors 68030
Nov 2, 2017
2,626
5,482
If anything, you should say that they are underclocked, but then again it doesn't really matter, as the M1/M2 and A14/A15 are totally different SoCs targeting dissimilar form factors and TDPs. Remember that the System Level Cache (SLC) isn't the same across the A-series and M-series SoCs.
No, the A series is clocked in a way that has the best combination of speed to TDP for the iPhone. It's not underclocked.

What I'm saying is to clock the M CPU so it has the best combination of speed to TDP for the Mac Studio/Mac Pro.
 

pshufd

macrumors G4
Oct 24, 2013
10,146
14,573
New Hampshire
Whatever. You are completely missing the point. But maybe that is the point.

Apple Silicon is great, but on desktop hardware Intel and AMD are still faster. You can talk fans and thermals all you want, but as long as the CPU can maintain a higher clock and be faster than Apple Silicon, the rest is just irrelevant.

Factor in faster GPUs with ray tracing and it's not much of a competition.

Again, I love Apple Silicon in general and on mobile specifically, but you have to call it as it is.

The vast majority of people will find an M1 or M2 Mac fine for their purposes. There is no need to run a hot CPU today for the vast majority of people. It is ridiculous to think that the average person needs the performance and thermals of a 7750 or 13900. And there are lots of people who don't like the amount of heat that Intel systems put out, even those with AC.
 

maflynn

macrumors Haswell
Original poster
May 3, 2009
73,682
43,740
There is no need to run a hot CPU today for the vast majority of people.
The problem, as I see it, is that both Intel and now AMD run hot even when not really doing intensive work. I've seen chatter about how Intel's (and now AMD's) chips will mean that air cooling is a thing of the past, and how even AIOs cannot keep up. Mostly hyperbole, to be sure, but the underlying concern is valid.
 

pshufd

macrumors G4
Oct 24, 2013
10,146
14,573
New Hampshire
The problem, as I see it, is that both Intel and now AMD run hot even when not really doing intensive work. I've seen chatter about how Intel's (and now AMD's) chips will mean that air cooling is a thing of the past, and how even AIOs cannot keep up. Mostly hyperbole, to be sure, but the underlying concern is valid.

My main Windows desktop is an i7-10700 (not 10700K) and it normally runs around 30-40 degrees at idle in the summer. It usually runs around 25-30 degrees at idle in the winter. One of my workloads gets it into the 40-50 degree range at about a 12 percent CPU load, and that's where I find it comfortable to run. The M1 mini runs the same workload at about a 30% CPU load (one of the programs runs very poorly under Rosetta 2). The system does have good air cooling. Older Intel CPUs have a base clock around 3-4 GHz and Turbo somewhat faster. That doesn't seem too bad on power consumption and thermals for a desktop.

Are you saying that they are going to run hotter with higher power consumption at idle? Or just light loads?
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
The problem, as I see it, is that both Intel and now AMD run hot even when not really doing intensive work. I've seen chatter about how Intel's (and now AMD's) chips will mean that air cooling is a thing of the past, and how even AIOs cannot keep up. Mostly hyperbole, to be sure, but the underlying concern is valid.
I dunno, I've seen that these chips are being allowed to run "full tilt," and if you pull back the power limit they don't run nearly as hot as shown. AMD also made a mistake in allowing AM4 coolers to work with AM5. Der8auer has shown that delidding a 7900X allows for a 20°C decrease in temps, meaning the IHS is too thick to wick away heat sufficiently.
 

maflynn

macrumors Haswell
Original poster
May 3, 2009
73,682
43,740
Are you saying that they are going to run hotter with higher power consumption at idle? Or just light loads?
Early reports seem to show the 7950X running at a near-constant 95°C.

My main Windows desktop is an i7-10700 (not 10700K) and it normally runs around 30-40
Mine runs in the 40°C range at idle (i7-11700K), but it seems both Intel and AMD are doing everything they can to squeeze out every bit of performance, including boosting the wattage.

Apple has done a fantastic job of producing a lot of performance from its ARM architecture, but AMD, with its 3D cache design, chiplets, etc., shows a level of innovation that Apple has to keep up with. I'm of the opinion that just making the on-die CPU larger, like they've been doing, can only go so far; there need to be other improvements if Apple is to keep up.

One potential issue is focus: it seems any CPU design will be for their entire product line and then customized for each product, i.e., the Ax and Mx basically share a common design. So they could be constrained there, or they can't focus too much on GPU performance because they have engineers working on power efficiency. What I'm trying to get at is that Intel and AMD have completely separate divisions working on CPUs and GPUs; I don't think Apple does, and they're further constrained by silicon real estate.

Apple has a lot of advantages with designing their own processors, but that can turn into disadvantages as Intel and AMD keep pushing the envelope.

I do wonder if history is about to repeat itself with regard to PowerPC (with Motorola and IBM failing to keep up with Intel).
 

pshufd

macrumors G4
Oct 24, 2013
10,146
14,573
New Hampshire
Early reports seem to show the 7950X running at a near-constant 95°C.


Mine runs in the 40°C range at idle (i7-11700K), but it seems both Intel and AMD are doing everything they can to squeeze out every bit of performance, including boosting the wattage.

Apple has done a fantastic job of producing a lot of performance from its ARM architecture, but AMD, with its 3D cache design, chiplets, etc., shows a level of innovation that Apple has to keep up with. I'm of the opinion that just making the on-die CPU larger, like they've been doing, can only go so far; there need to be other improvements if Apple is to keep up.

One potential issue is focus: it seems any CPU design will be for their entire product line and then customized for each product, i.e., the Ax and Mx basically share a common design. So they could be constrained there, or they can't focus too much on GPU performance because they have engineers working on power efficiency. What I'm trying to get at is that Intel and AMD have completely separate divisions working on CPUs and GPUs; I don't think Apple does, and they're further constrained by silicon real estate.

Apple has a lot of advantages with designing their own processors, but that can turn into disadvantages as Intel and AMD keep pushing the envelope.

I do wonder if history is about to repeat itself with regard to PowerPC (with Motorola and IBM failing to keep up with Intel).

I have a PowerMac G5 in my basement but I haven't used it in over a year. It's mainly a stand to store cases of tennis balls right now. I had to vacuum the thing out regularly or the fans went nuts.

I'm typing this on a 2014 iMac 27 and there's a 2010 iMac 27 next to it. These old Intel iMacs are still quite usable, and the thermals are fine for office stuff. This system was used for commercial video production before I bought it, so I could use it for production too, but I like a quiet desk, so I do that on an M1 mini.

I assume that Intel will make 65-watt versions of Raptor Lake desktop chips and then lower-power mobile parts, and even their 45-watt parts should be quite usable for most people. Perhaps AMD will do something similar. Apple definitely has issues with GPU performance for those who need it, but I think that the ability to do 4K video production on their low-end computers without using a lot of power is really amazing. We take it for granted now, but it feels like a revolution to me. If Apple can do custom silicon for applications that Mac users run a lot, then I can see that as an area of innovation for them. Even more so if that can't be done easily with the general-purpose CPUs from Intel and AMD.

The GPU in my Windows desktop is a GTX 1050 Ti, which is a 75-watt card. It actually used 30-40 watts in my normal daily use driving three 4K monitors, so it was a lot more than I needed. It would be more than enough for the typical consumer, and this is a card from around 2016, I think. Current cards are really expensive compared to what I paid for it, though you'd expect them to be dirt cheap given when they were developed. nVidia and AMD want to keep the paradigm that GPUs are expensive parts. I do not know who buys the high-end 30xx and 40xx stuff, but I'd guess that sales are not great.

This does give me an idea for a video on shopping for old Macs, as it's a popular topic on Reddit and even MacRumors. The typical consumer doesn't really know what they need to run applications well, and walking into a store will result in a salesperson selling them the most expensive computer the salesperson thinks they'd be willing to pay for.

Someone on Reddit bought a Mac Studio to run a program that would run fine on a five-year-old iMac - because Rosetta 2 performance is so bad with this application.

nVidia, Intel, and AMD have lost half to two-thirds of their value from their highs, and consumers globally are getting pinched by a rising-rate environment.


[Attached images: sc-1.png, sc-2.png, sc.png]
 