
BanjoDudeAhoy

macrumors 6502a
Aug 3, 2020
921
1,624
Everywhere you look, companies are trying, and households are being pushed, to get more energy efficient, more environmentally friendly and sustainable… and then there's Intel and Nvidia announcing things like this or the GeForce RTX 4090.
But hey, at least you'll be gaming on Ultra settings, I guess. For about half a year, until the next best thing comes out.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Electricity was supposed to be plentiful & cheap, remember? https://emp.lbl.gov/news/berkeley-lab-study-investigates-how-plentiful

It's not about electricity costs, it's about basic common sense and responsible use of resources. A personal computer shouldn't use as much power as a cooking appliance. Of course, there are always individuals and organisations that require more computational resources, and they should definitely have them, but running a CPU this far outside the optimal performance/power segment of the curve is not the way to go. That's what massive multicore platforms like Xeons and Threadrippers are for.
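To make the curve argument concrete, here's a toy sketch (Python, with made-up voltage/frequency pairs rather than measurements of any real chip): dynamic power scales roughly with C·V²·f, and higher clocks need higher voltage, so performance per watt falls off as 1/V² the further up the curve you push.

```python
# Toy dynamic-power model: P ~ C * V^2 * f. The voltage/frequency pairs
# below are illustrative assumptions, not measurements of any real CPU.
C = 1.0  # lumped switching-capacitance constant (arbitrary units)

for f_ghz, volts in [(3.0, 0.90), (4.0, 1.10), (5.0, 1.30), (5.5, 1.45)]:
    power = C * volts**2 * f_ghz   # relative dynamic power
    perf_per_watt = f_ghz / power  # using frequency as the performance proxy
    print(f"{f_ghz:.1f} GHz @ {volts:.2f} V -> power {power:5.2f}, perf/W {perf_per_watt:.2f}")
```

The last 10% of frequency costs disproportionately more power, and that is exactly the part of the curve a 350W mode lives on.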
 

pastrychef

macrumors 601
Sep 15, 2006
4,754
1,453
New York City, NY
My previous daily driver was an i9-9900K based hackintosh and, several systems before that, I used a dual Xeon MacPro5,1. Earlier this year, I switched to an M1 Max Mac Studio.

Since switching to the Mac Studio, I've noticed that the air conditioner cools my studio apartment much better at lower settings than in the past, despite a brutally hot summer. I'm convinced Apple's approach is the right one and Intel's is a complete embarrassment.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
I find these developments both embarrassing and concerning. This power consumption inflation has to stop. This is not innovation.
Is it really inflation? Most boards won't support this mode, and the ones that do are going to be expensive. Plus this isn't for air cooling, and I suspect a lot of AIOs won't be able to handle the cooling requirements of this mode either. It really appears geared towards folks who are serious about overclocking.

I can see this mode + LN2 crushing the current 12900KS 7.8GHz record.

While I think it would be cool for Apple to dabble in this realm, I know they won't. They never have, and no one has complained about it yet.
 

rme

macrumors 6502
Jul 19, 2008
323
496
It's not about electricity costs, it's about basic common sense and responsible use of resources. A personal computer shouldn't use as much power as a cooking appliance. Of course, there are always individuals and organisations that require more computational resources, and they should definitely have them, but running a CPU this far outside the optimal performance/power segment of the curve is not the way to go. That's what massive multicore platforms like Xeons and Threadrippers are for.
I get what you mean, and I'm the same way. I appreciate efficiency. But not everyone is like that, and if a resource is plentiful & cheap...why not? 20 years ago many people thought metered internet would be the norm. ;-)
 

GuruZac

macrumors 68040
Sep 9, 2015
3,748
11,733
⛰️🏕️🏔️
This makes me appreciate the snappiness, battery life, and instant-on of my M1 MacBook Air even more. It sips power, never gets warm no matter what I throw at it, and when I restart it, it's back on in under 10 seconds.
 
  • Like
Reactions: KeithBN

leman

macrumors Core
Oct 14, 2008
19,521
19,675
Is it really inflation?

I believe so, yes. Of course, the 350W TDP mode is a marketing gimmick and you'd need some very special hardware to even hope to utilise it. But we have also seen noticeable power creep in both laptops and desktops over the last few years.

I get what you mean, and I'm the same way. I appreciate efficiency. But not everyone is like that, and if a resource is plentiful & cheap...why not? 20 years ago many people thought metered internet would be the norm. ;-)

Why not? Because it promotes lazy, wasteful engineering. And I am worried about the lasting effect it might have, as R&D resources are diverted from smart solutions and genuine improvements towards quick band-aids. Not to mention that pushing power consumption as the primary way to increase performance does not scale well. One is going to hit the limit soon, at which point all of this just comes crashing down.
 

Bug-Creator

macrumors 68000
May 30, 2011
1,783
4,717
Germany
But we have also seen noticeable power creep in both laptops and desktops over the last few years.

Make that decades (with a few bumps and plateaus).

I still remember the early 486 needing a heatsink (386 and below were mostly still running naked) and the DX2/DX4 needing a small fan. A 400W PSU was considered big, and only needed when you planned to fill up a big tower with drives and cards.
 

pshufd

macrumors G4
Original poster
Oct 24, 2013
10,146
14,573
New Hampshire
Oh right, 10nm for Alder Lake. I guess the halcyon days of 14nm are over.

I have never had a bad experience with Intel (skipped the Pentium 4 completely, lol)… all my Intel systems have been reliable and good performers. But their total stagnation (how the heck was a 4790K still competitive for 4+ product cycles?) is just arrogance.

I hope they get their **** together, because AMD have demonstrated they're not the good guys either; just look at the fiasco of Zen 3 on 300-series chipsets. Strong competition will benefit the consumer. But this design approach, if true, shows they are kind of rudderless. Hopefully their R&D people are working on some good stuff.


Tbh I'm ready to see the return of POWER in mainstream SKUs so we can have an all-out silicon war again.

I have a 2014 iMac next to my M1 mini on my desk. The 2014 iMac is still great for office stuff, but the difference in thermals between it and the M1 mini is just insane. I will continue to use the iMac as I love the screen, videocam, speakers and microphones, and it is quite usable; I run CPU-intensive stuff on the mini. I'm just careful to try to avoid spinning up the fans on the iMac. The 2020 model would be better, as spreading workloads over more cores would be more efficient, but the prices on 2014 iMacs are pretty attractive just for the screen alone.
 

rme

macrumors 6502
Jul 19, 2008
323
496
Why not? Because it promotes lazy, wasteful engineering. And I am worried about the lasting effect it might have, as R&D resources are diverted from smart solutions and genuine improvements towards quick bandaids. Not to mention that pushing the power consumption as primary way to increase performance does not scale well. One is going to hit the limit — soon, at which point all of this stuff just comes crashing down quickly.
I agree it's lazy but also cheaper for Intel. Increasing efficiency would require more resources. Pentium 4 all over again.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
I believe so, yes. Of course, the 350W TDP mode is a marketing gimmick and you'd need some very special hardware to even hope to utilise it. But we also had a noticeable power creep in both laptops and desktops over the last few years.



Why not? Because it promotes lazy, wasteful engineering. And I am worried about the lasting effect it might have, as R&D resources are diverted from smart solutions and genuine improvements towards quick bandaids. Not to mention that pushing the power consumption as primary way to increase performance does not scale well. One is going to hit the limit — soon, at which point all of this stuff just comes crashing down quickly.
Folks here are treating this news as if every Intel chip is going to draw 350W, and that isn't true. None of the actual mobile chips will, and a vanishingly small number of desktop chips will. I am still not sure that accommodating extreme OCing counts as wasteful engineering, though.
 
  • Like
Reactions: RuralJuror

pastrychef

macrumors 601
Sep 15, 2006
4,754
1,453
New York City, NY
Folks here are treating this news as if every Intel chip is going to draw 350W, and that isn't true. None of the actual mobile chips will, and a vanishingly small number of desktop chips will. I am still not sure that accommodating extreme OCing counts as wasteful engineering, though.

Well, Intel's mobile chips can't keep up with Apple Silicon and it spurs posts like the following...

 

pshufd

macrumors G4
Original poster
Oct 24, 2013
10,146
14,573
New Hampshire
Intel's last earnings report was dismal, and maybe this gets them some bragging rights and a few sales, but it seems that AMD has momentum in traditional Intel markets and Apple is eating their lunch in efficiency. It will be interesting to see if and when Microsoft is really ready to go on ARM. I have not tried Windows on ARM yet myself, but it seems to be making pretty good progress. It may be time to give it a try.
 
  • Wow
Reactions: diamond.g

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
  • Like
Reactions: pastrychef

pshufd

macrumors G4
Original poster
Oct 24, 2013
10,146
14,573
New Hampshire
Yeah, it looks like for Intel to increase single-core performance they have to increase clock speed, which means they have to up the power draw if they cannot shrink the node the die is made on.

I've read that they are going to make a mobile line with only efficiency cores. It should be interesting to see what the efficiency is like. There is still one program from Windows that I like to use, and Ventura is going to include it, so I should have all that I want on macOS when Ventura comes out. I haven't tried Ventura yet, but I'm tempted to try it in a virtual machine on my 2014 iMac. I'm not sure if that will work, as Ventura requires AVX2 (I think) on Intel CPUs, and I'm not sure if those instructions are emulated in VirtualBox. I tried to run a Ventura virtual machine on my M1 Pro MacBook Pro, but it wouldn't launch.
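For what it's worth, on an Intel Mac you can at least check whether the host CPU reports AVX2 before setting up the VM. A minimal sketch, assuming the machdep.cpu.leaf7_features key that macOS exposes on Intel hardware (whether VirtualBox then passes the flag through to a guest is a separate question):

```python
import subprocess

# Query the CPU feature flags macOS exposes via sysctl (Intel Macs only);
# AVX2 appears under machdep.cpu.leaf7_features on CPUs that support it.
flags = subprocess.run(
    ["sysctl", "-n", "machdep.cpu.leaf7_features"],
    capture_output=True, text=True,
).stdout
print("AVX2 reported" if "AVX2" in flags.split() else "AVX2 not reported")
```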
 

leman

macrumors Core
Oct 14, 2008
19,521
19,675
I still remember the early 486 needing a heatsink (386 and below were mostly still running naked) and the DX2/DX4 needing a small fan. A 400W PSU was considered big, and only needed when you planned to fill up a big tower with drives and cards.

Some power consumption creep is expected as the technology matures. For example, early GPUs only consumed 40-50W max, but with the advent of gaming and later GPGPU there was a need for bigger and more powerful devices. But there have to be reasonable limits to everything. I think it's entirely unreasonable for a mid-range GPU to draw over 300W, but that's the future we are headed towards.


Folks here are treating this news as if every Intel chip is going to draw 350W, and that isn't true. None of the actual mobile chips will, and a vanishingly small number of desktop chips will. I am still not sure that accommodating extreme OCing counts as wasteful engineering, though.

High-end mobile Intel chips these days are advertised at 55W TDP, but you actually need 60W+ for them to operate at their declared frequencies. Not to mention that they will often operate at much higher wattages. I mean, just check out the tests notebookcheck did on CPU efficiency: they had a 1240P draw 60W in Cinebench multicore. That's more than twice the nominal TDP limit, and one can't argue that it's a short-term boost limit either - that benchmark takes more than a few seconds to run. The 12700H averaged 70W - still 25W higher than the nominal TDP. And sure, one can argue that it's the laptop vendors who set the limits higher, but it only shows how widespread this practice has become. Just a couple of years ago, a 45W Intel CPU running multicore CB averaged, well, 45W, as expected. But recently the TDP has become entirely meaningless, and it's not that different for CPU clocks either.
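If anyone wants to reproduce this kind of number rather than trust the spec sheet, here's a minimal sketch, assuming a Linux box with Intel's RAPL powercap interface exposed (the path can vary by machine, the counter wraps eventually, and newer kernels may require root to read it):

```python
import time

# Sample the cumulative package energy counter (microjoules) before and
# after a workload window, then divide by elapsed time to get average watts.
RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package 0 energy counter

def read_uj() -> int:
    with open(RAPL) as f:
        return int(f.read())

start_uj, t0 = read_uj(), time.time()
time.sleep(10)  # run Cinebench (or any workload) during this window
end_uj, t1 = read_uj(), time.time()

print(f"average package power: {(end_uj - start_uj) / 1e6 / (t1 - t0):.1f} W")
```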

 

pshufd

macrumors G4
Original poster
Oct 24, 2013
10,146
14,573
New Hampshire
High-end mobile Intel chips these days are advertised at 55W TDP, but you actually need 60W+ for them to operate at their declared frequencies. Not to mention that they will often operate at much higher wattages. I mean, just check out the tests notebookcheck did on CPU efficiency: they had a 1240P draw 60W in Cinebench multicore. That's more than twice the nominal TDP limit, and one can't argue that it's a short-term boost limit either - that benchmark takes more than a few seconds to run. The 12700H averaged 70W - still 25W higher than the nominal TDP. And sure, one can argue that it's the laptop vendors who set the limits higher, but it only shows how widespread this practice has become. Just a couple of years ago, a 45W Intel CPU running multicore CB averaged, well, 45W, as expected. But recently the TDP has become entirely meaningless, and it's not that different for CPU clocks either.


Not surprising. I have a 65-Watt i7-10700 in my desktop, but I've seen it using 90 Watts. Apparently PL2 is 105 Watts, so it can go even higher. What is crazy is how fast the power consumption ramps up while not even doing that much work. I have wondered if Intel's mobile processors operate the same way; 90 watts would be a ton of power in an ultrabook. Maybe PL2 is a lot lower for mobile processors?
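On Linux you can also read the PL1/PL2 limits the firmware or vendor actually configured, rather than guessing from the spec sheet. A hedged sketch, assuming the same RAPL powercap interface as in the sampling example above:

```python
# Read the configured package power limits: constraint 0 is the long-term
# limit (PL1) and constraint 1 the short-term limit (PL2) on most systems.
BASE = "/sys/class/powercap/intel-rapl:0"

for idx in (0, 1):
    with open(f"{BASE}/constraint_{idx}_name") as f:
        name = f.read().strip()        # typically "long_term" / "short_term"
    with open(f"{BASE}/constraint_{idx}_power_limit_uw") as f:
        limit_w = int(f.read()) / 1e6  # microwatts -> watts
    print(f"{name} (PL{idx + 1}): {limit_w:.0f} W")
```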

I use 2015 and 2021 MacBook Pros, and the performance of the 2015 is actually fine for most of the things that I do. The most noticeable differences are heat, fans and battery life.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,664
OBX
I've read that they are going to make a mobile line with only efficiency cores. It should be interesting to see what the efficiency is like. There is still one program from Windows that I like to use, and Ventura is going to include it, so I should have all that I want on macOS when Ventura comes out. I haven't tried Ventura yet, but I'm tempted to try it in a virtual machine on my 2014 iMac. I'm not sure if that will work, as Ventura requires AVX2 (I think) on Intel CPUs, and I'm not sure if those instructions are emulated in VirtualBox. I tried to run a Ventura virtual machine on my M1 Pro MacBook Pro, but it wouldn't launch.
We like to call those processors Intel Atom...
High-end mobile Intel chips these days are advertised at 55W TDP, but you actually need 60W+ for them to operate at their declared frequencies. Not to mention that they will often operate at much higher wattages. I mean, just check out the tests notebookcheck did on CPU efficiency: they had a 1240P draw 60W in Cinebench multicore. That's more than twice the nominal TDP limit, and one can't argue that it's a short-term boost limit either - that benchmark takes more than a few seconds to run. The 12700H averaged 70W - still 25W higher than the nominal TDP. And sure, one can argue that it's the laptop vendors who set the limits higher, but it only shows how widespread this practice has become. Just a couple of years ago, a 45W Intel CPU running multicore CB averaged, well, 45W, as expected. But recently the TDP has become entirely meaningless, and it's not that different for CPU clocks either.

For whatever reason Intel has stopped including base frequency with the 12th-gen CPUs, so it is hard to say what the frequency should be at the "base" power draw. They do admit to really high draw for Max Turbo, though, so it isn't like they are lying. Vendors don't have a whole lot of choice about reining in the power draw due to competition (why buy the Lenovo if the Dell can boost to higher frequencies, etc.).

Really, it is smart of Apple not to talk much about clocks, as it neatly sidesteps such comparisons.
 
  • Like
Reactions: VivienM

PsykX

macrumors 68030
Sep 16, 2006
2,744
3,919
I think Apple is pushing too hard on thermals.

For laptops: it's absolutely phenomenal, and they have to keep doing it.
For desktops (e.g. iMac, Mac Studio, Mac Pro): it's OK, but also a bit irrelevant. They should clock their processors a bit higher and care a little less about thermals.
 
  • Haha
Reactions: AlexMac89