
zarathu

macrumors 6502a
May 14, 2003
652
362
I have yet to get my M1 Pro's fans even to come on. Running Handbrake for 45 minutes at 100% on all 10 cores, I still could not hear the fans.
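If anyone wants to check this on their own machine, here's a rough sketch using macOS's built-in powermetrics tool. Caveat: sampler names and output vary by machine ("thermal" reports thermal pressure on Apple Silicon; "smc" reports fan RPM and temperatures on Intel Macs), and it needs sudo.

```python
# Minimal sketch: take one powermetrics sample and dump its raw output.
# powermetrics is Apple's built-in tool; run with sudo on macOS.
import subprocess

def sample_thermals(sampler: str = "thermal", interval_ms: int = 1000) -> str:
    """Return the raw text of a single powermetrics sample."""
    result = subprocess.run(
        ["sudo", "powermetrics", "--samplers", sampler,
         "-i", str(interval_ms), "-n", "1"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    # On Apple Silicon this prints the current thermal pressure level;
    # try sampler="smc" on an Intel Mac for fan RPM and temps.
    print(sample_thermals())
```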
 

januarydrive7

macrumors 6502a
Oct 23, 2020
537
578
What causes capacitors to fail and pop? Excessive heat, so it does play a role, while you claim it doesn't.
Trying to get out of that hole you dug for yourself?

As an anti-Apple evangelist, I'm sure you're familiar with Louis Rossmann, who specialized in logic board repair for Apple laptops. It's pretty clear, at least for Apple's laptops, that 99% of cap failures are due to shorts caused by liquid damage.

I'll let mr_roboto's previous comment answer your apparent gotcha further:
Thanks to heatpipes and other modern cooling technologies, heat can be removed from high power silicon and transferred far away without putting too much thermal energy into other nearby components.
 

mi7chy

macrumors G4
Oct 24, 2014
10,622
11,294
Explain how a heat pipe cools nearby capacitors?

Go back to where he said "electrolytic," which is different from polymer, until he took the afternoon to Google the difference.
 

kissmyasthma

macrumors newbie
Nov 1, 2021
5
1
Charlotte, NC
This is what I sit at all day long, every day, with my 16-inch MacBook Pro M1 Max, and I don't care. I trust Apple and their engineers enough that I don't lose sleep over this kind of stuff. If there's something wrong with the temps, the system should take care of it. If there is some catastrophic failure for some reason, that's why I have a Time Machine backup and AppleCare on the device.

I have the settings set to System Controlled and notice my temps stay low and the fans are still at zero RPM. It just works! I've heard the fans kick on fewer than 10 times that I know of. I don't know how they do it, but I'm still continually amazed at the power and capabilities of these new chipsets.

[attached screenshot of temperature and fan readings]
 

wilberforce

macrumors 68030
Aug 15, 2020
2,930
3,207
SF Bay Area
This is what I sit at all day long, every day, with my 16-inch MacBook Pro M1 Max, and I don't care. I trust Apple and their engineers enough that I don't lose sleep over this kind of stuff. If there's something wrong with the temps, the system should take care of it. If there is some catastrophic failure for some reason, that's why I have a Time Machine backup and AppleCare on the device.

I have the settings set to System Controlled and notice my temps stay low and the fans are still at zero RPM. It just works! I've heard the fans kick on fewer than 10 times that I know of. I don't know how they do it, but I'm still continually amazed at the power and capabilities of these new chipsets.

[attached screenshot of temperature and fan readings]
Presumably those are degrees Fahrenheit, in which case it is barely warm.
If they are Celsius, you would be having a meltdown.
 
  • Like
Reactions: mi7chy and jdb8167

mr_roboto

macrumors 6502a
Sep 30, 2020
856
1,866
Explain how a heat pipe cools nearby capacitors?

Go back to where he said "electrolytic," which is different from polymer, until he took the afternoon to Google the difference.
LMAO.


Wikipedia: "A polymer capacitor, or more accurately a polymer electrolytic capacitor, is an electrolytic capacitor (e-cap) with a solid conductive polymer electrolyte."

Back to me: as a rule, you can assume that any capacitor which has a polarity is a type of electrolytic capacitor. The electrolyte can be a liquid, as in old-school cylindrical aluminum electrolytics. Or it can be a solid, as in polymer capacitors. There's a huge number of different materials that have been used as the electrolyte in an electrolytic cap, and a huge array of different sizes and shapes of devices.

Do you get it yet, or do we need to go through yet another round? You seized on that one word "electrolytic" and tried to use it to make me look ignorant, but all along you've been doing nothing but exposing how ignorant you are.

Also, what a heat pipe does is provide a very low thermal resistance path between a heat source and a radiator. It doesn't directly cool other components like caps, it indirectly cools them by not having them get as hot in the first place. Without the heat pipe to channel most of the SoC's thermal energy output into the radiator, it would spread out and heat up stuff like the capacitors quite a bit more.
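To put rough numbers on that, here's a back-of-the-envelope sketch. Every value below is invented for illustration; none of them are Apple's actual specs.

```python
# Toy thermal-resistance model: heat divides between parallel paths,
# and temperature rise = power x thermal resistance. All numbers made up.
soc_power_w = 30.0   # hypothetical sustained SoC power
ambient_c = 25.0

r_heatpipe = 2.0     # degC/W: low-resistance path into the fan/radiator
r_board = 20.0       # degC/W: high-resistance path spreading through the board

def parallel(r_a: float, r_b: float) -> float:
    """Combined resistance of two heat paths in parallel."""
    return 1.0 / (1.0 / r_a + 1.0 / r_b)

# With the heat pipe, most of the heat takes the low-resistance path.
rise_with_pipe = soc_power_w * parallel(r_heatpipe, r_board)
# Without it, everything spreads through the board (the chip would
# throttle long before this, but it shows the direction).
rise_without_pipe = soc_power_w * r_board

print(f"Board temp near the SoC, with heat pipe:    ~{ambient_c + rise_with_pipe:.0f} C")
print(f"Board temp near the SoC, without heat pipe: ~{ambient_c + rise_without_pipe:.0f} C")
```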
 
  • Like
Reactions: Sikh

Sikh

macrumors 6502a
Mar 8, 2011
548
320
I mean, yeah, I just imagine these people acting this way if I brought this up in conversation in person. I'd be baffled. I guess it all depends on tone of voice too, which is hard here. Anyway, I come from the PC world and 100C is just insane. But if Apple says it's okay to run it that hot, I guess that's okay... I wouldn't want to wear out the fans for no reason.

By the PC world you mean Intel and AMD, right? Because even Microsoft's own SQx processors are ARM-based, and the temperatures for those CPUs will be different from Intel/AMD.

You are making assumptions, which is why the first page of replies (this is how far I've made it in this thread) is the way it is.

You don't know the operating temps for Apple's M1 chips, because I don't think they have published them. I did see you can push them to 110-115C before they start throttling, based on some testing someone did with probes and stuff.

Either way, anything you know about "traditional" processors I would ignore when it comes to ARM-based CPUs.
 
  • Haha
Reactions: oz_rkie

TimmuJapan

macrumors 6502
Jul 7, 2020
373
651
Here's a faulty assumption you've made: that the CPU being at a high die junction temperature necessarily means the rest of the system's components must also become as hot. Thanks to heatpipes and other modern cooling technologies, heat can be removed from high power silicon and transferred far away without putting too much thermal energy into other nearby components.


How do you know the data's actually on your side? Did you do component level failure analysis to figure out what actually failed in your devices, and why, or did you just assume it was temperature because you're predisposed to think it's always temperature? Did you collect a much larger data sample since one person's devices aren't enough to be considered valid statistical data?
Okay, well, there is an easy solution to this debate-- @mr_roboto , just let your hardware bake and bake, run at 100 C for long, long stretches each day, year after year after year.... and after 10 years, please dig up this post, and let us know how your hardware is doing.... You seem enthusiastic about letting your computer run at 100 C, non-stop, fans barely kicking in, for years and years--so I say, go for it, dude! Rock that hot CPU, work it to the bone non-stop, cook that laptop of yours, and let us know how it is doing after 10 years.... Better still, give us an update after 5 years. I hope that this laptop gives you many years of productivity, joy and titillating, warm thighs. Whether or not it dies after a few years, it will definitely keep you warm on those cold winter nights, so that will be a plus either way. ;)
 
Last edited:
  • Haha
  • Like
Reactions: Sikh and oz_rkie

Analog Kid

macrumors G3
Mar 4, 2003
9,360
12,603
Why would you keep changing settings? You find the best OC at the best power and fan curve for your situation and call it a day.


That's a weird way to look at it. I have a high-end gaming desktop that I spent a while initially setting up: dialing in a good OC at a good voltage curve and tuning the fans (maybe an hour or so to set up and a couple of hours of Prime95 to validate), and then I run it 24x7 with a good 5-10% of free performance ALL the time. My friend who also has a gaming desktop does not know about overclocking or tuning and does not care about it, which is completely fine. He never bothers going into the BIOS or installing and using Intel XTU. He can still use his computer just fine for everything that he needs it for. But those avenues are still there for those who want to make use of them.

It's easy enough to keep 'advanced' options tucked away in the UI so casual users don't even see them.

I think we obviously have different approaches to our hardware. I'd expect that, that's why there's options out there. But, while I understand that you're happy there are systems that let you tweak all the different parameters to see what you can squeeze out, I'd ask you to understand that there are people like me who want to keep things as simple as possible.

Maybe you don't remember the days of having to set your own IRQs and then discovering months later that all those frustrating crashes were because you introduced a subtle incompatibility. Or the days when MacOS would start and people would have a dozen system extensions that left everything configurable, but far less stable. I'm happy to leave that behind and just use a very good computer at very nearly optimal performance.

To me, 5-10% performance gains will be outdone by technology improvements in 6 months or so.

What does 5% better on a gaming desktop even mean? Instead of running at 60fps, it runs at 63fps? It sounds like bragging rights more than actual value for the effort. There's not anything wrong with that-- people have been hot rodding cars for decades because they enjoy it. But for daily driving, I don't want to have to search for my basic controls among a bunch of options I, and most people, don't care about. There are already way too many options in System Preferences as it is.

My computer is a tool. I buy it, I use it, eventually I replace it. I don't have the time or interest to spend hours polishing it to get 5% extra performance. Honestly, if a job takes an hour to run, that saves me 3 minutes. Am I monitoring my machine carefully enough for the 57 minutes it's busy to do anything useful with it in those 3 minutes? No. I'm on another machine doing something productive until it's done, I'm getting coffee while I wait, I'm sleeping while a batch runs.

If the job is short enough that I'm willing to sit and watch the progress bar, then 5-10% saves me so little time it's essentially imperceptible.
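The arithmetic, for anyone who wants to see it: a machine that is s% faster finishes a fixed job in 1/(1+s) of the time.

```python
# Time saved on a one-hour job at various speedups (pure arithmetic).
baseline_min = 60.0

for speedup in (0.05, 0.10, 0.50):
    new_min = baseline_min / (1.0 + speedup)
    print(f"{speedup:.0%} faster: job takes {new_min:.1f} min "
          f"(saves {baseline_min - new_min:.1f} min)")
# 5% faster saves ~2.9 minutes on an hour-long job; the 50% generational
# jump saves ~20 minutes, which actually changes how you wait.
```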

In a few years I'll get a machine that's 50% faster or more, often much more on specialized tasks. That makes a difference in my workflow in the sense that it changes what I can do while I wait versus what I do in the background; 5-10% doesn't do that.

And I'm certainly not willing to jeopardize reliability in the short term (kernel panic) or long term (failed components) to gain those few percent.
 
  • Like
Reactions: Sikh

oz_rkie

macrumors regular
Apr 16, 2021
177
165
I think we obviously have different approaches to our hardware. I'd expect that, that's why there's options out there. But, while I understand that you're happy there are systems that let you tweak all the different parameters to see what you can squeeze out, I'd ask you to understand that there are people like me who want to keep things as simple as possible.

Maybe you don't remember the days of having to set your own IRQs and then discovering months later that all those frustrating crashes were because you introduced a subtle incompatibility. Or the days when MacOS would start and people would have a dozen system extensions that left everything configurable, but far less stable. I'm happy to leave that behind and just use a very good computer at very nearly optimal performance.

To me, 5-10% performance gains will be outdone by technology improvements in 6 months or so.

What does 5% better on a gaming desktop even mean? Instead of running at 60fps, it runs at 63fps? It sounds like bragging rights more than actual value for the effort. There's not anything wrong with that-- people have been hot rodding cars for decades because they enjoy it. But for daily driving, I don't want to have to search for my basic controls among a bunch of options I, and most people, don't care about. There are already way too many options in System Preferences as it is.

My computer is a tool. I buy it, I use it, eventually I replace it. I don't have the time or interest to spend hours polishing it to get 5% extra performance. Honestly, if a job takes an hour to run, that saves me 3 minutes. Am I monitoring my machine carefully enough for the 57 minutes it's busy to do anything useful with it in those 3 minutes? No. I'm on another machine doing something productive until it's done, I'm getting coffee while I wait, I'm sleeping while a batch runs.

If the job is short enough that I'm willing to sit and watch the progress bar, then 5-10% saves me so little time it's essentially imperceptible.

In a few years I'll get a machine that's 50% faster or more, often much more on specialized tasks. That makes a difference in my workflow in the sense that it changes what I can do while I wait versus what I do in the background; 5-10% doesn't do that.

And I'm certainly not willing to jeopardize reliability in the short term (kernel panic) or long term (failed components) to gain those few percent.
I don't think you are understanding what I am saying, or perhaps you are purposefully ignoring it. I am not saying that every user should be tinkering or overclocking or whatnot. I was responding to some absurd points you were making.

Like this one
Having them does hurt me. It makes it harder to find the few things I actually care about. It makes it more likely that I forget to change them back to a different mode under different conditions.
I had already explained, using the BIOS as just one example, how it is completely possible to let advanced users experiment if they want to while more casual or uninterested users ignore it entirely. Feel free to use your computer at stock settings; that is perfectly fine. But saying that hidden-away, completely ignorable advanced options hurt you just by existing is laughable.

And I'm certainly not willing to jeopardize reliability in the short term (kernel panic) or long term (failed components) to gain those few percent.
Then don't. You don't have to OC or change settings you don't want to just because they are there.

And I'm not sure that the people who make use from them actually benefit from them... As here, often it's people playing with things they don't understand and parroting what they see on YouTube or Reddit.
And stuff like this is where you pull anecdotal evidence out of your behind. What an absurd thing to say. How do you know that people who overclock don't benefit from it? Having 5-10% (in some cases even more, depending on your setup) of free performance ALL the time is enormously useful for many use cases: using your machine as a build server compiling code all the time, dedicated video encoding machines, training AI models, just as a few examples. Essentially any sustained workload that can fully load your CPU or GPU cores, or that benefits from memory throughput, will benefit from overclocking.
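Rough math on what "free performance ALL the time" means for a sustained workload; the build-server numbers here are hypothetical.

```python
# Hypothetical build server running flat-out all day.
builds_per_day = 200
minutes_per_build = 10.0

for speedup in (0.05, 0.10):
    # A machine s% faster does each build in 1/(1+s) of the time.
    saved_per_day = builds_per_day * minutes_per_build * (1 - 1 / (1 + speedup))
    print(f"{speedup:.0%} overclock: ~{saved_per_day:.0f} machine-minutes/day, "
          f"~{saved_per_day * 365 / 60:.0f} hours/year")
```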

Having OC options available also has the benefit of enabling undervolting; i.e., if you have a really good quality (binned) chip, you can run it at the same performance for even less power.

In any case, like I've mentioned, there's nothing wrong with using your system stock because you don't have the time or the inclination to tinker, or for your use case that 5% is not worth it, or it's too hard for you; whatever, it's completely fine, nothing wrong with it. But saying stuff like 'having them hurts me' or 'people who overclock don't really understand what they are doing and are just parroting' only makes you look like a fool.
 
  • Haha
Reactions: Sikh

mi7chy

macrumors G4
Oct 24, 2014
10,622
11,294

Search for "polymer electrolytic capacitor" on Digikey and you get nothing

https://www.digikey.com/en/products...BwPYBsBPOAUyUxPxIGM0kDC0BLazalbFaptXJEALoBfIA

You have to search for "polymer capacitor" and under type it even says polymer.

https://www.digikey.com/en/products...YFcBOBDALigBABwPYBsBPOAUyUwGMVsUKBLNXJEAXQF8g

If you don't think temperature plays a role, open your eyes and look at the "lifetime @ temp" column. The majority are rated 1000 hours @ 105C. What do you think happens above 105C? It shortens the lifetime, or the cap pops. Basic stuff even a hobbyist knows.
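Here's that rating worked through with the usual rule of thumb that electrolytic cap life roughly doubles for every 10C below the rated temperature (an Arrhenius approximation; the rated point below just mirrors the "1000 hours @ 105C" spec style above).

```python
# Rule-of-thumb cap life model: life doubles per 10 C below the rating
# (and halves per 10 C above it). Arrhenius approximation, not a datasheet.
rated_life_h = 1000.0    # "1000 hours @ 105C" style rating
rated_temp_c = 105.0

def expected_life_h(temp_c: float) -> float:
    return rated_life_h * 2.0 ** ((rated_temp_c - temp_c) / 10.0)

for t in (115, 105, 85, 65):
    hours = expected_life_h(t)
    print(f"{t:>3} C: ~{hours:>7.0f} h (~{hours / (24 * 365):.1f} years continuous)")
# Above the rating, life halves per 10 C; below it, it stretches fast.
```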
 

oz_rkie

macrumors regular
Apr 16, 2021
177
165
Search for "polymer electrolytic capacitor" on Digikey and you get nothing

https://www.digikey.com/en/products...BwPYBsBPOAUyUxPxIGM0kDC0BLazalbFaptXJEALoBfIA

You have to search for "polymer capacitor" and under type it even says polymer.

https://www.digikey.com/en/products...YFcBOBDALigBABwPYBsBPOAUyUwGMVsUKBLNXJEAXQF8g

If you don't think temperature plays a role, open your eyes and look at the "lifetime @ temp" column. The majority are rated 1000 hours @ 105C. What do you think happens above 105C? It shortens the lifetime, or the cap pops. Basic stuff even a hobbyist knows.
Be careful, he will throw more Wikipedia links at you :p. Apparently the go-to resource for engineers.
 
  • Haha
Reactions: mi7chy

Analog Kid

macrumors G3
Mar 4, 2003
9,360
12,603
Jebus, this place has gotten toxic...

I don't think you are understanding what I am saying or perhaps purposefully ignoring it?

Then you go on to completely ignore what I'm saying... Not to mention that you keep talking about overclocking when the OP is clearly concerned about overheating and long term reliability.


But I do have to ask about this, though:

Stuff like using your machine as a build server compiling code all the time, dedicated video encoding machines, training AI models, just as few examples, essentially any sustained workloads that can fully load your cpu or gpu cores or benefit from memory throughput will benefit from overclocking.
They will benefit from a higher clock, but I remain unconvinced that anyone seriously pursuing those workloads benefits from overclocking. Do you imagine this is ever done in a production environment? By which I mean somewhere with the resources, capital and labor, to invest in purchasing or developing the right tools.

If you can point me to an article somewhere where a non-hobbyist is using overclocking in a serious way, for more than a YouTube headline, I'd be interested to see what they've found. Is Amazon tuning up the machines in their server rooms to maximize their fully loaded CPUs and GPUs? Is Microsoft setting BIOS parameters and stability testing their build machines? Pixar? They buy a ton of hardware, so I'd imagine if it was beneficial at the granular level then that benefit would scale up.

Or are they all fools too?
 
  • Like
Reactions: Sikh

oz_rkie

macrumors regular
Apr 16, 2021
177
165
Jebus, this place has gotten toxic...



Then you go on to completely ignore what I'm saying... Not to mention that you keep talking about overclocking when the OP is clearly concerned about overheating and long term reliability.


But I do have to ask about this, though:


They will benefit from a higher clock, but I remain unconvinced that anyone seriously pursuing those workloads benefits from overclocking. Do you imagine this is ever done in a production environment? By which I mean somewhere with the resources, capital and labor, to invest in purchasing or developing the right tools.

If you can point me to an article somewhere where a non-hobbyist is using overclocking in a serious way, for more than a YouTube headline, I'd be interested to see what they've found. Is Amazon tuning up the machines in their server rooms to maximize their fully loaded CPUs and GPUs? Is Microsoft setting BIOS parameters and stability testing their build machines? Pixar? They buy a ton of hardware, so I'd imagine if it was beneficial at the granular level then that benefit would scale up.

Or are they all fools too?
Lol, I am glad you mention Microsoft and Amazon. If you seriously think that Azure and AWS just plop their hardware into cases and turn it on without spending massive effort on tuning and optimization, then I don't know what to say. Big companies have been spending millions on R&D for optimized cooling, tuning, and the whole nine yards to squeeze more from their servers, and they do it at scale.

Microsoft literally has research specifically into cost-effective overclocking - https://www.microsoft.com/en-us/research/uploads/prod/2021/04/Zissou-Overclocking-ISCA21.pdf

Here are just a few of the many resources you can easily find on the topic (either already-deployed solutions, or top tech companies like Microsoft doing serious R&D).


Some quotes from this article -
Gamers have long used the combination of overclocking CPUs and water cooling to squeeze peak performance out of PCs. Can that same approach create more powerful cloud computing platforms?
New research from Microsoft suggests that it might. The company has been test-driving the use of overclocked processors running in immersion cooling tanks, and says the combination allows servers to perform at a higher level.
“Based on our tests, we’ve found that for some chipsets, the performance can increase by 20 percent through the use of liquid cooling,” said Christian Belady, distinguished engineer and vice president of Microsoft’s datacenter advanced development group. “This demonstrates how liquid cooling can be used not only to support our sustainability goals to reduce and eventually eliminate water used for cooling in datacenters, but also generate more performant chips operating at warmer coolant temperatures for advanced AI and ML workloads.”
This also goes to show that for many modern chips, the cooler the chip, the more performance can potentially be had from it. Sure, a chip can run fine at 100C, but cooling it down a bit more can give you higher boost clocks and thus more performance (not talking specifically about the M1 here, but in general with modern chips, thermal headroom can literally mean more performance a lot of the time).
Glad that you mention tuning as well, because as I mentioned in multiple posts, OC tools are not always about overclocking; they are also about tuning. Not only do the major compute providers invest heavily in their own tuning, but they also expose ways for the end user to do hardware-level tuning themselves. Here are some AWS docs that detail how their EC2 instances allow for fine-grained tuning of both the CPU and GPU to suit your workload.

Another serious, non-hobbyist scenario is big crypto farms, which have millions of dollars' worth of GPUs running with overclocked memory and finely tuned ASICs, with full-time staff responsible for them.

I guess all of these are not serious enough things. You have to realize that just because you might not be aware of something, it does not mean that thing does not exist. The world is bigger than your home office :p
 
Last edited:
  • Haha
Reactions: Sikh

leman

macrumors Core
Oct 14, 2008
19,521
19,677
What causes capacitors to fail and pop? Excessive heat, so it does play a role, while you claim it doesn't.

Good thing, then, that the M1 series produces a third of the heat of comparable x86 machines. If you are talking about temperature instead, most capacitors are rated up to 105C or even 125C.
 
  • Like
Reactions: Sikh

januarydrive7

macrumors 6502a
Oct 23, 2020
537
578
Okay, well, there is an easy solution to this debate-- @mr_roboto , just let your hardware bake and bake, run at 100 C for long, long stretches each day, year after year after year.... and after 10 years, please dig up this post, and let us know how your hardware is doing.... You seem enthusiastic about letting your computer run at 100 C, non-stop, fans barely kicking in, for years and years--so I say, go for it, dude! Rock that hot CPU, work it to the bone non-stop, cook that laptop of yours, and let us know how it is doing after 10 years.... Better still, give us an update after 5 years. I hope that this laptop gives you many years of productivity, joy and titillating, warm thighs. Whether or not it dies after a few years, it will definitely keep you warm on those cold winter nights, so that will be a plus either way. ;)
My 2012 15" MBP that runs at 100C day after day is still running strong.
 
  • Like
Reactions: Sikh

leman

macrumors Core
Oct 14, 2008
19,521
19,677
Okay, well, there is an easy solution to this debate-- @mr_roboto , just let your hardware bake and bake, run at 100 C for long, long stretches each day, year after year after year.... and after 10 years, please dig up this post, and let us know how your hardware is doing.... You seem enthusiastic about letting your computer run at 100 C, non-stop, fans barely kicking in, for years and years--so I say, go for it, dude! Rock that hot CPU, work it to the bone non-stop, cook that laptop of yours, and let us know how it is doing after 10 years.... Better still, give us an update after 5 years. I hope that this laptop gives you many years of productivity, joy and titillating, warm thighs. Whether or not it dies after a few years, it will definitely keep you warm on those cold winter nights, so that will be a plus either way. ;)

I spent almost a decade professionally responsible for IT in a mid-size university department. We almost exclusively used Mac laptops. I am talking about folks who run heavy-duty statistical simulations on their laptops overnight, and sometimes for days at a time. With all this, I don't recall a single hardware failure that could be linked to heat. I still have my old 2015 work MBP lying around that has been pushed through all kinds of crap, including prolonged gaming sessions, and it still works well. Of course, I don't use it anymore, as newer laptops are faster.

Your computer will be long dead of unrelated causes before temperature-induced damage becomes a factor. Laptops have less than a 50% chance of getting to that 10-year mark you mention in the first place. If you want your computer to be working after a decade, all you need is luck; your computing behavior plays only a secondary role here.
 
  • Like
Reactions: Queen6 and Sikh

venom600

macrumors 65816
Mar 23, 2003
1,310
1,169
Los Angeles, CA
Okay, well, there is an easy solution to this debate-- @mr_roboto , just let your hardware bake and bake, run at 100 C for long, long stretches each day, year after year after year.... and after 10 years, please dig up this post, and let us know how your hardware is doing.... You seem enthusiastic about letting your computer run at 100 C, non-stop, fans barely kicking in, for years and years--so I say, go for it, dude! Rock that hot CPU, work it to the bone non-stop, cook that laptop of yours, and let us know how it is doing after 10 years.... Better still, give us an update after 5 years. I hope that this laptop gives you many years of productivity, joy and titillating, warm thighs. Whether or not it dies after a few years, it will definitely keep you warm on those cold winter nights, so that will be a plus either way. ;)

You'd have a point if we didn't have 16 years of MacBooks running day after day at high temperatures approaching or at 100C. If what you're saying were true, you'd see a point where older MacBooks start failing in serious numbers due to heat-related issues. Only you don't, with the exception of specific failures like the bad GPUs in the 2009 and 2011 models or the bad SSDs in the 2015 MacBook. Instead you have a bunch of armchair electrical engineers who mistakenly assume that they know more than the people who designed these machines, because they read something on a forum or a tech YouTuber said something alarmist. You're looking for a problem that doesn't seem to exist. Let it go.
 
  • Like
Reactions: Sikh

Analog Kid

macrumors G3
Mar 4, 2003
9,360
12,603
The world is bigger than your home office :p
You truly are a joy to talk to...

Lol, I am glad you mention Microsoft and Amazon. If you seriously think that Azure and AWS just plop their hardware into cases and turn it on without spending massive effort on tuning and optimization, then I don't know what to say. Big companies have been spending millions on R&D for optimized cooling, tuning, and the whole nine yards to squeeze more from their servers, and they do it at scale.

Microsoft literally has research specifically into cost-effective overclocking - https://www.microsoft.com/en-us/research/uploads/prod/2021/04/Zissou-Overclocking-ISCA21.pdf

Here are just a few of the many resources you can easily find on the topic (either already-deployed solutions, or top tech companies like Microsoft doing serious R&D).


Some quotes from this article -
Gamers have long used the combination of overclocking CPUs and water cooling to squeeze peak performance out of PCs. Can that same approach create more powerful cloud computing platforms?
New research from Microsoft suggests that it might. The company has been test-driving the use of overclocked processors running in immersion cooling tanks, and says the combination allows servers to perform at a higher level.
“Based on our tests, we’ve found that for some chipsets, the performance can increase by 20 percent through the use of liquid cooling,” said Christian Belady, distinguished engineer and vice president of Microsoft’s datacenter advanced development group. “This demonstrates how liquid cooling can be used not only to support our sustainability goals to reduce and eventually eliminate water used for cooling in datacenters, but also generate more performant chips operating at warmer coolant temperatures for advanced AI and ML workloads.”
This also goes to show that for many modern chips, the cooler the chip, the more performance can potentially be had from it. Sure, a chip can run fine at 100C, but cooling it down a bit more can give you higher boost clocks and thus more performance (not talking specifically about the M1 here, but in general with modern chips, thermal headroom can literally mean more performance a lot of the time).
Glad that you mention tuning as well, because as I mentioned in multiple posts, OC tools are not always about overclocking; they are also about tuning. Not only do the major compute providers invest heavily in their own tuning, but they also expose ways for the end user to do hardware-level tuning themselves. Here are some AWS docs that detail how their EC2 instances allow for fine-grained tuning of both the CPU and GPU to suit your workload.

Another serious, non-hobbyist scenario is big crypto farms, which have millions of dollars' worth of GPUs running with overclocked memory and finely tuned ASICs, with full-time staff responsible for them.

I guess all of these are not serious enough things. You have to realize that just because you might not be aware of something, it does not mean that thing does not exist. The world is bigger than your home office :p

OK, so I hadn't realized that if I used the right BIOS setting I'd suddenly find a two-phase immersion cooler in my MBP, but I've learned something: you can make liquid from register settings.

So, let's see how your Googling lines up with your claims:

"Microsoft isn’t yet building entire data centers of immersion tanks"
So the answer to that one is no... As you say yourself, this is research.

“We anticipate observing an increase in the company’s hash rate and productivity through 2022, without having to rely solely on purchasing additional ASICs”
Cards based on custom ASICs, not standard PC hardware...

"The default C-state and P-state settings provide maximum performance, which is optimal for most workloads."
So changing from default isn't giving you even the 5-10% improvement you quoted, but it does mean you need to change the settings differently for different situations, as I suggested.

Macs don't use Nvidia cards, so there's not going to be a way to disable Nvidia autoboost, but there are ways to disable Intel's turbo boost if you want to.



I do appreciate you actually answering something with information in addition to the bile though. Yes, I concede that when the scale gets big enough, companies will invest in bespoke hardware and massive thermal dissipation systems to be able to run at higher clocks. They're doing it for 20% improvements on custom ASICs though, not 5% on a store bought PC. We know the big datacenters have been moving to custom hardware and massive cooling, that's not news.

That diversion is on me though. I should have been more precise. I thought it was obvious that custom hardware and dunk tanks were not what was being discussed. In this thread about laptops, what was being discussed was whether Apple should give you a setting to let you muck with low-level processor settings the way your Windows machine does.

My feeling was that it's not worth the trouble in off the shelf hardware. It's not of value to enough people-- most people don't understand what they're doing and to really benefit you'd need to tune to specific workflows and, in the cases you're describing, keep your machine in a cryotank.

This also goes to show that for many modern chips, the cooler it is, potentially more performance can be had from it. Sure a chip can run fine at 100C, but cooling it down a bit more can give you more boost clocks and thus more performance (not talking specifically about m1 here but in general with modern chips, thermal headroom can literally mean more performance a lot of the time)

Yes, that is why "performance/watt" is really just "performance". Just about every system from iPhone to datacenter is thermally limited, so performance is limited by the thermals.

"Cooling it down" from 100C doesn't let you boost the clocks though. Extracting heat at a higher rate lets you run the clocks at a higher rate while keeping the temperate at or below 100C. If you run the fans more slowly, the processor heats up to 100C at a lower clock rate and throttles. If you run the fans more quickly, the processor can run at a higher clock rate before the temperature reaches 100C and it is forced to throttle. It'll run to 100C either way.

As I have told you here:
If you think you'll get better performance, just go to system preferences and choose "High Power".
here:
You can always go into SysPrefs and set the power mode to "High Power" which is described as possibly causing more fan noise but optimizing for demanding tasks.
and here:
Or "low power" vs "automatic" vs "high power"?

Apple does give you a setting to run the fans at a higher speed to open up more performance.
 

throAU

macrumors G3
Feb 13, 2012
9,199
7,354
Perth, Western Australia
You should not apologize. You should ignore people who make you feel inferior for simply asking a question. This is a forum to ask those very questions. Whether it has been addressed before or not, it is still a valid question.

We come here to learn and not to insult others. I don't understand the condescending attitudes in many forums in general.

That's fine if you're asking a question; but if you are "warning people" about standard, within-spec, expected behaviour, it's a little less so.

There was no question in the original post, just unwarranted fear-mongering ("the CPU is about to melt"), so it was called out for what it was.

No harm no foul, guy now knows better, play on.


It's OK to be wrong, but if you are, and you post it like it's a fact, then expect to be called out for it.
 
  • Like
Reactions: Analog Kid

TimmuJapan

macrumors 6502
Jul 7, 2020
373
651
I spent almost a decade professionally responsible for IT in a mid-size university department. We almost exclusively used Mac laptops. I am talking about folks who run heavy-duty statistical simulations on their laptops overnight, and sometimes for days at a time. With all this, I don't recall a single hardware failure that could be linked to heat.
When a computer dies mysteriously, are any of us really sure of the exact reason, unless we study the board or components with some kind of specialized tools?

I had an iPad Air that died while gaming, two weeks after I bought it... It was certainly a defective product, but it was interesting that the constant gaming (which of course runs very hot) over two weeks seemed to bring forth its defect. Apple promptly replaced the device, and the one I have now still works great.

My 2011 MBP's hard drive finally crashed after weeks and weeks of highly intensive tasks at the start of the COVID pandemic. It was constantly hot as h%ll, and at the end of a two-week period, the hard drive died.

I also had a 2006 MacBook whose board died after a lot of consecutive heavy CPU tasks. It was also blazing hot for several weeks straight on its deathbed.

So it is true that I don't know for certain whether heat was the scientific reason for any of these failures... But all of these devices failed when they were noticeably hot, hot, hot, and had been for several weeks consistently. I would imagine that some of the devices you looked after for years were also noticeably hot when they failed... We can't say for certain that excessive heat was the cause, unless we study the board and the components with special tools, but the devices were certainly blazing hot, and for long stretches, when they failed, so there is some kind of connection.

I use a laptop fan stand now, repasted my other devices, and have adjusted the fan curves on my remaining Mac devices, some of which are very old. I haven't had any problems with hardware failure since then. Maybe there is a psychological comfort in keeping devices as cool as I can while using them, but everything that I have read about thermals and computers points to the same thing: a cool CPU and cool system is a happy CPU/system, and device longevity is connected to thermal performance and adequate cooling. I feel good about the fact that I err on the side of caution with thermals now and strive to keep my devices cool. It is comforting, I think I am taking good care of my devices by doing that, and hopefully they will last and last. My MacBook Pro 2011 and MacBook Pro 2012, both with SSDs, are still rocking, and I hope that my Intel MacBook Air 2020 will benefit from the tweaks I've made to improve its cooling and thermal performance for the next decade.
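For anyone wondering what "adjusting a fan curve" actually means: it's just a map from temperature to fan speed. A minimal sketch with made-up points, the kind of curve third-party fan utilities let you define:

```python
# Piecewise-linear fan curve: (temperature C, fan RPM) points, made up.
CURVE = [(40, 0), (60, 2000), (80, 4000), (95, 6000)]

def target_rpm(temp_c: float) -> float:
    """Linearly interpolate the fan speed for a given temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, r0), (t1, r1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return r0 + (r1 - r0) * (temp_c - t0) / (t1 - t0)

for t in (35, 55, 70, 90, 100):
    print(f"{t} C -> {target_rpm(t):.0f} RPM")
```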

But don't take it from me... take it from Intel. The last sentence from the link below:
"....PC cooling isn’t just good practice. It’s also important for getting the best performance from your build, and for potentially increasing the lifespan of your components."

https://www.intel.com/content/www/u...g-the-importance-of-keeping-your-pc-cool.html

It doesn't matter if it is Intel or Apple silicon. You want to keep computers cool while they are running. It can increase the lifespan... just like Intel mentions in the above link. Apple most definitely prioritizes fan noise over the best possible cooling. If inadequately cooled components fail after 5 years instead of adequately cooled components failing after 10 years, that's great business for Apple, and it keeps us coming back to buy new stuff faster than we might actually want to...
 
  • Love
Reactions: asdex