What?! I said nothing about through-hole.
Trying to get out of that hole you dug for yourself?

What causes capacitors to fail and pop? Excessive heat, so it does play a role while you claim it doesn't.
Thanks to heatpipes and other modern cooling technologies, heat can be removed from high power silicon and transferred far away without putting too much thermal energy into other nearby components.
Presumably those are degrees Fahrenheit, in which case it is barely warm.

This is what I sit at all day long, every day with my 16 inch MacBook Pro M1 Max, and I don't care. I trust Apple and their engineers enough that I don't lose sleep over this kind of stuff. If there's something wrong with the temps, the system should take care of it. If there is some catastrophic failure for some reason, that's why I have a Time Machine backup and AppleCare on the device.
I have the settings on System Controlled and notice my temps stay low and my fans are still at zero RPM. It just works! I've heard the fans kick on fewer than 10 times that I know of. Don't know how they do it, but I'm still continually amazed at the power and capabilities of these new chipsets.
LMAO. Explain how heat pipes cool nearby capacitors?
Go back to where he said "electrolytic" -- which is different from polymer -- until he took the afternoon to Google the difference.
I mean yeah, I just imagine these people acting this way if I brought this up in conversation in person. I'd be baffled. I guess it all depends on tone of voice too, which is hard here. Anyway, I come from the PC world and 100C is just insane. But if Apple says it's okay to run it that hot, I guess that's okay... I wouldn't want to wear out the fans for no reason.
Okay, well, there is an easy solution to this debate-- @mr_roboto , just let your hardware bake and bake, run at 100 C for long, long stretches each day, year after year after year.... and after 10 years, please dig up this post, and let us know how your hardware is doing.... You seem enthusiastic about letting your computer run at 100 C, non-stop, fans barely kicking in, for years and years--so I say, go for it, dude! Rock that hot CPU, work it to the bone non-stop, cook that laptop of yours, and let us know how it is doing after 10 years.... Better still, give us an update after 5 years. I hope that this laptop gives you many years of productivity, joy and titillating, warm thighs. Whether or not it dies after a few years, it will definitely keep you warm on those cold winter nights, so that will be a plus either way.

Here's a faulty assumption you've made: that the CPU being at a high die junction temperature necessarily means the rest of the system's components must also become as hot. Thanks to heatpipes and other modern cooling technologies, heat can be removed from high power silicon and transferred far away without putting too much thermal energy into other nearby components.
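To put rough numbers on that claim, here is a back-of-the-envelope steady-state sketch in Python. Every figure in it (power draw, thermal resistances, the 10% leakage fraction) is an invented assumption for illustration, not a measurement of any real machine:

```python
# Back-of-the-envelope steady-state model: T = T_ambient + P * R_thermal.
# Every number below is an illustrative assumption, not a measured value.

P_CPU = 30.0        # watts dissipated by the SoC (assumed)
T_AMBIENT = 25.0    # deg C air inside the chassis (assumed)

# Heat leaving through the intended path: die -> heatpipe -> fins -> air.
R_DIE_TO_AIR = 2.5  # deg C per watt for the whole stack (assumed)
t_junction = T_AMBIENT + P_CPU * R_DIE_TO_AIR   # 25 + 30 * 2.5 = 100 C

# A capacitor nearby is heated only by the fraction of power that leaks
# sideways through the PCB instead of going up the heatpipe (assumed 10%).
LEAK_FRACTION = 0.1
R_BOARD = 8.0       # deg C per watt lateral spread through the board (assumed)
t_capacitor = T_AMBIENT + LEAK_FRACTION * P_CPU * R_BOARD  # 25 + 24 = 49 C

print(f"die junction: {t_junction:.0f} C, nearby capacitor: {t_capacitor:.0f} C")
```

Under those made-up numbers the die sits at its 100 C limit while the capacitor an inch away sees about 49 C, which is the whole point of moving heat away through a pipe rather than letting it soak the board.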
How do you know the data's actually on your side? Did you do component level failure analysis to figure out what actually failed in your devices, and why, or did you just assume it was temperature because you're predisposed to think it's always temperature? Did you collect a much larger data sample since one person's devices aren't enough to be considered valid statistical data?
Why would you keep changing settings? You find the best OC at the best power and fan curve for your situation and call it a day.
That's a weird way to look at it. I have a high-end gaming desktop that I spent a while initially setting up: dialing in a good OC at a good voltage curve and tuning the fans (maybe an hour or so to set up and a couple hours of Prime95 to validate), and then I run it 24x7 with a good 5-10% of free performance ALL the time. My friend who also has a gaming desktop does not know about overclocking or tuning and does not care about it, which is completely fine. He does not bother ever going into the BIOS or installing and using Intel XTU. He can still use his computer just fine for everything that he needs it for. But those avenues are still there for those that want to make use of them.
It's easy enough to hide 'advanced' options tucked away in the UI so casual users don't even see these options.
I don't think you are understanding what I am saying, or perhaps purposefully ignoring it? I am not saying that every user should be tinkering or overclocking or whatnot. I was responding to some absurd points you were making.

I think we obviously have different approaches to our hardware. I'd expect that, that's why there are options out there. But, while I understand that you're happy there are systems that let you tweak all the different parameters to see what you can squeeze out, I'd ask you to understand that there are people like me who want to keep things as simple as possible.
Maybe you don't remember the days of having to set your own IRQs and then discovering months later that all those frustrating crashes were because you introduced a subtle incompatibility. Or the days when Mac OS would start up with a dozen system extensions that left everything configurable, but far less stable. I'm happy to leave that behind and just use a very good computer at very nearly optimal performance.
To me, 5-10% performance gains will be outdone by technology improvements in 6 months or so.
What does 5% better on a gaming desktop even mean? Instead of running at 60fps, it runs at 63fps? It sounds like bragging rights more than actual value for the effort. There's not anything wrong with that-- people have been hot rodding cars for decades because they enjoy it. But for daily driving, I don't want to have to search for my basic controls among a bunch of options I, and most people, don't care about. There are already way too many options in System Preferences as it is.
My computer is a tool. I buy it, I use it, eventually I replace it. I don't have the time or interest to spend hours polishing it to get 5% extra performance. Honestly, if a job takes an hour to run, that saves me 3 minutes. Am I monitoring my machine carefully enough for the 57 minutes it's busy to do anything useful with it in those 3 minutes? No. I'm on another machine doing something productive until it's done, I'm getting coffee while I wait, I'm sleeping while a batch runs.
If the job is short enough that I'm willing to sit and watch the progress bar, then 5-10% saves me so little time it's essentially imperceptible.
In a few years I'll get a machine that's 50% faster or more, often much more on specialized tasks. That makes a difference in my workflow in the sense that it changes what I can do while I wait versus what I do in the background, 5-10% doesn't do that.
And I'm certainly not willing to jeopardize reliability in the short term (kernel panic) or long term (failed components) to gain those few percent.
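For what it's worth, that arithmetic is easy to check. A quick sketch with made-up job lengths, showing how little wall-clock time a 5-10% speedup buys compared to a generational 50% jump:

```python
# Wall-clock time a given speedup actually buys, for a few hypothetical
# job lengths. Pure arithmetic, no real benchmarks behind it.

for job_minutes in (60, 10, 1):
    for speedup in (0.05, 0.10, 0.50):
        # A task that ran in t now runs in t / (1 + speedup).
        saved = job_minutes - job_minutes / (1 + speedup)
        print(f"{job_minutes:>3} min job, {speedup:>4.0%} faster: "
              f"saves {saved:5.2f} min")
```

On an hour-long job, 5% faster saves roughly three minutes, exactly as described above, while 50% faster saves twenty.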
I had already explained, using BIOS as just one example, how it is completely possible to let advanced users experiment if they want to while more casual or uninterested users completely ignore it. Feel free to use your computer at stock settings, that is perfectly fine, but saying stuff like hidden-away and completely ignorable advanced options hurt you just by existing is laughable.

Having them does hurt me. It makes it harder to find the few things I actually care about. It makes it more likely that I forget to change them back to a different mode under different conditions.
Then don't. You don't have to OC or change settings you don't want to just because they are there.

And I'm certainly not willing to jeopardize reliability in the short term (kernel panic) or long term (failed components) to gain those few percent.
And stuff like this where you pull anecdotal evidence out of your behind. What an absurd thing to say. How do you know that people who overclock don't benefit from it? Having 5-10% (in some cases even more, depending on your setup) of free performance ALL the time is enormously useful for many use cases. Stuff like using your machine as a build server compiling code all the time, dedicated video encoding machines, training AI models, just as a few examples; essentially any sustained workload that can fully load your CPU or GPU cores or benefit from memory throughput will benefit from overclocking.

And I'm not sure that the people who make use of them actually benefit from them... As here, often it's people playing with things they don't understand and parroting what they see on YouTube or Reddit.
polymer
Be careful, he will throw more Wikipedia links at you. Apparently the go-to resource for engineers.

Search for "polymer electrolytic capacitor" on Digikey and you get nothing:
https://www.digikey.com/en/products...BwPYBsBPOAUyUxPxIGM0kDC0BLazalbFaptXJEALoBfIA
You have to search for "polymer capacitor," and under Type it even says polymer.
https://www.digikey.com/en/products...YFcBOBDALigBABwPYBsBPOAUyUwGMVsUKBLNXJEAXQF8g
If you don't think temperature plays a role, open your eyes and look at the "lifetime @ temp" column. The majority are 1000 hours @ 105C. What do you think happens at >105C? It shortens lifetime, or the cap pops. Basic stuff even a hobbyist knows.
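For context, that "1000 hours @ 105C" rating feeds a standard rule of thumb: electrolytic lifetime roughly doubles for every 10C below the rated temperature and halves for every 10C above it. A minimal sketch of that derating math (Python; the rule is a vendor simplification, and polymer parts follow different, usually gentler, curves):

```python
# Rule-of-thumb lifetime for an electrolytic capacitor: life roughly
# doubles for every 10 C below the rated temperature (and halves for
# every 10 C above it). A simplification of the Arrhenius model that
# vendors publish; polymer caps derate differently.

def estimated_life_hours(rated_hours, rated_temp_c, actual_temp_c):
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10.0)

# A typical "1000 h @ 105 C" part from that Digikey listing:
for temp_c in (115, 105, 85, 65, 45):
    hours = estimated_life_hours(1000, 105, temp_c)
    print(f"{temp_c:>3} C: ~{hours:>9,.0f} h (~{hours / 8760:.1f} years continuous)")
```

Which is the point being argued on both sides: push the cap past its rating and the rated 1000 hours shrinks fast, but keep it even 40 C below the rating and the same part is good for years of continuous service.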
I don't think you are understanding what I am saying, or perhaps purposefully ignoring it?
They will benefit from a higher clock, but I remain unconvinced that anyone seriously pursuing those workloads benefits from overclocking. Do you imagine this is ever done in a production environment? By which I mean somewhere with the resources, capital and labor, to invest in purchasing or developing the right tools.

Stuff like using your machine as a build server compiling code all the time, dedicated video encoding machines, training AI models, just as a few examples; essentially any sustained workload that can fully load your CPU or GPU cores or benefit from memory throughput will benefit from overclocking.
Lol, I am glad you mention Microsoft and Amazon. If you seriously think that Azure and AWS just plop their hardware into cases and turn it on without spending massive effort on tuning and optimization, then I don't know what to say. Big companies have been spending millions in R&D on optimized cooling, tuning, and the whole nine yards to squeeze more from their servers, and they do it at scale.

Jebus, this place has gotten toxic...
Then you go on to completely ignore what I'm saying... Not to mention that you keep talking about overclocking when the OP is clearly concerned about overheating and long-term reliability.
But I do have to ask about this, though:
They will benefit from a higher clock, but I remain unconvinced that anyone seriously pursuing those workloads benefits from overclocking. Do you imagine this is ever done in a production environment? By which I mean somewhere with the resources, capital and labor, to invest in purchasing or developing the right tools.
If you can point me to an article somewhere where a non-hobbyist is using overclocking in a serious way, for more than a YouTube headline, I'd be interested to see what they've found. Is Amazon tuning up the machines in their server rooms to maximize their fully loaded CPUs and GPUs? Is Microsoft setting BIOS parameters and stability testing their build machines? Pixar? They buy a ton of hardware, so I'd imagine if it were beneficial at the granular level, then that benefit would scale up.
Or are they all fools too?
What causes capacitors to fail and pop? Excessive heat, so it does play a role while you claim it doesn't.
My 2012 15" MBP that runs at 100C day after day is still running strong.

Okay, well, there is an easy solution to this debate-- @mr_roboto , just let your hardware bake and bake, run at 100 C for long, long stretches each day, year after year after year.... and after 10 years, please dig up this post, and let us know how your hardware is doing.... You seem enthusiastic about letting your computer run at 100 C, non-stop, fans barely kicking in, for years and years--so I say, go for it, dude! Rock that hot CPU, work it to the bone non-stop, cook that laptop of yours, and let us know how it is doing after 10 years.... Better still, give us an update after 5 years. I hope that this laptop gives you many years of productivity, joy and titillating, warm thighs. Whether or not it dies after a few years, it will definitely keep you warm on those cold winter nights, so that will be a plus either way.
You are a lucky person, @januarydrive7.

My 2012 15" MBP that runs at 100C day after day is still running strong.
You truly are a joy to talk to...

The world is bigger than your home office.
Lol, I am glad you mention Microsoft and Amazon. If you seriously think that Azure and AWS just plop their hardware into cases and turn it on without spending massive effort on tuning and optimization, then I don't know what to say. Big companies have been spending millions in R&D on optimized cooling, tuning, and the whole nine yards to squeeze more from their servers, and they do it at scale.
Microsoft literally has research specifically into cost-effective overclocking: https://www.microsoft.com/en-us/research/uploads/prod/2021/04/Zissou-Overclocking-ISCA21.pdf
Here are just a few of the many resources you can easily find on the topic (either already-deployed solutions or top tech companies like Microsoft doing serious R&D):
Overclocking the Cloud? Immersion Cooling Could Enable Faster Servers
Microsoft has been test-driving the use of overclocked processors running in immersion cooling tanks, and says the combination can boost server performance by 20 percent. The ...
datacenterfrontier.com
Some quotes from this article:
Gamers have long used the combination of overclocking CPUs and water cooling to squeeze peak performance out of PCs. Can that same approach create more powerful cloud computing platforms?

New research from Microsoft suggests that it might. The company has been test-driving the use of overclocked processors running in immersion cooling tanks, and says the combination allows servers to perform at a higher level.

"Based on our tests, we've found that for some chipsets, the performance can increase by 20 percent through the use of liquid cooling," said Christian Belady, distinguished engineer and vice president of Microsoft's datacenter advanced development group. "This demonstrates how liquid cooling can be used not only to support our sustainability goals to reduce and eventually eliminate water used for cooling in datacenters, but also generate more performant chips operating at warmer coolant temperatures for advanced AI and ML workloads."

This also goes to show that for many modern chips, the cooler the chip runs, the more performance can potentially be had from it. Sure, a chip can run fine at 100C, but cooling it down a bit more can give you more boost clocks and thus more performance (not talking specifically about M1 here, but in general with modern chips, thermal headroom can literally mean more performance a lot of the time).
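To illustrate that last point, here is a toy model of how an opportunistic boost algorithm might trade coolant temperature for clocks. All of the constants are invented for illustration; real boost algorithms are far more elaborate than this:

```python
# Toy model of opportunistic boost: the chip raises power (and clocks)
# until the projected junction temperature hits its limit. All numbers
# are invented assumptions, not any vendor's actual parameters.

T_LIMIT = 100.0      # deg C junction limit (assumed)
BASE_CLOCK = 3.0     # GHz at base power (assumed)
GHZ_PER_WATT = 0.01  # extra clock bought per extra watt (assumed)
R_THERMAL = 0.4      # deg C per watt, die to coolant (assumed)
P_BASE = 60.0        # watts at base clock (assumed)

def boost_clock_ghz(coolant_temp_c):
    t_at_base = coolant_temp_c + P_BASE * R_THERMAL
    headroom_c = max(T_LIMIT - t_at_base, 0.0)
    extra_watts = headroom_c / R_THERMAL
    return BASE_CLOCK + extra_watts * GHZ_PER_WATT

for coolant in (50, 40, 30):   # cooler coolant -> more thermal headroom
    print(f"coolant at {coolant} C -> ~{boost_clock_ghz(coolant):.2f} GHz")
```

Every 10 C shaved off the coolant translates directly into watts of headroom, and therefore clocks, before the junction limit is reached, which is the mechanism the immersion-cooling result above is exploiting.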
Glad that you mention tuning as well, because as I mentioned in multiple posts, OC tools are not always about overclocking; they are also about tuning. Not only do the major compute providers invest heavily in their own tuning, but they also expose ways for the end user to do hardware-level tuning themselves. Here are some AWS docs that detail how their EC2 instances allow for fine-grained tuning of both the CPU and GPU to suit your workload.
Processor state control for Amazon EC2 Linux instances - Amazon Elastic Compute Cloud
Some Amazon EC2 instance types provide the ability for an operating system to control the processor C-states (sleep levels) and P-states (CPU frequency).
docs.aws.amazon.com

Optimize GPU settings on Amazon EC2 instances - Amazon Elastic Compute Cloud
Learn how to configure GPU setting optimizations on NVIDIA GPU instances.
docs.aws.amazon.com
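For the curious, the C-state and P-state knobs those docs describe surface on a Linux instance through the standard cpufreq/cpuidle sysfs tree. A read-only sketch (what actually appears varies by instance type, kernel, and driver):

```python
# Read-only peek at the P-state / C-state knobs as Linux exposes them
# through sysfs. Standard cpufreq/cpuidle paths; availability depends
# on the instance type, kernel, and active driver.

from pathlib import Path

cpu0 = Path("/sys/devices/system/cpu/cpu0")

# P-state side: frequency scaling governor and limits.
for name in ("scaling_governor", "scaling_min_freq", "scaling_max_freq"):
    f = cpu0 / "cpufreq" / name
    if f.exists():
        print(f"{name}: {f.read_text().strip()}")

# C-state side: the idle states the OS is allowed to use.
cpuidle = cpu0 / "cpuidle"
if cpuidle.is_dir():
    for state in sorted(cpuidle.glob("state*")):
        name = (state / "name").read_text().strip()
        latency_us = (state / "latency").read_text().strip()
        print(f"{state.name}: {name} (exit latency {latency_us} us)")
```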
Another serious, non-hobbyist scenario is big crypto farms, which have millions of dollars' worth of GPUs running with overclocked memory and finely tuned ASICs, with full-time staff responsible for them.
I guess all of these are not serious enough things. You have to realize that just because you might not be aware of something, it does not mean that thing does not exist. The world is bigger than your home office.
This also goes to show that for many modern chips, the cooler the chip runs, the more performance can potentially be had from it. Sure, a chip can run fine at 100C, but cooling it down a bit more can give you more boost clocks and thus more performance (not talking specifically about M1 here, but in general with modern chips, thermal headroom can literally mean more performance a lot of the time).
here: "If you think you'll get better performance, just go to System Preferences and choose 'High Power'."
and here: "You can always go into SysPrefs and set the power mode to 'High Power', which is described as possibly causing more fan noise but optimizing for demanding tasks."
Or "low power" vs "automatic" vs "high power"?
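If you'd rather check it from a script than from System Preferences, pmset can show it. A hedged sketch: pmset is a standard macOS utility, but the exact key name ("powermode" vs "lowpowermode") varies by machine and macOS version, so this greps for it rather than assuming one:

```python
# Peek at the current macOS power mode from a script. pmset ships with
# macOS; the key name for the power mode setting varies by machine and
# OS version, so we match loosely instead of assuming a specific one.

import subprocess

out = subprocess.run(["pmset", "-g", "custom"],
                     capture_output=True, text=True, check=True).stdout
for line in out.splitlines():
    if "powermode" in line.lower():
        print(line.strip())
```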
You should not apologize. Ignore people who make you feel inferior for simply asking a question. This is a forum to ask those very questions. Whether it has been addressed before or not, it is still a valid question.
We come here to learn and not to insult others. I don't understand the condescending attitudes in many forums in general.
When a computer dies mysteriously, are any of us really sure of the exact reason, unless we study the board or components with some kind of specialized tools?

I spent almost a decade being professionally responsible for IT in a mid-size university department. We almost exclusively use Mac laptops. I am talking about folks who run heavy-duty statistical simulations on their laptops overnight and sometimes for days at a time. With all this, I don't recall a single hardware failure that could be linked to heat.