Combine this with the upcoming 4000 series GPUs from NVidia and you could literally max out the capacity of a household circuit with one computer.
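For a rough sense of how close that gets to a single circuit's limit, here is a back-of-the-envelope sketch; the component wattages, the 90% PSU efficiency, and the 15 A / 120 V circuit with the usual 80% continuous-load allowance are all illustrative assumptions, not measured figures.

```python
# Back-of-the-envelope check of a worst-case gaming PC against a US
# household circuit. All component figures below are assumptions for
# illustration, not measurements of any specific product.

CPU_W = 350            # rumoured "extreme" CPU power mode
GPU_W = 600            # high-end GPU at its maximum power limit (assumed)
REST_W = 150           # motherboard, RAM, drives, fans, pump (assumed)
PSU_EFFICIENCY = 0.90  # wall draw is higher than the DC load

dc_load = CPU_W + GPU_W + REST_W
wall_draw = dc_load / PSU_EFFICIENCY

circuit_w = 15 * 120                 # 15 A breaker on a 120 V circuit
continuous_limit = 0.8 * circuit_w   # common 80% rule for continuous loads

print(f"Estimated wall draw: {wall_draw:.0f} W")
print(f"Circuit rating: {circuit_w} W, continuous limit: {continuous_limit:.0f} W")
print(f"Headroom left for monitor, lights, etc.: {continuous_limit - wall_draw:.0f} W")
```

Under those assumptions one machine already eats roughly 85% of the circuit's continuous budget before a monitor or anything else is plugged in.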
> I find these developments both embarrassing and concerning. This power consumption inflation has to stop. This is no innovation.

Electricity was supposed to be plentiful & cheap, remember? https://emp.lbl.gov/news/berkeley-lab-study-investigates-how-plentiful
> I wonder how much is too much?!

The power consumption here is nothing compared to a tumble dryer.
Soon, when people power on their high-end Intel + Nvidia computers at home, the breaker panel will trip.
> I find these developments both embarrassing and concerning. This power consumption inflation has to stop. This is no innovation.

Is it really inflation? Most boards won't support this mode, and the ones that do are going to be expensive. Plus this isn't for air cooling, and I suspect a lot of AIOs won't be able to handle the cooling requirements of this mode either. It really appears geared towards folks who are serious about overclocking.
> It's not about electricity costs, it's about basic common sense and responsible use of resources. A personal computer shouldn't use as much power as a cooking appliance. Of course, there are always individuals and organisations that require more computational resources, and they should definitely have them, but running a CPU this far outside the optimal segment of the performance/power curve is not the way to go. That's what massive multicore platforms like Xeons and Threadrippers are for.

I get what you mean, and I'm the same way. I appreciate efficiency. But not everyone is like that, and if a resource is plentiful & cheap... why not? 20 years ago many people thought metered internet would be the norm. ;-)
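Since this exchange partly turns on whether the electricity bill even matters, here is a quick cost estimate; the extra wattage, the hours of full load per day, and the $0.15/kWh rate are assumptions picked for illustration, not a claim about anyone's actual usage.

```python
# Rough annual electricity cost of running a CPU at an extreme power limit.
# Wattage delta, hours of full load per day, and price per kWh are all
# assumptions chosen for illustration; plug in local figures.

EXTRA_CPU_W = 350 - 125   # extra draw vs. a conventional ~125 W desktop limit (assumed)
HOURS_PER_DAY = 4         # hours of sustained full load per day (assumed)
PRICE_PER_KWH = 0.15      # USD, assumed residential rate

extra_kwh_per_year = EXTRA_CPU_W * HOURS_PER_DAY * 365 / 1000
extra_cost = extra_kwh_per_year * PRICE_PER_KWH

print(f"Extra energy per year: {extra_kwh_per_year:.0f} kWh")
print(f"Extra cost per year:   ${extra_cost:.2f}")
```

Under these assumptions the extra draw costs on the order of $50 a year, which is essentially the "plentiful & cheap... why not" position; the quoted objection is about waste and engineering priorities rather than the bill.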
Oh right. 10nm for alder lake. I guess the halcyon days of 14nm are over.
I have never had a bad experience with intel (skipped pentium 4 completely, lol)… all my intel systems have been reliable and good performers. But their total stagnation (how the heck was a 4790k still competitive for 4+ product cycles?) is just arrogance.
I hope they get their **** together, because AMD have demonstrated they're not the good guys either; just look at the fiasco of Zen 3 on 300-series chipsets. Strong competition will benefit the consumer. But this design approach, if true, shows they are kind of rudderless. Hopefully their R&D people are working on some good stuff.
Tbh I'm ready to see the return of Power in mainstream SKUs so we can have an all-out silicon war again.
> Why not? Because it promotes lazy, wasteful engineering. And I am worried about the lasting effect it might have, as R&D resources are diverted from smart solutions and genuine improvements towards quick band-aids. Not to mention that pushing power consumption as the primary way to increase performance does not scale well. One is going to hit the limit soon, at which point all of this stuff just comes crashing down quickly.

I agree it's lazy, but it's also cheaper for Intel. Increasing efficiency would require more resources. Pentium 4 all over again.
> I believe so, yes. Of course, the 350W TDP mode is a marketing gimmick and you'd need some very special hardware to even hope to utilise it. But we also had a noticeable power creep in both laptops and desktops over the last few years.

Folks here are treating this news as if every Intel chip is going to draw 350W, and that isn't true. None of the actual mobile chips will, and only a vanishingly small number of desktop chips will. I am still not sure if accommodating extreme OCing is wasteful engineering, though.
> Well, Intel's mobile chips can't keep up with Apple Silicon, and it spurs posts like the following...

Yeah, it looks like for Intel to increase single-core performance they have to increase clock speed, which means they have to up the power draw if they cannot shrink the node the die is made on.
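To make that scaling argument concrete: dynamic power goes roughly as C·V²·f, and higher clocks also need higher voltage, so power grows much faster than performance. A minimal sketch, with an invented voltage/frequency curve that is only meant to show the shape of the effect, not any real chip's numbers:

```python
# Illustration of why frequency scaling costs disproportionate power.
# Dynamic power is roughly P ~ C * V^2 * f; the voltage needed at each
# frequency below is a made-up but typically shaped curve, not real silicon data.

freq_ghz  = [3.0, 4.0, 5.0, 5.5, 6.0]
voltage_v = [0.80, 0.95, 1.15, 1.30, 1.45]  # assumed V/f curve

base_f, base_v = freq_ghz[0], voltage_v[0]
base_power = base_v**2 * base_f   # capacitance factors out (same chip)

for f, v in zip(freq_ghz, voltage_v):
    rel_perf = f / base_f                  # performance roughly tracks frequency
    rel_power = (v**2 * f) / base_power    # power tracks V^2 * f
    print(f"{f:.1f} GHz: ~{rel_perf:.2f}x perf for ~{rel_power:.2f}x power "
          f"({rel_perf / rel_power:.2f}x perf/W)")
```

With these made-up numbers, doubling the clock costs over six times the power, which is the "does not scale well" point made earlier in the thread.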
I still remember the early 486 needing a heatsink (the 386 and below were mostly still running naked) and the DX2/DX4 needing a small fan. A 400W PSU was considered big and plenty, only needed when you planned to fill up a big tower with drives and cards.
Mobile high-end Intel chips these days are advertised as 55W TDP, but you actually need 60W+ for them to operate at the declared frequencies. Not to mention that they will often operate at much higher wattages. I mean, just check out the tests notebookcheck did on CPU efficiency: they had a 1240P record 60W in multicore Cinebench. That's over twice the nominal TDP limit, and one can't argue that it's a short-term boost limit either - that benchmark takes more than a few seconds to run. The 12700H averaged 70W - still 25W higher than the nominal TDP. And sure, one can argue that it's the laptop vendors who set the limits higher, but that only shows how widespread this practice has become. Just a couple of years ago a 45W Intel CPU running multicore CB averaged, well, 45W, as expected. But recently the TDP has become entirely meaningless, and it's not that different for CPU clocks either.
AMD Ryzen 7 6800U Efficiency Review - Zen3+ beats Intel Alder Lake (www.notebookcheck.net)
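The TDP-versus-actual-draw claim above is easy to sanity-check yourself. A minimal sketch, assuming a Linux machine with the intel_rapl powercap driver; the sysfs path and the 30-second window are assumptions, and reading the counter may require root on recent kernels:

```python
# Estimate average CPU package power over a sustained load by sampling the
# RAPL energy counter twice. Assumes Linux with the intel_rapl powercap
# driver; the exact sysfs path can vary between systems, and reading
# energy_uj may require root on recent kernels.
import time

RAPL = "/sys/class/powercap/intel-rapl:0"   # package-0 domain (assumed path)

def read_uj(name):
    with open(f"{RAPL}/{name}") as f:
        return int(f.read().strip())

max_uj = read_uj("max_energy_range_uj")

start = read_uj("energy_uj")
t0 = time.time()
time.sleep(30)                      # run the benchmark during this window
end = read_uj("energy_uj")
elapsed = time.time() - t0

delta_uj = (end - start) % max_uj   # tolerate a single counter wrap
print(f"Average package power: {delta_uj / elapsed / 1e6:.1f} W over {elapsed:.0f} s")
```

If the average over a sustained run lands well above the advertised TDP, that is the long-duration behaviour described above, not just a short boost.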
> I've read that they are going to make a mobile line with only efficiency cores. It should be interesting to see what the efficiency is like. There is still one program that I like to use from Windows, and Ventura is going to include it, so I should have all that I want on macOS when Ventura comes out. I haven't tried Ventura yet, but I'm tempted to try it out in a virtual machine on my 2014 iMac - I'm not sure if that will work, as Ventura requires AVX2 I think on Intel CPUs, and I'm not sure if those instructions are emulated in VirtualBox. I tried to run a virtual machine with Ventura on my M1 Pro MacBook Pro but it wouldn't launch.

We like to call those processors Intel Atom...
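For the AVX2 question in the quoted post, one quick check is to ask the OS which CPU features it actually sees; a small sketch, assuming an Intel Mac where sysctl exposes machdep.cpu.leaf7_features, or a Linux system or guest with /proc/cpuinfo (other environments would need a different query):

```python
# Check whether the CPU visible to this OS reports AVX2.
# Assumes an Intel Mac (sysctl machdep.cpu.leaf7_features) or a Linux
# system / Linux guest (/proc/cpuinfo); adjust for other environments.
import platform
import subprocess

def has_avx2() -> bool:
    system = platform.system()
    if system == "Darwin":
        # On Apple Silicon this sysctl key does not exist; stdout stays empty.
        out = subprocess.run(
            ["sysctl", "-n", "machdep.cpu.leaf7_features"],
            capture_output=True, text=True,
        ).stdout
        return "AVX2" in out.upper()
    if system == "Linux":
        with open("/proc/cpuinfo") as f:
            return "avx2" in f.read()
    return False

print("AVX2 reported:", has_avx2())
```

Run inside the VirtualBox guest, this shows whether the virtual CPU exposes AVX2 at all, which is the thing the quoted post is unsure about.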
> Mobile high-end Intel chips these days are advertised as 55W TDP, but you actually need 60W+ for them to operate at the declared frequencies. [...]

For whatever reason Intel has stopped including base frequency with the 12th gen CPUs, so it is hard to say what the frequency should be for the "base" power draw. They do admit to really high draw at Max Turbo though, so it isn't like they are lying. Vendors don't have a whole lot of choice in reining in the power draw due to competition (why buy the Lenovo if the Dell can boost to higher frequencies, etc.).