
JouniS

macrumors 6502a
Nov 22, 2020
638
399
850w TDP?!? https://wccftech.com/nvidia-geforce-rtx-40-ada-lovelace-gpus-september-launch-upto-850w-tdp-rumor/ - Surely, they cannot be serious? This should be the wattage for an entire system, not a single GPU.
Why not? There are many workloads where the power consumption is measured in tens or hundreds of kilowatts or even in megawatts. It doesn't make much difference if the basic compute unit uses 100 W or 1000 W if the overall efficiency is the same.

Some people are gullible and can be easily misled by marketing. They may buy a 4090 because their old GPU also had number 9. Others will realize that the actual successor to the 3090 is going to be the 4070 or the 4080, and the 4090 will be a new product in a category that has not existed before.
 

Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
Even if those rumors are true, where the MP will excel is in specific tasks like video editing. While the raw performance of the GPU may not match a 4090, with custom accelerators like ProRes and extremely optimized applications like Final Cut, you may actually be able to do more on a MP than on a "technically" more powerful desktop PC. Another added benefit is the unified memory. If rumors are true, a Mac Pro would have 128GB of unified memory, which the GPU has access to. The current 3090 maxes out at 24GB, I believe.
 

macacam

macrumors member
Feb 10, 2022
49
108
This click-bait post was successful, lots of replies :). On a more serious note, this is really a non-issue in macOS land. Doesn't matter how powerful the 'RTX 4090' will be, as it is unlikely you would be able to use it in a Mac anyway. On the flipside, it also doesn't matter how much power the 'RTX 4090' would use, as with workstations, power draw is not typically a concern, just speed. The faster you can produce work, the more money you can make.

Yeah, the title is a bit baity and the OP is jumping to conclusions, but I would imagine there are other people, such as myself, waiting for the SoC's GPU portion to be equivalent to, or exceed, the power of our current devices. I'm kinda waiting for some sort of equivalent to my current card before buying a Mac, and they just aren't there yet. I'm also scratching my head at the number of comments that kinda move the conversation to power draw and sound when I've been making my daily bread with far louder devices, with little concern for their power consumption, for the last 20 years. I guess trends are trendy for a reason.
 
Last edited:

vladi

macrumors 65816
Jan 30, 2010
1,008
617
Even if those rumors are true, where the MP will excel is in specific tasks like video editing. While the raw performance of the GPU may not match a 4090, with custom accelerators like ProRes and extremely optimized applications like Final Cut, you may actually be able to do more on a MP than on a "technically" more powerful desktop PC. Another added benefit is the unified memory. If rumors are true, a Mac Pro would have 128GB of unified memory, which the GPU has access to. The current 3090 maxes out at 24GB, I believe.
If ProRes is all you use and need, then you should be fine. But if you throw in other codecs and raw video...
 

Zdigital2015

macrumors 601
Jul 14, 2015
4,143
5,622
East Coast, United States
Why not? There are many workloads where the power consumption is measured in tens or hundreds of kilowatts or even in megawatts. It doesn't make much difference if the basic compute unit uses 100 W or 1000 W if the overall efficiency is the same.

Some people are gullible and can be easily misled by marketing. They may buy a 4090 because their old GPU also had number 9. Others will realize that the actual successor to the 3090 is going to be the 4070 or the 4080, and the 4090 will be a new product in a category that has not existed before.
If NVIDIA introduces a 4090, it’s the successor to the 3090, just as a 4070 would be the successor to the 3070 and so on.

Unless you truly need the horsepower for your workflow (not gaming), most users want reasonable performance at lower wattage. If both Intel and NVIDIA are simply amping up their power consumption to increase their performance, then they have lost the plot; Intel did a long time ago. At some point, electricity is going to get more expensive, and a 250W TDP CPU and an 850W TDP GPU will be tossed out as the dinosaurs they are. Performance per watt is where it’s at, even with desktops. A kilowatt PSU for a desktop system is ridiculous and a huge step backwards. It’s not innovation, it’s just lazy-ass engineering by both Intel and NVIDIA.
 

Zdigital2015

macrumors 601
Jul 14, 2015
4,143
5,622
East Coast, United States
Yeah, the title is a bit baity and the OP is jumping to conclusions, but I would imagine there are other people, such as myself, waiting for the SoC's GPU portion to be equivalent to, or exceed, the power of our current devices. I'm kinda waiting for some sort of equivalent to my current card before buying a Mac, and they just aren't there yet. I'm also scratching my head at the number of comments that kinda move the conversation to power draw and sound when I've been making my daily bread with far louder devices, with little concern for their power consumption, for the last 20 years. I guess trends are trendy for a reason.
Trendy? Apple’s been pushing performance per watt since 2008 and the original MacBook Air. Intel has regressed, and now NVIDIA is heading the same way.
 

macacam

macrumors member
Feb 10, 2022
49
108
Trendy? Apple’s been pushing performance per watt since 2008 and the original MacBook Air. Intel has regressed, and now NVIDIA is heading the same way.
Yeah. It's never really been much of a consideration when purchasing computers and I don't really hear anybody talk about it outside of this forum. Most people just care about speed and getting things done quickly. I can't think of a single time I've been asked how much power a project has consumed now that I think about it. I have been asked how long things will take though.
 
Last edited:

JouniS

macrumors 6502a
Nov 22, 2020
638
399
If NVIDIA introduces a 4090, it’s the successor to the 3090, just as a 4070 would be the successor to the 3070 and so on.
This is backwards thinking. 3090 and 4090 are marketing names. Their purpose is to influence consumer behavior to benefit the vendor. If you want to make informed decisions as a consumer, you have to look at the products behind the names.

Buying a GPU involves making a trade-off between price, power, and performance. Based on what we know, the 4070 or the 4080 will probably offer the most similar trade-off to the 3090. The 4090 will be a more expensive product using a bigger chip with many more CUDA cores, offering significantly more performance while using significantly more power. This is no different from lower-end products, where it's well known that sometimes the true successor to the *70 in one generation is the *60 in the next generation, and sometimes the other way around.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
850w TDP?!? https://wccftech.com/nvidia-geforce-rtx-40-ada-lovelace-gpus-september-launch-upto-850w-tdp-rumor/ - Surely, they cannot be serious? This should be the wattage for an entire system, not a single GPU.

I have a heavy impact hammer drill here that punches holes in reinforced concrete like it’s nothing, and it consumes less power.


Why not? There are many workloads where the power consumption is measured in tens or hundreds of kilowatts or even in megawatts. It doesn't make much difference if the basic compute unit uses 100 W or 1000 W if the overall efficiency is the same.

For a home/”normal” pro user? Come on, we are not talking about supercomputer equipment here. These are gaming GPUs. You are not buying a semi to go grocery shopping, even though they are more “efficient” overall.
 
  • Like
Reactions: Zdigital2015

JouniS

macrumors 6502a
Nov 22, 2020
638
399
For a home/”normal” pro user? Come on, we are not talking about supercomputer equipment here. These are gaming GPUs. You are not buying a semi to go grocery shopping, even though they are more “efficient” overall.
Consumer GPUs are used in many professional applications, including some large-scale ones, because they are more cost-effective than data center GPUs. They are also used for crypto mining for the same reason. And because they share the fundamental architecture, large data center GPUs can be easily transformed into consumer GPUs if Nvidia believes someone is willing to buy them.

Gaming used to be a middle-class hobby, where the average person could afford buying top-tier gear with sufficient dedication. That is changing. If enough people want to spend $20k or $50k on a gaming PC, manufacturers will start offering something for that money.
 

leman

macrumors Core
Oct 14, 2008
19,521
19,677
Consumer GPUs are used in many professional applications, including some large-scale ones, because they are more cost-effective than data center GPUs. They are also used for crypto mining for the same reason. And because they share the fundamental architecture, large data center GPUs can be easily transformed into consumer GPUs if Nvidia believes someone is willing to buy them.

Gaming used to be a middle-class hobby, where the average person could afford buying top-tier gear with sufficient dedication. That is changing. If enough people want to spend $20k or $50k on a gaming PC, manufacturers will start offering something for that money.

None of this is a justification to offer consumer computer hardware with those levels of power consumption. This is a backwards technological development and the exact opposite of "doing things smart."
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
None of this is a justification to offer consumer computer hardware with those levels of power consumption. This is a backwards technological development and the exact opposite of "doing things smart."
They are redefining the high end of consumer devices. If superyachts are consumer boats and private jets are consumer planes, there may also be a market for bigger and more expensive gaming PCs.
 

Bodhitree

macrumors 68020
Apr 5, 2021
2,085
2,216
Netherlands
There will be some games that will never be satisfied with however much CPU and GPU power you throw at them.

But for the majority of games, should we care? You can make great games on a Switch, or on a Playstation 4, or on a Wii. It’s the same as with movies, where with great cinematography you can make every frame a painting. A game that includes great art is always going to be remembered, and a game that is a great game will stand out.
 

PsykX

macrumors 68030
Sep 16, 2006
2,745
3,922
3X more powerful in a single generation would be quite a feat if it turns out to be true.

Apple Silicon's GPUs so far have been awesome in iPhones, but "okay" in computers.

Let's picture this to give us a better idea. The M1's 8-core GPU is about the same performance as the original PS4, and the M1 Max's 32-core GPU is about as powerful as the PS5.

I believe the Mac Pro will have a quad M1 Max, which would result in a 128-core GPU, roughly 4X as powerful as the PS5 (assuming linear scaling). Let's picture this again: theoretically, you'd be able to play PS5 games in either 8K, or at 240fps without ray-tracing, or at 120fps with ray-tracing. Which is gorgeous, but then again, it has to be able to compete against NVIDIA's offering.

(Disclaimer : I know this is just chit-chat here, because obviously Macs don't run PS5 games)
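As a rough sanity check of the scaling above, here's some napkin math. The peak FP32 figures are approximate public numbers, and linear scaling across four dies is optimistic, so take it for what it is:

```python
# Napkin math for the "quad M1 Max ~ 4x PS5" idea above.
# Peak FP32 figures are approximate public numbers; real games
# will not scale linearly across four GPU dies.
PS5_TFLOPS = 10.3             # Sony's quoted peak FP32
M1_MAX_32_CORE_TFLOPS = 10.4  # Apple's quoted peak FP32, 32-core GPU

quad_tflops = 4 * M1_MAX_32_CORE_TFLOPS
print(f"Quad M1 Max (128 GPU cores): ~{quad_tflops:.1f} TFLOPS")
print(f"Relative to PS5:             ~{quad_tflops / PS5_TFLOPS:.1f}x")
# Prints roughly 41.6 TFLOPS, i.e. about 4x the PS5 on paper.
```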
 
Last edited:

Niceisnice

macrumors newbie
May 26, 2010
10
10
Clickbait headlines that indicate "In Trouble" or "Mac Killer" are so annoying. Tom's Hardware is infamous for them IMHO. Saying an unreleased product is in trouble due to an unreleased product that will be expensive, power-hungry, and in great demand does this forum no justice.
Power-hungry is not even the word for it! Nvidia's unreleased product should win the global warmer contest hands-down:

"Nvidia's next-gen graphics cards, will be out in September 2022, and will have a TGP up to 850W."
 
  • Like
Reactions: Zdigital2015

cp1160

macrumors regular
Feb 20, 2007
150
136
850w TDP?!? https://wccftech.com/nvidia-geforce-rtx-40-ada-lovelace-gpus-september-launch-upto-850w-tdp-rumor/ - Surely, they cannot be serious? This should be the wattage for an entire system, not a single GPU. Of course it will be faster; when you start pushing that much power through the GPU, the frequencies go up and so does the power. This sounds like NVIDIA is copying Intel's modus operandi of simply upping the power envelope and frequencies; if so, their stock price is simply not sustainable, as it sounds like they are done innovating.
Power consumption at those levels is pushing the envelope of power distribution, motherboards, power supply connectors, and slots. It is all still rumor, but eye-opening. I have not bought a powerful new power supply recently, but the newest PCIe Gen 5 supplies support up to 600W of power input per connector using the new 16-pin connectors. Without a 16-pin connector, two or three 8-pin connectors may be required.

This news from Gigabyte: "The UD1000GM PCI-E 5.0 power supply supports the PCIe Gen 5.0 graphics cards and it is capable of delivering the increasing power that the high-end graphics card demand. Traditional power supplies need three 8-pin to 16-pin adapters to support the latest PCIe Gen 5.0 graphics cards. The new UD1000GM PCI-E 5.0 power supply needs only a single 16-pin cable to directly supply power to the PCIe Gen 5.0 graphics cards. Moreover, the PCIe Gen 5.0 16-pin cable provides up to 600 watts of power to the graphics card."

Eye-opening. 1000-watt power supplies are not uncommon, but the combined power draw of a 750-watt graphics card and a high-end 11th Gen Intel Rocket Lake CPU, which can pull 460 watts, means a bigger supply or a complex multi-supply setup is needed. A Corsair 1600-watt power supply runs $700-$725 today.

Heat output will be another factor. I cannot imagine the cooling needs of this card at full draw, particularly combined with the cooling needed for a Rocket Lake CPU and the massive power supply supporting such a high-end system.

Power costs for a system running this configuration, based on the system cost calculator my company used, would be $1,100 to $2,700 per year depending on local power costs. The global impact is like burning 100 to 190 gallons of gasoline for a year. Hefty environmental price.
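For a rough idea of where a number like that comes from, here is some napkin math; the duty cycle and $/kWh rates below are my own assumptions, not figures from that calculator:

```python
# Back-of-the-envelope annual electricity cost for a hypothetical
# 750W GPU + 460W CPU workstation. Duty cycle and rates are assumptions.
system_draw_w = 750 + 460            # GPU + CPU at full load, in watts
hours_per_day = 24                   # assume an always-on render box
kwh_per_year = system_draw_w / 1000 * hours_per_day * 365

for rate in (0.12, 0.25):            # $/kWh, cheap vs. expensive markets
    print(f"${rate:.2f}/kWh -> ~${kwh_per_year * rate:,.0f} per year")
# ~10,600 kWh/year, i.e. roughly $1,270-$2,650 depending on local rates,
# in the same ballpark as the $1,100-$2,700 figure above.
```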

Whoa comes to mind.
 
Last edited:

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
What would be the consumption of Apple GPUs if they had the same performance and versatility as Nvidia GPUs?
 

coolguy4747

macrumors regular
Jun 26, 2010
233
269
I can't think of a single time I've been asked how much power a project has consumed now that I think about it.
What about when the electrical bill comes? Not saying it has been a consideration historically, but it sure would be nice to spend less on electricity to get the same job done.
 

cp1160

macrumors regular
Feb 20, 2007
150
136
Yeah, the title is a bit baity and the OP is jumping to conclusions, but I would imagine there are other people, such as myself, waiting for the SoC's GPU portion to be equivalent to, or exceed, the power of our current devices. I'm kinda waiting for some sort of equivalent to my current card before buying a Mac, and they just aren't there yet. I'm also scratching my head at the number of comments that kinda move the conversation to power draw and sound when I've been making my daily bread with far louder devices, with little concern for their power consumption, for the last 20 years. I guess trends are trendy for a reason.
I'm not challenging you. What do you do on your current rig that requires an equivalent GPU? Gaming on a Mac is not a goal, so is it video? Photography? What is being harnessed that is important? My M1 Pro Max blows through video production like nothing else I have used.
 

cp1160

macrumors regular
Feb 20, 2007
150
136
What would be the consumption of Apple GPUs if they had the same performance and versatility as Nvidia GPUs?
Tough to conjecture fully. The site NotebookCheck (see link) shows the overall graphics performance of the Nvidia GeForce RTX 3090 as 162% of the M1 Max. So assuming things scale, two M1 Maxes would beat the Nvidia. The 3090 is rated at 350 watts (and can pull up to 425 in some cases), while the power consumption of 2 M1 Max chips would be 2 x 160 watts (32-core GPU M1 Max), or 320 watts (actual usage much less). The Nvidia would be drawing 350 watts and the CPU likely another 140 to 180 watts, for a total draw of 530 watts. More raw compute at significantly lower power draw. 650-watt draws are not uncommon for these systems.

https://www.notebookcheck.net/M1-Ma...o-14-Core-GPU_10970_10485_10965.247598.0.html
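Spelling that arithmetic out with the figures as stated above (rough board/package numbers, not measurements):

```python
# The comparison above, spelled out. All wattages are the rough
# figures quoted in this post, not measured values.
rtx_3090_w = 350        # typical board power; can spike toward 425W
intel_cpu_w = 180       # upper end of the 140-180W estimate
pc_total_w = rtx_3090_w + intel_cpu_w

m1_max_w = 160          # per-chip figure used above (questioned below)
dual_m1_max_w = 2 * m1_max_w

print(f"RTX 3090 + CPU: ~{pc_total_w} W")     # ~530 W
print(f"2x M1 Max:      ~{dual_m1_max_w} W")  # ~320 W
print(f"Difference:     ~{pc_total_w - dual_m1_max_w} W in the Macs' favor")
# If the per-chip figure is closer to 90W, as argued a few posts down,
# the gap only widens.
```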
 
  • Like
Reactions: Xiao_Xi

januarydrive7

macrumors 6502a
Oct 23, 2020
537
578
Most users never think of wattage, or even know what it is for that matter. Performance is all they think about. Only Mac silicon users seem to care.
This is disingenuous -- most users do indirectly think of wattage, in the sense that most users think about battery life.
 

Krevnik

macrumors 601
Sep 8, 2003
4,101
1,312
They are redefining the high end of consumer devices. If superyachts are consumer boats and private jets are consumer planes, there may also be a market for bigger and more expensive gaming PCs.

They are going to have to start redefining the home wiring code if they keep going like this.

I'm not sure I fancy the idea of having to consider a dedicated 20A circuit for my workstation in the future.

the power consumption of 2 M1 Max chips would be 2 x 160 watts (32-core GPU M1 Max), or 320 watts (actual usage much less).

160W... where did you get that figure? NotebookCheck states 104W for the M1 Max, but that's very likely "from the wall", not the package power. The package power maxes out around 85-90W.

What about when the electrical bill comes? Not saying it has been a consideration historically, but it sure would be nice to spend less on electricity to get the same job done.

This is part of it for sure. One thing to think about is that the GTX 1080 was a sub-200W card. To jump up to 850W in three generations is not a small difference. That's going from "~3 incandescent bulbs" to "~14 incandescent bulbs".

Now I'm curious what happens if you graph CPU/GPU power draw over the last 10 years, since I think you'd see a bit of a spike in the last 5 years or so. Meaning that yeah, folks saying they generally haven't cared about power consumption aren't wrong, but folks worrying about the increase in power consumption aren't worrying about nothing either.
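As a quick illustration of that jump in "light bulb" terms (board-power figures are approximate, and the 850W entry is still only a rumor):

```python
# Flagship GPU board power expressed in 60W incandescent bulbs.
# Figures are approximate; the 850W entry is the rumored ceiling.
BULB_W = 60
cards = {
    "GTX 1080 (2016)":         180,
    "RTX 2080 (2018)":         215,
    "RTX 3090 (2020)":         350,
    "RTX 40-series (rumored)": 850,
}
for name, watts in cards.items():
    print(f"{name:<26} ~{watts:>3} W  ~ {round(watts / BULB_W)} bulbs")
```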
 
Last edited: