
GrumpyCoder

macrumors 68020
Nov 15, 2016
2,126
2,706
So power consumption is nowhere near what the Nvidia haters told us. It draws about the same as or less power than a 3090, in some cases less than a 3080, all while being more powerful. Average gaming power consumption is 346W, so less than the 3080 Ti, 3090 and 3090 Ti.

Here's my problem with it though... it seems I can't fit two 4090s in my darn case. I was set on getting two of these. So either I look for water-cooled versions and upgrade my radiators, or I settle for one 4090 or two 4080s... I really wanted 48GB, but that seems difficult now. Or wait and see how the RTX 6000 Ada does, which comes with 48GB... decisions, decisions.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
This chart is comical...
power-vsync.png
How is this calculated? It doesn’t seem accurate. I have a 3080 Ti and it draws more than 110 watts.
 
  • Like
Reactions: Irishman

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,126
2,706
How is this calculated? It doesn’t seem accurate. I have a 3080 Ti and it draws more than 110 watts.
All power consumption numbers reported on this page are "card only" values measured via the PCI Express power connector(s) and PCI Express bus slot. Everything is measured on the DC side; it's not the power consumption of the whole system. We conduct a physical measurement using professional lab equipment; the values are not software sensor readings, which are much less accurate.
  • Idle: Windows 10 sitting at the desktop (2560x1440) with all windows closed and drivers installed. The card is left to warm up in idle mode until power draw is stable.
  • Multi-monitor: Two monitors are connected to the tested card, and both use different display timings. One monitor runs 2560x1440 over DisplayPort, and the other monitor runs 1920x1080 over HDMI. The refresh rate is set to 60 Hz for both screens. Windows 10 is sitting at the desktop with all windows closed and drivers installed. The card is left to warm up in idle mode until power draw is stable. When using two identical monitors with the same timings and resolution, power consumption can be lower. When using high refresh rate monitors, power consumption can be higher than in this test.
  • Video Playback: We use VLC Media Player to watch a 4K 30 FPS video that's encoded with H.264 AVC at 64 Mbps bitrate, making it similar enough to many streaming services without adding a dependency on internet bandwidth. This codec has GPU-accelerated decoding on every modern GPU, so it not only tests GPU power management, but also efficiency of the video decoding hardware.
  • Gaming: Cyberpunk 2077 is running at 2560x1440 with Ultra settings and ray tracing disabled. We ensure the card is heated up properly, which ensures a steady-state result instead of short-term numbers that won't hold up in long-term usage.
  • Maximum: We use Furmark's Stability Test at 1920x1080, which results in very high no-game power consumption that can typically only be reached with stress-testing applications. All modern graphics cards have power limits, which are tested in this scenario. Our high-speed test equipment is able to capture power spikes that occur very quickly, before the power limiter on the graphics card can react.
  • V-Sync: If you don't need the highest framerate and want to conserve power, running at 60 FPS is a good option. In this test, we run Cyberpunk 2077 at 1920x1080, capped to 60 FPS. This test is also useful for testing a graphics card's ability to react to situations with only low power requirements. For graphics cards that can't reach 60 FPS at 1080p, we report the power draw at the highest achievable frame rate.
  • Spikes: During all previous tests, we recorded the power draw and found the highest single reading, which is reported in this "Power Spikes" test. It provides additional insight into power supply requirements because large spikes can trigger various protections on some cheaper power supplies. A symptom of this is when your PC suddenly turns off when a game is starting, or during gameplay.
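For what it's worth, the aggregation itself is simple enough to reproduce if you log readings yourself. Here's a rough Python sketch (my own illustration, not TechPowerUp's actual tooling) assuming you already have synchronized per-rail voltage/current samples for the 12V connector(s) and the slot:

```python
# Rough illustration only: aggregate "card only" power from sampled DC-side
# readings (PCIe 12V connector(s) + PCIe slot). Assumes `samples` is a list of
# dicts with per-rail volts/amps captured at a fixed rate by your own logger.

def card_power(sample):
    """Instantaneous board power in watts for one sample."""
    return sum(rail["volts"] * rail["amps"] for rail in sample["rails"])

def summarize(samples):
    powers = [card_power(s) for s in samples]
    return {
        "average_w": sum(powers) / len(powers),  # e.g. the "Gaming" figure
        "spike_w": max(powers),                  # the "Power Spikes" figure
    }

# Fake data: one 12VHPWR rail plus the slot rail, two samples.
fake = [
    {"rails": [{"volts": 12.1, "amps": 28.0}, {"volts": 12.0, "amps": 3.0}]},
    {"rails": [{"volts": 12.0, "amps": 45.0}, {"volts": 12.0, "amps": 4.5}]},
]
print(summarize(fake))
```

The point being: the "Spikes" number is just the single highest sample, so the faster the sampling equipment, the bigger the spikes you catch.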
 

mi7chy

macrumors G4
Oct 24, 2014
10,622
11,294
Don't tell Homie it's 43% faster while using 27% less power, so way off from his 900W prediction.

 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,126
2,706
To be fair you can find results where it pulls more power:
Well, the Furmark bench is still under 500W and we'll have to wait and see how realistic that is for real-world applications, as VRAM is still very limited on these cards. The new H100 cards have much more memory and are another beast, but these 40x0 cards are still "bottlenecked".

That 600W is the board limit, I doubt we'll reach that. In a similar way we could say the MBP M1 Max is much louder than the previous Intel MBPs, because when you spin the fans to the max manually, they indeed are. It's just that they never go there because it's not required.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
Well, the Furmark bench is still under 500W and we'll have to wait and see how realistic that is for real-world applications, as VRAM is still very limited on these cards. The new H100 cards have much more memory and are another beast, but these 40x0 cards are still "bottlenecked".

That 600W is the board limit, I doubt we'll reach that. In a similar way we could say the MBP M1 Max is much louder than the previous Intel MBPs, because when you spin the fans to the max manually, they indeed are. It's just that they never go there because it's not required.
GN got the card to pull 666.6 watts with a 33% PL increase using Furmark. So yeah it can gobble power. On the other hand, reducing the PL doesn't actually reduce performance as much as one would think. Der8auer has a video that shows that.
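If anyone wants to poke at the power limit themselves, NVML exposes both the current draw and the allowed limit range. A minimal pynvml sketch (assumes the nvidia-ml-py package and an NVIDIA driver; changing the limit needs admin rights, and nvidia-smi -pl does the same thing):

```python
# Minimal pynvml sketch: read board power and the allowed power-limit range.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000           # mW -> W
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000   # mW -> W
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

print(f"board power: {power_w:.0f} W, enforced limit: {limit_w:.0f} W")
print(f"limit range: {min_mw / 1000:.0f} W to {max_mw / 1000:.0f} W")

# Lowering the limit (what a "reduce PL" test does) requires admin rights:
# pynvml.nvmlDeviceSetPowerManagementLimit(handle, 350_000)  # 350 W, in mW

pynvml.nvmlShutdown()
```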

The Optimum Tech video shows some Optix scores and Apple has its work cut out for it here.


The other interesting thing is that this is the first card that actually makes native 4K gaming possible, so much so that if you don't have at least a 4K monitor you are leaving performance on the table (i.e. you end up CPU bound). With Apple shipping high-resolution displays, the idea that you can't max out game settings and still get good performance (darn the power requirements, lol) sucks.
 

Homy

macrumors 68030
Jan 14, 2006
2,507
2,459
Sweden
Don't tell Homie it's 43% faster while using 27% less power, so way off from his 900W prediction.


You are for sure a true example of your own signature ”MR is a graveyard of misinformation”. I don’t know who ”Homie” is but it is also a good custom here on MR to use quotations or source links instead of spreading disinformation about people and subjects. If you’re referring to my post (since there’s no user called ”Homie”) here are some facts you didn’t bother or ”forgot” to get right about me and the post before writing:

1. I talked specifically about 4090 Ti, not 4090 FE.
2. It wasn’t my predictions but tech media reports. I even provided a source link.
3. My post and the reports were made in Aug before the release of 4090. 4090 Ti is still unreleased.
4. I and the reports said ”Up to 900W”, referring to the max TDP. One game benchmark for a different card doesn’t represent the max TDP of the 4090 Ti.
5. The information about the 4090 Ti is now missing from my source report, but according to TechPowerUp the 4090 Ti will have a TDP of 800W.
6. My post was also only about maximum power consumption, not performance.

So in the future I suggest you spend more time on fact-checking and less on trying to be ”funny”.
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
You are for sure a true example of your own signature ”MR is a graveyard of misinformation”. I don’t know who ”Homie” is but it is also a good custom here on MR to use quotations or source links instead of spreading disinformation about people and subjects. If you’re referring to my post (since there’s no user called ”Homie”) here are some facts you didn’t bother or ”forgot” to get right about me and the post before writing:

1. I talked specifically about 4090 Ti, not 4090 FE.
2. It wasn’t my predictions but tech media reports. I even provided a source link.
3. My post and the reports were made in Aug before the release of 4090. 4090 Ti is still unreleased.
4. I and the reports said ”Up to 900W”, referring to the max TDP. One game benchmark for a different card doesn’t represent the max TDP of the 4090 Ti.
5. The information about the 4090 Ti is now missing from my source report, but according to TechPowerUp the 4090 Ti will have a TDP of 800W.
6. My post was also only about maximum power consumption, not performance.

So in the future I suggest you spend more time on fact-checking and less on trying to be ”funny”.
To be fair, it is unlikely Nvidia will do a dual 16-pin setup on the 4090 Ti FE, which means it would be stuck pulling no more than 600W from the PSU. With there not being a Kingpin model this time around, I am somewhat curious to see if AIBs will bother doing such a ridiculous model.
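For reference, the connector math behind that ceiling (standard ratings): a 16-pin 12VHPWR connector is specced for 600W and the PCIe slot for 75W, so a single-16-pin board tops out around 600 + 75 = 675W, which is roughly where that 666W Furmark number mentioned earlier lands. Each extra 8-pin would only add 150W, so going meaningfully past that really would need a second 16-pin.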
 

Homy

macrumors 68030
Jan 14, 2006
2,507
2,459
Sweden
To be fair, it is unlikely Nvidia will do a dual 16-pin setup on the 4090 Ti FE, which means it would be stuck pulling no more than 600W from the PSU. With there not being a Kingpin model this time around, I am somewhat curious to see if AIBs will bother doing such a ridiculous model.

"Unlikely" means it's unknown until the card is actually released. Until then you can't say "to be fair" about something non-factual. 🤷‍♂️
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
"Unlikely" means it's unknown until the card is actually released. Until then you can't say "to be fair" about something non-factual. 🤷‍♂️
I say it is unlikely because (so far) none of the partner models have more than one 16-pin connector. For the 3090 Ti only one did, the EVGA 3090 Ti Kingpin, and even that card only has enough 8-pins to draw 750W. Since EVGA isn't making cards anymore, I am curious to know/see if anyone steps up to take their place for extreme overclocking.
 
  • Like
Reactions: Homy

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,126
2,706
4090s are actually in stock... that's surprising. Spoke to my usual go-to dealer today; their distributor has 12 4090s in stock. He's still trying to get info on the 4080. Tempted to order two new systems with dual 4090s each and a Ryzen 7900X + 128GB RAM. Should be excellent for lightweight DL and graphics work and more than sufficient until the RTX 6000 Ada comes out, for which there doesn't seem to be a price or release date yet. I'm still torn between a single 4090 and a dual 4080 setup for my home office, as I don't want to buy another new case.
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Ah, yeah reviews across the board are basically saying it consumes less power gaming than a 3090ti while producing 40-70% more frames. EDIT: @4k.
I wanted clarification on that graph. Is it saying while gaming a 3080 Ti only takes 110 watts? If so I think mine is defective.
 

mi7chy

macrumors G4
Oct 24, 2014
10,622
11,294
I wanted clarification on that graph. Is it saying while gaming a 3080 Ti only takes 110 watts? If so I think mine is defective.

Vsync 60Hz is capped at 60 fps.

On a side note, it would be nice to see a comparison of the same dGPUs capped at 90fps and 120fps.
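The frame-time budgets explain why the caps matter so much: 60 FPS gives the GPU 1000/60 ≈ 16.7 ms per frame, 90 FPS ≈ 11.1 ms and 120 FPS ≈ 8.3 ms. If the card can finish a frame in, say, 6 ms, it sits idle (or downclocks) for the rest of each interval at 60 FPS, so raising the cap shrinks that idle window and pushes power back up toward the uncapped numbers.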
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Vsync 60Hz is capped at 60 fps.

On a side note, it would be nice to see a comparison of the same dGPUs capped at 90fps and 120fps.
Is it Minecraft at 60fps or Cyberpunk? Even at 60fps my 3080 Ti has drawn around 400 watts.
 

mi7chy

macrumors G4
Oct 24, 2014
10,622
11,294
Is it Minecraft at 60fps or Cyberpunk? Even at 60fps my 3080 Ti has drawn around 400 watts.

Answered several posts back or from the article under Power Consumption Testing Details - Vsync.

https://forums.macrumors.com/thread...aming-merged-megathread.2321333/post-31610705

Highly doubt a 3080 Ti is pulling 400W under those parameters. Show GPU Power from HWiNFO.

110W to 76W is a 31% reduction in power, so that seems about the right ballpark for going from Samsung 8nm to TSMC 4nm. Samsung 8nm is right behind TSMC N7.
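(Worked out: 76 / 110 ≈ 0.69, so the same 60 FPS workload is done for about 31% less power, i.e. roughly 31% less energy per frame.)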

wikichip_tsmc_logic_node_q2_2022-1.png
 

Ethosik

Contributor
Oct 21, 2009
8,142
7,120
Answered several posts back or from the article under Power Consumption Testing Details - Vsync.

https://forums.macrumors.com/thread...aming-merged-megathread.2321333/post-31610705

Highly doubt a 3080 Ti is pulling 400W under those parameters. Show GPU Power from HWiNFO.

110W to 76W is a 31% reduction in power, so that seems about the right ballpark for going from Samsung 8nm to TSMC 4nm. Samsung 8nm is right behind TSMC N7.

wikichip_tsmc_logic_node_q2_2022-1.png
Ah thanks! I was running 4K high settings when I tested Cyberpunk.

Also I can get it to pull around 400 watts with some heavy video editing work. My entire system has pulled 650 watts from the wall before with the GPU being around 400.
 
Last edited:

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
4090s are actually in stock... that's surprising. Spoke to my usual go-to dealer today; their distributor has 12 4090s in stock. He's still trying to get info on the 4080. Tempted to order two new systems with dual 4090s each and a Ryzen 7900X + 128GB RAM. Should be excellent for lightweight DL and graphics work and more than sufficient until the RTX 6000 Ada comes out, for which there doesn't seem to be a price or release date yet. I'm still torn between a single 4090 and a dual 4080 setup for my home office, as I don't want to buy another new case.
I wonder what folks think of the lack of nvlink for multi-gpu workloads.
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,126
2,706
I wonder what folks think of the lack of nvlink for multi-gpu workloads.
At first I thought "wtf?!?". But Nvidia specifically said they have no need for it anymore, as PCI Express 5 is fast enough to handle that on the consumer cards. They still have it on the pro cards like the H100, but that's much faster anyway, especially for the SXM form factor. But it's also a completely different price point, and these systems are usually sold with 8 GPUs each, where a single GPU goes for a little over $35k.

For games multi-GPU is probably a moot point, but I have some small to mid-sized neural nets that can be trained on desktop hardware, and 24GB of VRAM is really not much. It can work with some tweaking, but more would be better. The RTX 6000 will take care of that; the question is what gaming / 3D graphics performance will be like. Still waiting to see what benchmarks will say for DL, especially how much impact going from 6MB of L2 cache to 72MB/96MB has.
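For context, what I mean by multi-GPU training without NVLink is plain data parallelism, where only the gradient all-reduce crosses the PCIe bus each step. A minimal PyTorch DDP sketch (illustrative only; the model and sizes are made up):

```python
# Minimal PyTorch DDP sketch: data-parallel training across two local GPUs.
# Without NVLink the NCCL all-reduce simply runs over PCIe.
# Launch with: torchrun --nproc_per_node=2 train.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group("nccl")
    rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(rank)

    # Made-up small network standing in for a "small to mid-sized" net.
    model = torch.nn.Sequential(
        torch.nn.Linear(1024, 4096), torch.nn.ReLU(), torch.nn.Linear(4096, 10)
    ).cuda(rank)
    model = DDP(model, device_ids=[rank])

    opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
    loss_fn = torch.nn.CrossEntropyLoss()

    for step in range(100):
        x = torch.randn(256, 1024, device=rank)        # fake batch
        y = torch.randint(0, 10, (256,), device=rank)  # fake labels
        loss = loss_fn(model(x), y)
        opt.zero_grad()
        loss.backward()  # gradients are all-reduced across GPUs here
        opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Gradient traffic scales with parameter count, not with the VRAM used for activations, which is why PCIe 4 is usually fine for nets this size; the 24GB ceiling per card is the real limit.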
 

diamond.g

macrumors G4
Mar 20, 2007
11,438
2,665
OBX
At first I thought "wtf?!?". But Nvidia specifically said they have no need for it anymore, as PCI Express 5 is fast enough to handle that on the consumer cards. They still have it on the pro cards like the H100, but that's much faster anyway, especially for the SXM form factor. But it's also a completely different price point, and these systems are usually sold with 8 GPUs each, where a single GPU goes for a little over $35k.

For games multi-GPU is probably a moot point, but I have some small to mid-sized neural nets that can be trained on desktop hardware, and 24GB of VRAM is really not much. It can work with some tweaking, but more would be better. The RTX 6000 will take care of that; the question is what gaming / 3D graphics performance will be like. Still waiting to see what benchmarks will say for DL, especially how much impact going from 6MB of L2 cache to 72MB/96MB has.
Is that the gotcha? The current consumer 40-series cards are PCIe 4…
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,126
2,706
Is that the gotcha? The current consumer 40-series cards are PCIe 4…
They called it “limited PCIe Gen 5 support” at the conference several times. But that wording might have caused confusion.

https://www.windowscentral.com/hardware/computers-desktops/nvidia-kills-off-nvlink-on-rtx-4090
"The reason why we took [NVLink] out was because we needed the we needed the I/Os for something else, and so, so we use the I/O area to cram in as much as much AI processing as we could," Huang confirmed and explained of NVLink's absence.
"Because Ada [Lovelace] is based on PCIe Gen 5, we now have the ability to do peer-to-peer cross Gen 5 that's sufficiently fast, and that's a better trade off," Huang added.
Nvidia told TechPowerUp:
"Ada does not support PCIe Gen 5, but the Gen 5 power connector is included. PCIe Gen 4 provides plenty of bandwidth for graphics usages today, so we felt it wasn't necessary to implement Gen 5 for this generation of graphics cards."
The 3090 (at least the regular one) never supported 3-way and 4-way configurations anyway. The "limited PCIe Gen 5 support" they mention, running at PCIe 4 speeds, should be fine. How well it works… we'll see.
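If anyone wants to sanity-check whether peer-to-peer actually works between two cards once they're installed, a quick PyTorch check (purely illustrative):

```python
# Quick peer-to-peer sanity check plus a rough device-to-device copy timing.
import time
import torch

assert torch.cuda.device_count() >= 2, "needs two GPUs"
print("0 -> 1 P2P:", torch.cuda.can_device_access_peer(0, 1))
print("1 -> 0 P2P:", torch.cuda.can_device_access_peer(1, 0))

_ = torch.zeros(1, device="cuda:1")  # warm up the second GPU's context
x = torch.randn(64 * 1024 * 1024, device="cuda:0")  # ~256 MB of float32
torch.cuda.synchronize(0)
torch.cuda.synchronize(1)
t0 = time.time()
y = x.to("cuda:1")
torch.cuda.synchronize(0)
torch.cuda.synchronize(1)
gb = x.numel() * 4 / 1e9
print(f"device-to-device copy: {gb / (time.time() - t0):.1f} GB/s")
```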
 