macacam

macrumors member
Feb 10, 2022
49
108
I'm not challenging you. What do you do on your current rig that the GPU must be the equivalent of? Gaming on a Mac is not a goal, so is it video? Photography? What is being harnessed that is important? My M1 Pro Max blows through video production like nothing else I have used.
I work with a bunch of clothing companies. Basic simulation stuff. Nothing too fancy but it does require a decent amount of oomph in the ol' vidya card.
What about when the electrical bill comes? Not saying it has been a consideration historically, but it sure would be nice to spend less on electricity to get the same job done.
That would be pretty sweet. But when the OP is talking about performance numbers and purely the card's ability to perform, it's really not a good argument to switch the topic to perf/watt. Just give the guy what he wants to hear and move on. Yeah, Nvidia makes some great cards and they're really powerful...but that's not the market Apple is currently targeting, and we still have yet to see what their upper-tier offerings are. That's pretty much the end of the story. If somebody needs the Nvidia card, he can go get it and use it and that's fine. If he finds value in the future Mac Pro, hey, that's great too.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
This is disingenuous -- most users do indirectly think of wattage, in the sense that most users think about battery life.
I disagree. That's not my experience. They understand when the battery runs out and why it might need replacing, but that's about it. Most people I know always have it plugged in while working, and they certainly don't have a clue about battery wattage, let alone what a watt is.
 

Zdigital2015

macrumors 601
Jul 14, 2015
4,143
5,622
East Coast, United States
Unless time is more important, then doing things faster is king. (and smart)
The trend towards using more and more power (electricity) and requiring more and more cooling, depending on your office space, is fine as long as cheap power exists OR the user doesn't pay the power and cooling bill. End users should be concerned about how much power their rig is using, as this is ridiculous for the 80% of users who really do not need that sort of horsepower. We seem to be locked in an arms race now that consists of third-rate engineering by both Intel and NVIDIA simply applying the "more must be better" principle to their newest products. I get the need to get work done, time is money, and so on, but there is a point where we're going backwards. It seems we have arrived at that point.
 

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
The most PRO feature of the new MBP: the sound of silence under Metal rendering using both GPU and CPU.

850 W: earplugs included?
 
  • Like
Reactions: Zdigital2015

guzhogi

macrumors 68040
Aug 31, 2003
3,772
1,891
Wherever my feet take me…
850 watts seems excessive. If I remember correctly, regular US home electrical outlets are 125 volts and circuits are 15 amps, so 125 volts * 15 amps = 1,875 watts. And if someone's going to get a 4090, I don't see them skimping on other components. So the 4090 + motherboard + CPU + RAM + monitor + whatever other components will take a good chunk of each electrical circuit. Some circuits are 20 amps, which gives a bit more breathing room, but even that's finite. I'd really like to see Nvidia and other companies like Intel try to keep the same performance while lowering the power draw.
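To put rough numbers on how much of a 15 A or 20 A circuit such a system would occupy, here is a quick sketch; the non-GPU wattages below are illustrative guesses, not measurements:

```python
# Rough headroom check for a single North American branch circuit.
# All component figures except the rumored GPU number are illustrative guesses.

CIRCUIT_VOLTS = 125
BREAKER_AMPS = {"15 A circuit": 15, "20 A circuit": 20}

system_draw_w = {
    "GPU (rumored)": 850,                    # the rumored 850 W figure discussed above
    "CPU": 250,                              # assumed high-end desktop CPU under load
    "Motherboard, RAM, storage, fans": 100,  # assumed
    "Monitor": 50,                           # assumed
}

total_w = sum(system_draw_w.values())
for name, amps in BREAKER_AMPS.items():
    capacity_w = CIRCUIT_VOLTS * amps
    print(f"{name}: {capacity_w} W capacity, system ~{total_w} W "
          f"-> {total_w / capacity_w:.0%} of the circuit")
```

With those assumptions the box alone takes roughly two-thirds of a 15 A circuit before anything else is plugged into the same run.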
 

guzhogi

macrumors 68040
Aug 31, 2003
3,772
1,891
Wherever my feet take me…
Why not? There are many workloads where the power consumption is measured in tens or hundreds of kilowatts or even in megawatts. It doesn't make much difference if the basic compute unit uses 100 W or 1000 W if the overall efficiency is the same.

Some people are gullible and can be easily misled by marketing. They may buy a 4090 because their old GPU also had number 9. Others will realize that the actual successor to the 3090 is going to be the 4070 or the 4080, and the 4090 will be a new product in a category that has not existed before.
While you have a point, not every person/company has access to kilowatts/megawatts of power. If they do, good for them. Even then, as I said in a previous comment, electrical circuits have a finite amount of wattage they can handle. Companies will need to find ways to either deliver more power or design products that need less power.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
They are going to have to start redefining the home wiring code if they keep going like this.

I'm not sure I fancy the idea of having to consider a dedicated 20A circuit for my workstation in the future.
That's mostly an issue for those living in low-voltage countries. Most people living in a house with decent wiring can use 2 kW appliances without too much thinking. They can take ~3.6 kW from any socket as long as they ensure there is nothing else behind the same fuse.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
The trend towards using more and more power (electricity) and requiring more and more cooling, depending on your office space, is fine as long as cheap power exists OR the user doesn't pay the power and cooling bill. End users should be concerned about how much power their rig is using, as this is ridiculous for the 80% of users who really do not need that sort of horsepower. We seem to be locked in an arms race now that consists of third-rate engineering by both Intel and NVIDIA simply applying the "more must be better" principle to their newest products. I get the need to get work done, time is money, and so on, but there is a point where we're going backwards. It seems we have arrived at that point.
Unless you're a gamer or a miner, you probably wouldn't pay that much for a video card -- I know I wouldn't, so that level of power is kind of moot. And I don't mind miners having to pay more for electricity; maybe they'll stop.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,699
The most PRO feature of the new MBP: the sound of silence under Metal rendering using both GPU and CPU.

850 W: earplugs included?
We have a server with redundant 875 W power supplies -- yes, you can hear it, but it's not that loud unless it's just booting; then all the fans go full speed for about 30 seconds. (There's no video card in it at all; it's mainly for the POWER9 processor.)
 

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
While you have a point, not every person/company has access to kilowatts/megawatts of power. If they do, good for them. Even then, as I said in a previous comment, electrical circuits have a finite amount of wattage they can handle. Companies will need to find ways to either deliver more power or design products that need less power.
Furthermore, when you are in the MW range, you will care about the price of electricity. Sourcing electricity, the energy budget, cooling, etc. are very important factors when building supercomputers or data centres. Halving the electricity draw and the corresponding heat for the same compute task will be transformative. I can't see why workstations should not adhere to similar thinking. Perf/watt will win in the end as long as Apple don't price themselves out of the market.

I think it is time the IT industry actually put labels on devices showing their environmental impact based on power draw. We have such labels on television sets, refrigerators, etc. where I live.
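To put a rough number on the MW-range point (the electricity price here is purely an assumed figure; real rates vary widely):

```python
# Annual electricity cost of a sustained load, at an assumed price per kWh.
def annual_cost_usd(load_kw: float, price_per_kwh: float = 0.10) -> float:
    hours_per_year = 24 * 365
    return load_kw * hours_per_year * price_per_kwh

one_mw = annual_cost_usd(1000)  # a 1 MW sustained draw
print(f"1 MW sustained: ~${one_mw:,.0f}/year")              # ~$876,000
print(f"Halving the draw saves: ~${one_mw / 2:,.0f}/year")  # ~$438,000, before cooling
```

At that scale, halving the power for the same compute saves hundreds of thousands per year before you even count the reduced cooling load.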
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
I can't see why workstations should not adhere to similar thinking. Perf/watt will win in the end as long as Apple don't price themselves out of the market.
Because there is usually only one workstation per user. In such situations, the biggest expense by far is the user. The second biggest expense tends to be the space the workstation is located in, and the workstation itself comes after that. Power is cheap enough that even if you use the workstation at full load 24/7, the workstation is going to be obsolete before power costs reach the purchase price.
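A rough sanity check of that claim, with assumed figures (both the electricity price and the workstation price below are illustrative, not sourced):

```python
# Years of 24/7 full-load use before electricity spend matches the purchase price.
# The load, price per kWh, and purchase price are illustrative assumptions.

load_w = 850              # full-load draw assumed for the example
price_per_kwh = 0.15      # assumed; varies a lot by region
purchase_price = 6000     # assumed workstation cost

annual_kwh = load_w / 1000 * 24 * 365      # ~7,450 kWh
annual_cost = annual_kwh * price_per_kwh   # ~$1,120
print(f"~${annual_cost:,.0f}/year in electricity")
print(f"~{purchase_price / annual_cost:.1f} years to reach the purchase price")
```

Under those assumptions it takes on the order of five years of nonstop full load before electricity catches up with the hardware cost, and few workstations run flat out around the clock.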

Things are different in data centers, where multiple racks full of computers may serve a single user.
 
  • Like
Reactions: bobcomer

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
Because there is usually only one workstation per user. In such situations, the biggest expense by far is the user. The second biggest expense tends to be the space the workstation is located in, and the workstation itself comes after that. Power is cheap enough that even if you use the workstation at full load 24/7, the workstation is going to be obsolete before power costs reach the purchase price.

Things are different in data centers, where multiple racks full of computers may serve a single user.
I see your point, but 850 W hits a practical ceiling. How many of these could you add to a system before the PSU, and apparently the electrical circuit, becomes the limit? One, actually, so it would scale terribly. So where will NVIDIA go from here? I think they are painting themselves into an electrical, if not thermal, corner unless they change approach.
 
  • Like
Reactions: Zdigital2015

JouniS

macrumors 6502a
Nov 22, 2020
638
399
I see your point, but 850 W hits a practical ceiling. How many of these could you add to a system before the PSU, and apparently the electrical circuit, becomes the limit? One, actually, so it would scale terribly. So where will NVIDIA go from here? I think they are painting themselves into an electrical, if not thermal, corner unless they change approach.
It would effectively be two previous-generation GPUs on the same chip, both faster and more power-efficient, with all the benefits you get from tighter integration. If you can cool the chip, cooling the system should not be a problem.
 

guzhogi

macrumors 68040
Aug 31, 2003
3,772
1,891
Wherever my feet take me…
Furthermore, when you are in the MW range, you will care about the price of electricity. Sourcing electricity, the energy budget, cooling, etc. are very important factors when building supercomputers or data centres. Halving the electricity draw and the corresponding heat for the same compute task will be transformative. I can't see why workstations should not adhere to similar thinking. Perf/watt will win in the end as long as Apple don't price themselves out of the market.

I work IT in a school district. The main server/networking closet in my school is maybe 8' x 15'. When I first started working, we had a bunch of Xserves in there to handle DHCP, Open Directory, imaging, etc. One year, one of the servers kept failing every few weeks or so. I contacted my boss, and it turned out it was overheating. We eventually got an air conditioning unit in there, which prevented it from overheating and failing again. The district eventually consolidated a lot of the server stuff to either one of the middle schools or to cloud-based systems. But damn, a couple of servers really push out quite a bit of heat.
 

Sydde

macrumors 68030
Aug 17, 2009
2,563
7,061
IOKWARDI
If Apple is going to release a high-end desktop Mac, why would we assume that, for the likely price, they would not include card slots? They are most likely not going to want to provide much support for the person who wants to stick in a ~kW GPU card, but if that is what the customer craves, they will have that option. So it seems unlikely on the face of it that the 4090 spells trouble for a Mac Pro.
 
  • Like
Reactions: bobcomer

oz_rkie

macrumors regular
Apr 16, 2021
177
165
It's pretty interesting to read the Apple vs non-Apple comparisons in this thread. I use devices from Apple/Intel/AMD/Nvidia, and the products offered by these companies are, for the most part, for very different use cases.

  1. Firstly, there is no way that any of the upcoming Nvidia 4xxx-series cards will have an 850 W TDP. There is no scenario where Nvidia will release a single-GPU consumer card that consumes that much power. I am surprised that people are even entertaining this rumor. The flagship TDP of the next-gen Nvidia cards will be in the same ballpark as the current ones, and any increases will be minor. Current custom high-end flagships, i.e. 3080 Ti/3090 cards, can consume around 400-450 W of power, and it is likely that the next gen will be in this ballpark, maybe pushing toward the 500 W mark. Why? Simply because we are already hitting the limits of cooler sizes that can reasonably fit into cases. An 850 W TDP card would require exotic cooling.
  2. Currently yes, Apple is designing the most efficient CPU/GPU per watt. I am pretty sure no one can deny this, and it's great that Apple is doing this. It will definitely push the industry toward making more efficient parts. As a minor side note, people also seem to slightly ignore the fact that Apple does charge a fair premium on their devices. So, even if you are making the argument that less power consumption = lower electricity bills etc., you do end up paying more upfront to buy into this more efficient machine. (I think this argument is a weak one anyway, since for most serious work purposes in most countries, electricity bills are not high enough that a single computer drawing more or less wattage will make enough of a difference in your bill to matter.)
  3. Is efficiency all that you should care about? It depends on your use case. If you are a light user and want a laptop that is super portable and has a long battery life then yes, you should care about efficiency and this is where Apple is crushing it at the moment.
  4. On the other hand though, if you are someone who is doing work that requires the extra horsepower and you are getting returns from it (i.e. faster compute = more work done in less time = more revenue generated etc.), do you care about your machine using a few hundred watts more? No, absolutely not. As an example, if you are using your computer to make a living and you can get a machine that draws 100 W more from the wall but does even 20-30% more work, in 9 out of 10 scenarios you would take that trade.

Bottom line is, yes, it's great that Apple is pushing for efficiency, and I hope that Intel/Nvidia/AMD will also start pushing more in this efficiency direction. But having said that, buy the tool that does the job you want it to do. If you want a super efficient machine that lasts a long while on battery, then great, buy accordingly. If you want a machine that can absolutely max out performance but does so at a higher power cost, and the returns you get from it outweigh this, then also great, buy accordingly.
 

Jorbanead

macrumors 65816
Aug 31, 2018
1,209
1,438
  1. Is efficiency all that you should care about? It depends on your use case. If you are a light user and want a laptop that is super portable and has a long battery life then yes, you should care about efficiency and this is where Apple is crushing it at the moment.
I think efficiency is something everyone should care about. You even said it yourself: we are reaching the limits of cooler sizes. Having a more efficient chip means you can cram more performance into the same enclosure under the same wattage and cooling constraints. Every single computer is constrained by thermals, which is directly related to chip efficiency. Apple is playing the long game here, and no matter how you look at it, to me it seems efficiency is going to win.

GTX 980 - 165W
GTX 1080 - 180W
RTX 2080 - 215W
RTX 3080 - 320W

There is a trend here. Intel has done the same thing: throw more power at the chip. The issue, like you said, is that we are reaching the limits of thermal solutions (without going to extreme measures).
 
  • Like
Reactions: Basic75

oz_rkie

macrumors regular
Apr 16, 2021
177
165
I think efficiency is something everyone should care about. You even said it yourself: we are reaching the limits of cooler sizes. Having a more efficient chip means you can cram more performance into the same enclosure under the same wattage and cooling constraints. Every single computer is constrained by thermals, which is directly related to chip efficiency. Apple is playing the long game here, and no matter how you look at it, to me it seems efficiency is going to win.

GTX 980 - 165W
GTX 1080 - 180W
RTX 2080 - 215W
RTX 3080 - 320W

There is a trend here. Intel has done the same thing: throw more power at the chip. The issue, like you said, is that we are reaching the limits of thermal solutions (without going to extreme measures).

Yeah, I absolutely agree with you. Efficiency should be on everyone's priority list, and there is no doubt that Apple is winning the efficiency battle right now (by a non-trivial margin, too).

But at the same time, I am just outlining the reality of how things are. If you are using your computer for getting actual work done, and say you have machine 1 that can do 1 unit of work while consuming 100 W from the wall and machine 2 that can do 1.25 units of work while consuming 175 W from the wall, machine 2 is clearly less efficient than machine 1, which is a shame, but you would still get machine 2 (assuming how much work you can get done is your main criterion).
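To make that concrete, here is the same example expressed as performance per watt; the only numbers used are the ones from the paragraph above:

```python
# The two hypothetical machines from the example above.
machines = {
    "machine 1": {"work_units": 1.00, "watts": 100},
    "machine 2": {"work_units": 1.25, "watts": 175},
}

for name, m in machines.items():
    print(f"{name}: {m['work_units']} units at {m['watts']} W "
          f"-> {m['work_units'] / m['watts']:.4f} units/W")

# machine 1 wins on units/W (0.0100 vs ~0.0071),
# but machine 2 still finishes 25% more work in the same time.
```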

And I am not saying which is machine 1 and which is machine 2, because that will depend on the use case. In certain workflows Apple machines handily beat Alder Lake machines while being more efficient, while in other cases the Alder Lake machines will handily beat Apple machines, albeit at the cost of being comparatively power-inefficient.

I am simply saying that while efficiency is awesome and we would all definitely want our chips to be more efficient, there will always be chips that are more or less efficient than the competition. In real-world terms, you are mainly looking at two performance metrics: performance per watt and maximum absolute performance. Depending on your situation and what you use your device for, you would favor either the former or the latter.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
GTX 980 - 165W
GTX 1080 - 180W
RTX 2080 - 215W
RTX 3080 - 320W
Nvidia GPUs have increased their efficiency. Nvidia has increased the performance of its GPUs much more than their power consumption.

For example, Blender rendering time (BMW scene) has plummeted since OPTIX (Nvidia's ray-tracing API) was introduced.
GTX 980 - 100s (CUDA)
GTX 1080 - 83s (CUDA)
RTX 2080 - 30s (OPTIX)
RTX 3080 - 12s (OPTIX)



Apple has spoiled us because it can decrease power consumption and increase performance at the same time. But, that doesn't mean that other companies are not improving in efficiency.
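One rough way to combine the TDP list and the render times above is energy per render (TDP × render time). TDP is only an upper bound on actual draw, so treat these as ballpark figures:

```python
# Approximate energy per BMW render, using TDP as a stand-in for actual draw.
cards = {
    "GTX 980":  {"tdp_w": 165, "render_s": 100},
    "GTX 1080": {"tdp_w": 180, "render_s": 83},
    "RTX 2080": {"tdp_w": 215, "render_s": 30},
    "RTX 3080": {"tdp_w": 320, "render_s": 12},
}

for name, c in cards.items():
    wh_per_render = c["tdp_w"] * c["render_s"] / 3600  # watt-hours per render
    print(f"{name}: ~{wh_per_render:.1f} Wh per render")

# Falls from ~4.6 Wh (GTX 980) to ~1.1 Wh (RTX 3080), despite the higher TDP.
```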
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
GTX 980 - 165W
GTX 1080 - 180W
RTX 2080 - 215W
RTX 3080 - 320W

There is a trend here. Intel has done the same thing: throw more power at the chip. The issue, like you said, is that we are reaching the limits of thermal solutions (without going to extreme measures).
You are looking at short-term trends. The situation is different if you look further in the past:
  • GTX 280: 236 W (top model GTX 295: 289 W)
  • GTX 480: 250 W (already the top model)
  • GTX 580: 244 W (top model GTX 590: 365 W)
  • GTX 680: 195 W (top model GTX 690: 300 W)
  • GTX 780: 230 W (top model TITAN Z: 375 W)
Some GPU generations were constrained by power and cooling. Others were constrained by price and yields, and they ended up using less power. Some generations probably had other constraints.

The process improvements in the last few years have meant you can now pack an enormous amount of computing power in a small die. Apple chose not to do it, spending most of the die area on various controllers and special-purpose modules. It sounds like Nvidia is trying to take advantage of the entire die area, believing that they can somehow cool the chip.
 
  • Like
Reactions: JMacHack

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,628
1,101
According to MLID, the upcoming top Nvidia GPUs will consume around 400 W, with burst peaks of 500 W, while almost doubling performance.

Nvidia's next GPUs should make a big leap in performance, as they will triple the transistor density with TSMC 5nm. Let's see what Nvidia can do with a node closer to the one Apple uses.
 
  • Like
Reactions: Krevnik

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Double the performance at what? You can torture numbers enough to make them say anything.

I have a difficult time believing we’ll get 100% more performance gen over gen when it’s usually around 30%.

I suppose it's possible that the theoretical top-end Nvidia card could reach that performance if they moved to the most cutting-edge node and threw in as many RT cores as they could, power efficiency be damned.

I’m sure the Mac Pro will be fine though. Alder Lake took the performance crown by going to nuclear temps and that didn’t shake the world.
 
  • Like
Reactions: Zdigital2015

iPadified

macrumors 68020
Apr 25, 2017
2,014
2,257
Nvidia GPUs have increased their efficiency. Nvidia has increased the performance of its GPUs much more than their power consumption.

For example, Blender rendering time (BMW scene) has plummeted since OPTIX (Nvidia's ray-tracing API) was introduced.
GTX 980 - 100s (CUDA)
GTX 1080 - 83s (CUDA)
RTX 2080 - 30s (OPTIX)
RTX 3080 - 12s (OPTIX)



Apple has spoiled us because it can decrease power consumption and increase performance at the same time. But, that doesn't mean that other companies are not improving in efficiency.
Efficiency wins, as OPTIX gives a larger jump in performance than the increase in raw power usage; hence the power is used more efficiently. There is a practical upper limit, and the trick is to squeeze as much performance into that power bracket as possible. Optimised software and SoCs are ways to do that.

It will be very interesting to see what path Apple takes with M1 in the field of high performance computing and the Mac Pro.
 
  • Like
Reactions: Bodhitree

Bodhitree

macrumors 68020
Apr 5, 2021
2,085
2,216
Netherlands
Most users never think of wattage, or even know what it is for that matter. Performance is all they think about. Only Mac silicon users seem to care.

I believe that performance is important until you get to “near instant”, then even 2x more doesn‘t matter anymore because you don’t feel the difference. Then you start thinking about other things like durability and power efficiency.
 
  • Like
Reactions: Ruftzooi

oz_rkie

macrumors regular
Apr 16, 2021
177
165
I believe that performance is important until you get to “near instant”, then even 2x more doesn‘t matter anymore because you don’t feel the difference. Then you start thinking about other things like durability and power efficiency.

'Instant' what though? For compute intensive tasks, there will never be 'instant'. There will always be performance gains to be made. If you are talking about simply instant opening of apps, then I don't think that's what is being implied here by performance. Sure, if you are doing only non compute intensive tasks with your machine (which is completely fine) then yeah, performance does not really matter.
 