
leman

macrumors Core
Oct 14, 2008
19,517
19,664
I'm surprised that no one is bringing up the power draw of the 30xx series. People always harped on AMD for being "hot and loud," but now there's nothing but silence about the fact that their lowest-power card announced draws damn near the amount of power their high-end card did two gens ago. Not even a peep about the absurd 350 watts (and more with board partner cards) that the 3090 draws?

I agree that the ever-increasing power draw of GPUs is quite regrettable. Unfortunately, this is a direct consequence of users wanting faster and faster hardware at the expense of everything else; it encourages the manufacturers to build stuff like that.

At the same time, the 3070 looks much more interesting. Nvidia did double the ALUs while maintaining the same power consumption here. I am curious to see the benchmarks.
 
  • Like
Reactions: burgerrecords

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,659
OBX
I agree that the ever-increasing power draw of GPUs is quite regrettable. Unfortunately, this is a direct consequence of users wanting faster and faster hardware at the expense of everything else; it encourages the manufacturers to build stuff like that.

At the same time, the 3070 looks much more interesting. Nvidia did double the ALUs while maintaining the same power consumption here. I am curious to see the benchmarks.
Some of the power draw appears to be misleading. According to the folks at B3D, the 350W that the 3090 can draw (keep in mind this is a Titan RTX replacement card) is for the whole card, not just the GPU die. Most of the time it won't pull that much power.

edit: the Titan RTX is $1000 more expensive than the 3090!! It is also rated for 320W max, so 30 more watts shouldn't be that big of a deal for the performance increase.
 

burgerrecords

macrumors regular
Jun 21, 2020
222
106
Hot, expensive, and abysmal battery life

I go ahead and plug my desktop computers directly into a surge protector, but I could see how it would be frustrating if you're running on battery. Since I don't place a desktop computer on my lap, heat isn't really much of an issue either. A 10700 is about $300 USD, not too bad in my experience.
 
  • Like
Reactions: 2Stepfan

MrX8503

macrumors 68020
Sep 19, 2010
2,293
1,615
I go ahead and plug my desktop computers directly into a surge protector, but I could see how it would be frustrating if you're running on battery. Since I don't place a desktop computer on my lap, heat isn't really much of an issue either. A 10700 is about $300 USD, not too bad in my experience.

Most people own laptops. The desktop variants of Apple silicon will leapfrog Intel.
 
  • Like
Reactions: 2Stepfan

russell_314

macrumors 604
Feb 10, 2019
6,658
10,259
USA
I know this thread is all about Apple silicon... and to be honest, I am still very hyped about all the Apple silicon mystery, and I am pretty biased when it comes to evaluating Mac vs Windows.

But... the Nvidia RTX 30 series that was just announced is... killing me. I am set on buying an Apple silicon MacBook by the middle of next year at the latest, but this RTX 30 series is making me think twice, and the pricing now seems reasonable for the performance compared to the RTX 20 series.

I know... everyone will say, if you think that way just go for Windows, why even bother still thinking of a Mac?

Well, I know the final decision will be up to me. I just want to bring this up as a discussion to hear everyone's opinion on it. :)

Ideally, the point of discussion is: even though Windows may have better graphics in the short term or near future, what makes you still think a Mac is the better choice if both OSes can work for you? Of course, other discussion is welcome.
If you're in the market for a high-end NVIDIA GPU, then why are you looking at a Mac at all? I mean, other than having X amount to spend on a toy, one isn't even comparable to the other. That's like saying you like the new Ford pickup truck for its towing capacity but aren't sure if you should buy a Mustang LOL
 

jinnyman

macrumors 6502a
Sep 2, 2011
762
671
Lincolnshire, IL
If you're in the market for a high-end NVIDIA GPU, then why are you looking at a Mac at all? I mean, other than having X amount to spend on a toy, one isn't even comparable to the other. That's like saying you like the new Ford pickup truck for its towing capacity but aren't sure if you should buy a Mustang LOL
I agree.
If someone is considering going nVidia, why consider a Mac?
I guess you can run it via Boot Camp, but what's the point of spending that much on a MP 7,1 and running Windows for nVidia? I get that you'd want one system for all your needs. But if you value Mac OS enough to go for a MP 7,1, you'd want to run Mac OS. If circumstances force you to run nVidia anyway, isn't it better to build a dedicated Windows PC with nVidia and run both?

If I were to get a custom tower Mac, I'd want all the internal area to be dedicated to Mac OS. Having both nVidia and AMD GPUs for both Windows and Mac OS seems cumbersome and downright unappealing to me.
 

throAU

macrumors G3
Feb 13, 2012
9,198
7,346
Perth, Western Australia
I know this thread is all about Apple silicon... and to be honest, I am still very hyped about all the Apple silicon mystery, and I am pretty biased when it comes to evaluating Mac vs Windows.

But... the Nvidia RTX 30 series that was just announced is... killing me. I am set on buying an Apple silicon MacBook by the middle of next year at the latest, but this RTX 30 series is making me think twice, and the pricing now seems reasonable for the performance compared to the RTX 20 series.

I know... everyone will say, if you think that way just go for Windows, why even bother still thinking of a Mac?

Well, I know the final decision will be up to me. I just want to bring this up as a discussion to hear everyone's opinion on it. :)

Ideally, the point of discussion is: even though Windows may have better graphics in the short term or near future, what makes you still think a Mac is the better choice if both OSes can work for you? Of course, other discussion is welcome.


All I'll say is this:

Why do you think Nvidia have released these so cheaply? They're not in the business of selling stuff cheap. They sell for as much as they think the market will bear. They also shot first without waiting for Big Navi to come out. Maybe just because they were ready - maybe they know something about Navi 2 performance we don't.

Which makes me think that, as good as the Nvidia cards look vs. what's out today, we're going to see a lot of very high-performance GPUs coming from other vendors (not just AMD) in the near future.

If Nvidia thought they had such a big performance lead, they wouldn't have dropped price to this degree.
 

russell_314

macrumors 604
Feb 10, 2019
6,658
10,259
USA
I agree.
If someone is considering going nVidia, why consider a Mac?
I guess you can run it via Boot Camp, but what's the point of spending that much on a MP 7,1 and running Windows for nVidia? I get that you'd want one system for all your needs. But if you value Mac OS enough to go for a MP 7,1, you'd want to run Mac OS. If circumstances force you to run nVidia anyway, isn't it better to build a dedicated Windows PC with nVidia and run both?

If I were to get a custom tower Mac, I'd want all the internal area to be dedicated to Mac OS. Having both nVidia and AMD GPUs for both Windows and Mac OS seems cumbersome and downright unappealing to me.
Exactly. These GPUs are meant for a gaming desktop running Windows. I love my Mac, but comparing it to a gaming desktop is like comparing a compact car to a pickup truck. Maybe the OP just has some money burning a hole in his pocket and is trying to decide what he wants to spend it on. In that case, I would say if gaming is his biggest interest, then get the card with a Windows PC. I mean, if someone is considering spending $500+ on a GPU, they either really enjoy playing games or their current GPU is something like an Nvidia 9 series or older.
 
  • Like
Reactions: jerryk

Shivetya

macrumors 68000
Jan 16, 2008
1,669
306
I doubt we will see any ray tracing ability from Apple in any soon-to-arrive GPU. Apple has had more than enough years to slip their own GPU into these systems but did not, and powering an iPad, while it seems similar, is far from comparable when it comes to the graphical complexity of PC and console games.

As for why the pricing is so good on these new NVidia cards compared to previous tiers when it comes to performance: new chip processes and plants introduce efficiency, and you gain wider and quicker adoption by not exploiting your audience, instead of taking the Apple route, where many times they do.
 
Last edited:

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,659
OBX
All I'll say is this:

Why do you think Nvidia have released these so cheaply? They're not in the business of selling stuff cheap. They sell for as much as they think the market will bear. They also shot first without waiting for Big Navi to come out. Maybe just because they were ready - maybe they know something about Navi 2 performance we don't.

Which makes me think that, as good as the Nvidia cards look vs. what's out today, we're going to see a lot of very high-performance GPUs coming from other vendors (not just AMD) in the near future.

If Nvidia thought they had such a big performance lead, they wouldn't have dropped price to this degree.
From my understanding, they did this before when the 10 series cards came out. The oddball was the 20 series cards, where they raised prices but performance wasn't that great.
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
I doubt we will see any ray tracing ability from Apple in any soon-to-arrive GPU.

The interesting thing is that this year's Apple Metal ships with rather feature-complete raytracing support, including goodies like dynamic linking of shader functions, user-programmable intersection shaders, and recursive shader execution. Right now it runs in "software" (GPU-accelerated, of course), but it seems to suggest that hardware raytracing is coming.
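To give a feel for it, here is a minimal host-side sketch in Swift of the new acceleration-structure API. Names like `vertexBuffer`, `queue`, and `triangleCount` are placeholders, and the actual intersection shading would be written in the Metal shading language; treat it as an illustration of the API shape, not production code.

```swift
import Metal

// Sketch: build a primitive acceleration structure that ray queries and
// intersection functions can traverse. Assumes `device`, `queue`, and a
// `vertexBuffer` of SIMD3<Float> triangle vertices already exist.
func buildAccelerationStructure(device: MTLDevice,
                                queue: MTLCommandQueue,
                                vertexBuffer: MTLBuffer,
                                triangleCount: Int) -> MTLAccelerationStructure? {
    let geometry = MTLAccelerationStructureTriangleGeometryDescriptor()
    geometry.vertexBuffer = vertexBuffer
    geometry.vertexStride = MemoryLayout<SIMD3<Float>>.stride
    geometry.triangleCount = triangleCount

    let descriptor = MTLPrimitiveAccelerationStructureDescriptor()
    descriptor.geometryDescriptors = [geometry]

    // Metal reports how much memory the structure and build scratch space need.
    let sizes = device.accelerationStructureSizes(descriptor: descriptor)
    guard let accel = device.makeAccelerationStructure(size: sizes.accelerationStructureSize),
          let scratch = device.makeBuffer(length: sizes.buildScratchBufferSize,
                                          options: .storageModePrivate),
          let commandBuffer = queue.makeCommandBuffer(),
          let encoder = commandBuffer.makeAccelerationStructureCommandEncoder()
    else { return nil }

    // The build and the subsequent ray traversal are GPU compute work today,
    // which is what "software but GPU-accelerated" means in practice.
    encoder.build(accelerationStructure: accel,
                  descriptor: descriptor,
                  scratchBuffer: scratch,
                  scratchBufferOffset: 0)
    encoder.endEncoding()
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
    return accel
}
```

The point is that the API is already shaped the way hardware raytracing wants it to be; a future GPU could accelerate the traversal without code like the above having to change.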
 
  • Like
Reactions: Roode

tdar

macrumors 68020
Jun 23, 2003
2,102
2,522
Johns Creek Ga.
This thread is full of what they called on the old Perry Mason show "facts not in evidence." Apple GPU (yes, that's a thing now) is going to be very surprising to a great many people. No one (that's allowed to talk) has any idea how it compares. But the tidbits that have leaked are mind-blowing. I'd say that we should wait for (to complete my old TV show theme) "the rest of the story."
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
This thread is full of what they called on the old Perry Mason show "facts not in evidence." Apple GPU (yes, that's a thing now) is going to be very surprising to a great many people. No one (that's allowed to talk) has any idea how it compares. But the tidbits that have leaked are mind-blowing. I'd say that we should wait for (to complete my old TV show theme) "the rest of the story."

Why, we have a fairly good idea. Apple GPUs are right here, in countless iPhones and iPads. It is not too difficult to extrapolate their performance when scaled up to desktop class.
 
  • Like
Reactions: sirio76

Shivetya

macrumors 68000
Jan 16, 2008
1,669
306
Why, we have a fairly good idea. Apple GPUs are right here, in countless iPhones and iPads. It is not too difficult to extrapolate their performance when scaled up to desktop class.

Sorry, but that does not mean anything. They are likely no more than the equivalent of integrated graphics, and worse, we have no proof Apple will have discrete memory for their solution. Rendering on a tablet or phone is not truly comparable to a desktop. The complexity of the games on iOS does not begin to compare.
 

leman

macrumors Core
Oct 14, 2008
19,517
19,664
They are likely no more than the equivalent of integrated graphics

Define "integrated graphics" and what is bad about it. Isn't the performance what matters in the end? I mean, iPad Pro using a 2 year old mobile "integrated" chip has the same performance as a 50 watt two year old Nvidia GPU.

and worse, we have no proof Apple will have discrete memory for their solution.

Actually, Apple has been dropping very clear hints that they are not going to use separate VRAM. They intend to do full unified memory. How they will achieve it technically, we don't know yet. Probably by using very fast RAM shared by the CPU and GPU.
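You can already see the programming model this implies in today's Metal. A toy Swift sketch of the shared-storage path (how Apple gets desktop-class bandwidth out of it is the open question):

```swift
import Metal

// Toy example of the unified-memory model Metal already exposes:
// with .storageModeShared the CPU and GPU read and write the same
// allocation, so there is no separate VRAM copy to manage.
guard let device = MTLCreateSystemDefaultDevice(),
      let buffer = device.makeBuffer(length: 1024 * MemoryLayout<Float>.stride,
                                     options: .storageModeShared)
else { fatalError("No Metal device available") }

// The CPU writes straight into the memory the GPU will read.
let values = buffer.contents().bindMemory(to: Float.self, capacity: 1024)
for i in 0..<1024 { values[i] = Float(i) }

// A compute encoder can now bind this buffer with setBuffer(_:offset:index:)
// and see the same bytes, with no blit/upload pass in between.
```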

Rendering on a tablet or phone is not truly comparable to a desktop. The complexity of the games on iOS does not begin to compare.

Games on iOS are as complex as you make them. So far, the only "objective" comparisons we have are GFXbench and compute benchmarks. To my knowledge, GFXbench uses exactly the same geometry to test mobile and desktop GPUs.
 
  • Like
Reactions: 2Stepfan

tdar

macrumors 68020
Jun 23, 2003
2,102
2,522
Johns Creek Ga.
Why, we have a fairly good idea. Apple GPUs are right here, in countless iPhones and iPads. It is not too difficult to extrapolate their performance when scaled up to a desktop-class.
AS is not just something new. It's a paradigm shift that is going to cause you to have to forget everything that you know. We will have to wait until all of the covers are pulled back before we can truly understand the entire story.
 

sirio76

macrumors 6502a
Mar 28, 2013
578
416
Sorry but that does not mean anything. They are likely no more than equivalent of integrated graphics and worse we have no proof Apple will have discreet memory for their solution. Rendering on a tablet or phone is not truly comparable to a desktop. The complexity of the games on iOS does not begin to compare.
I do not game, but I use several pro 3D applications, some of which also have a mobile version (AutoCAD, FormZ, Rhino, etc.). I performed several tests a few years ago (probably late 2015) using the first-gen iPad Pro and a maxed-out Mac Pro 6,1, and the results were better on the iPad Pro: butter-smooth scrolling in AutoCAD and smooth spinning of 3D files in FormZ/Rhino. Of course, when dealing with large files the iPad Pro reaches its limit much sooner than a desktop, but that's mostly due to the small RAM/VRAM amount. I'm pretty confident that even an integrated GPU will do a more than decent job of 3D handling, and after all there's no need to guess... we already saw that in the latest keynote, where a Maya production scene ran smoothly (and in emulation!) on the A12-equipped developer kit.
 

JMacHack

Suspended
Mar 16, 2017
1,965
2,424
Some of the power draw appears to be misleading. According to the folks at B3D, the 350W that the 3090 can draw (keep in mind this is a Titan RTX replacement card) is for the whole card, not just the GPU die. Most of the time it won't pull that much power.

edit: the Titan RTX is $1000 more expensive than the 3090!! It is also rated for 320W max, so 30 more watts shouldn't be that big of a deal for the performance increase.
I mean, peak power draw is peak power draw; you need to account for it in your system. I mostly ignore the 3090 since it's a boutique item; I guess if someone wants the best of the best, cost, heat, and electricity be damned, then that's their prerogative. The 3080 and 3070, though, are the silly part. I'm honestly flabbergasted more by the overwhelmingly positive reaction to the marketing, which, if it came from another company, would bring skepticism. It's one of the things that feeds my opinion that "gamers" are idiots.

I agree that the ever-increasing power draw of GPUs is quite regrettable. Unfortunately, this is a direct consequence of users wanting faster and faster hardware at the expense of everything else; it encourages the manufacturers to build stuff like that.
I disagree. Maxwell to Pascal had a performance jump similar to Turing to Ampere, with power draw remaining the same or decreasing. In my opinion, it seems like Nvidia weren't hitting the performance targets they were expecting, so they just clocked the nuts off the cards, power draw be damned.
 

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,659
OBX
AS is not just something new. It's a paradigm shift that is going to cause you to have to forget everything that you know. We will have to wait until all of the covers are pulled back before we can truly understand the entire story.
For mobile rendering, maybe. Desktop rendering, I doubt it.
 

Andropov

macrumors 6502a
May 3, 2012
746
990
Spain
AS is not just something new. It's a paradigm shift that is going to cause you to have to forget everything that you know. We will have to wait until all of the covers are pulled back before we can truly understand the entire story.

Well, we have had Apple Silicon on iPads for a while now, with the GPU using shared memory with the CPU, so it's not all that new. The scaling to desktop components is going to be interesting, though, and likely well ahead of the competition, but calling it a paradigm shift is a bit much IMHO.
 

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,659
OBX
I mean, peak power draw is peak power draw; you need to account for it in your system. I mostly ignore the 3090 since it's a boutique item; I guess if someone wants the best of the best, cost, heat, and electricity be damned, then that's their prerogative. The 3080 and 3070, though, are the silly part. I'm honestly flabbergasted more by the overwhelmingly positive reaction to the marketing, which, if it came from another company, would bring skepticism. It's one of the things that feeds my opinion that "gamers" are idiots.


I disagree. Maxwell to Pascal had a performance jump similar to Turing to Ampere, with power draw remaining the same or decreasing. In my opinion, it seems like Nvidia weren't hitting the performance targets they were expecting, so they just clocked the nuts off the cards, power draw be damned.
Boost clocks are the same. Base clocks are lower. The power draw is because they more than doubled the number of CUDA cores. I don't see how Apple can match performance without doing the same.
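Back-of-envelope, with made-up but plausible numbers (not NVIDIA data): first-order dynamic power scales roughly with active unit count × clock × voltage², so doubling the FP32 units at a similar boost clock eats most of what the newer process gives back.

```swift
// Rough first-order model: relative dynamic power ≈
// (unit count ratio) × (clock ratio) × (voltage ratio)² / (process efficiency gain).
func relativePower(unitRatio: Double, clockRatio: Double,
                   voltageRatio: Double, processGain: Double) -> Double {
    unitRatio * clockRatio * voltageRatio * voltageRatio / processGain
}

// Hypothetical inputs: 2x FP32 units, same boost clock, slightly lower
// voltage, ~30% efficiency gain from the node change.
let scale = relativePower(unitRatio: 2.0, clockRatio: 1.0,
                          voltageRatio: 0.95, processGain: 1.3)
print("Roughly \((scale * 100).rounded() / 100)x the previous card's power draw") // ~1.39x
// 1.39 × a ~250 W Turing flagship lands right around the 350 W board power
// being discussed, which fits the "more units, not higher clocks" reading.
```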
 

Kostask

macrumors regular
Jul 4, 2020
230
104
Calgary, Alberta, Canada
They did not double the amount of CUDA cores. They have the same number of CUDA cores, but each CUDA core can now handle two sets of data, where the 20 series cards could only handle one. Nvidia says they have double the CUDA cores, but what they have is twice the data going through.

The thought is that nVidia is going to bury AMD's new Big Navi when it comes out. What nVidia has done is paint themselves into a corner, in my mind. The 3090 will indeed be faster than the fastest Big Navi card. What AMD will do, and this is why I am saying that nVidia has painted themselves into a corner, is allow most of the Big Navi cards to use Crossfire. nVidia only has an SLI connector on the 3090, not on the 3070, 3080, or any other card. AMD could just say: if you want the fastest gaming solution, use two of our upper-mid-range to top-end cards in Crossfire. There is a running chance that that will be a less expensive, and possibly faster, solution than a single 3090. It's not like you can use two 3070s or 3080s together. Sure, you could use two 3090s together, but at $1,499 each, that is a fairly expensive solution.
 

diamond.g

macrumors G4
Mar 20, 2007
11,437
2,659
OBX
They did not double the amount of CUDA cores. They have the same number of CUDA cores, but each CUDA core can now handle two sets of data, where the 20 series cards could only handle one. Nvidia says they have double the CUDA cores, but what they have is twice the data going through.

The thought is that nVidia is going to bury AMD's new Big Navi when it comes out. What nVidia has done is paint themselves into a corner, in my mind. The 3090 will indeed be faster than the fastest Big Navi card. What AMD will do, and this is why I am saying that nVidia has painted themselves into a corner, is allow most of the Big Navi cards to use Crossfire. nVidia only has an SLI connector on the 3090, not on the 3070, 3080, or any other card. AMD could just say: if you want the fastest gaming solution, use two of our upper-mid-range to top-end cards in Crossfire. There is a running chance that that will be a less expensive, and possibly faster, solution than a single 3090. It's not like you can use two 3070s or 3080s together. Sure, you could use two 3090s together, but at $1,499 each, that is a fairly expensive solution.
Multi-GPU hasn't been a popular way to increase graphics performance for a while now (as Apple found out with the trash can Mac Pro).
And if you look at the number of shader processors, it has doubled (2080 to 3080). Even the 3070 has more shader processors than the 2080.
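Whichever way you count it ("twice the cores" or "twice the FP32 per core"), the peak-throughput arithmetic comes out the same. A quick Swift sanity check using the launch specs as I remember them (double-check against NVIDIA's own pages before relying on the exact figures):

```swift
import Foundation

// Peak FP32 throughput = shader count × 2 FLOPs per clock (FMA) × boost clock.
// The specs below are the announced launch figures as I recall them;
// treat them as illustrative, not authoritative.
struct Card {
    let name: String
    let shaders: Int        // "CUDA cores" as NVIDIA counts them
    let boostGHz: Double
    var teraflops: Double { Double(shaders) * 2 * boostGHz / 1000 }
}

let cards = [
    Card(name: "RTX 2080", shaders: 2944, boostGHz: 1.71),
    Card(name: "RTX 3070", shaders: 5888, boostGHz: 1.73),
    Card(name: "RTX 3080", shaders: 8704, boostGHz: 1.71)
]

for card in cards {
    print("\(card.name): ~\(String(format: "%.1f", card.teraflops)) TFLOPS FP32")
}
// Prints roughly 10, 20, and 30 TFLOPS. Whether that peak shows up in games
// is another matter, since half of the doubled FP32 units share a datapath
// with INT32 work.
```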
 