
vladi

macrumors 65816
Jan 30, 2010
1,008
617
Why wouldn't you go the Ax000 route? Usually those cards have a larger framebuffer and more cores than their consumer counterparts.

Quadros are a waste of money down the line. In rare situations where you need loads of VRAM it might work out for you, but those are very specific workflows. If you are into memory, then GPU farming is where you go; I don't know anyone who hangs with a single A6000 in the tower. For the A6000 money you can get at least three 4090s, which would make a killing on pretty much anything you do. Not sure about machine learning on gaming GPUs, but I know for a fact the Titan can pull it off. I used to buy RTX Titans but I won't even go for a Titan anymore, no need.
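(For what it's worth, a minimal sketch for sanity-checking the VRAM question before spending A6000 money; assumes a CUDA build of PyTorch, and the 7B-parameter model is just a hypothetical example:)

```python
# Minimal sketch: check whether the installed GPU's VRAM actually covers
# a workload before paying for an A6000. Assumes PyTorch with CUDA.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1024**3
    print(f"{props.name}: {total_gb:.1f} GB VRAM")

    # Rough training-footprint estimate: parameter count x bytes per
    # param, times ~4 to cover gradients and optimizer state (Adam).
    params = 7e9          # hypothetical 7B-parameter model
    bytes_per_param = 2   # fp16 weights
    est_gb = params * bytes_per_param * 4 / 1024**3
    print(f"Estimated training footprint: {est_gb:.0f} GB "
          f"({'fits' if est_gb < total_gb else 'does not fit'})")
```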
 

diamond.g

macrumors G4
Mar 20, 2007
11,455
2,685
OBX
Wccftech posted an article about the power draw of the 4090 compared to the 3090 Ti. This level of power draw isn't too crazy when doing a direct comparison...
[Screenshot: 4090 vs 3090 Ti power draw comparison]
 

velocityg4

macrumors 604
Dec 19, 2004
7,340
4,727
Georgia
Quadros are a waste of money down the line. In rare situations where you need loads of VRAM it might work out for you, but those are very specific workflows. If you are into memory, then GPU farming is where you go; I don't know anyone who hangs with a single A6000 in the tower. For the A6000 money you can get at least three 4090s, which would make a killing on pretty much anything you do. Not sure about machine learning on gaming GPUs, but I know for a fact the Titan can pull it off. I used to buy RTX Titans but I won't even go for a Titan anymore, no need.

Don't Quadros still get much better double precision performance, or did Nvidia stop with that delineation between the lines?
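(If you want to measure that delineation yourself, here's a rough sketch assuming PyTorch with a CUDA build; the 2·n³ FLOP count per matmul is the standard estimate:)

```python
# Rough FP64 vs FP32 throughput check (assumes PyTorch with CUDA).
# GeForce cards typically run FP64 at 1/64 of the FP32 rate; Nvidia's
# compute-oriented dies (e.g. GA100) keep a much smaller gap.
import time
import torch

def matmul_tflops(dtype, n=4096, iters=10):
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    t0 = time.perf_counter()
    for _ in range(iters):
        a @ b  # result discarded; we only care about throughput
    torch.cuda.synchronize()
    return 2 * n**3 * iters / (time.perf_counter() - t0) / 1e12

print(f"FP32: {matmul_tflops(torch.float32):.2f} TFLOPS")
print(f"FP64: {matmul_tflops(torch.float64):.2f} TFLOPS")
```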
 

vladi

macrumors 65816
Jan 30, 2010
1,008
617
Don't Quadros still get much better double precision performance, or did Nvidia stop with that delineation between the lines?

Not sure but at one point the only difference was the drivers and a soldered jumper on the board that used to act like an ID controller. I don't use it for simulations, calculations, AI, or similar stuff. Nor do I use CATIA, Solidworks, and other CAD stuff that is optimized for Quadro. Solidworks is just weird: first they were pushing OpenCL like crazy while CUDA was not supported at all, then they switched to CUDA but optimized for Quadro so you would get better performance on it. Whatever.

I use my GPU for 3D rendering, VFX, and video editing, and for quite some time gaming GPUs have worked just fine. If you want to do an Arnold render of an animated movie at once on a single GPU then you might need more VRAM than what RTX offers. That's the scenario Apple used to pit its Mac Studio against the RTX 3090, because it has shared memory, but that scenario is so far removed from reality.
 

DoFoT9

macrumors P6
Jun 11, 2007
17,586
100
London, United Kingdom
It is even worse: DLSS 3 won't work on prior cards, so the frame-gen tech that can double frame rates is only available on the 40 series cards.

And yeah, the 4080 12GB is basically a 4070 if you go by 30 Series die allocation.
Do you have a link for DLSS not working on previous cards? I hadn’t seen that myself, although I assumed as much. It does make sense though.

The 4080 12GB is really disappointing, and I agree with the tech YouTubers that they should have just gone with what they originally planned and called it the 4070…
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
so there is some hope for us 20 series owners.
My opinion of Nvidia hasn't been any lower than it is now, and as such, I really think it's more of a business decision to limit DLSS 3 to the 40 series. They're really looking to entice or pressure people to upgrade.

I'm still pretty happy with my RTX 2060 Super, though I'll consider upgrading it if AMD's event in November has something for me
 

DoFoT9

macrumors P6
Jun 11, 2007
17,586
100
London, United Kingdom
My opinion of Nvidia hasn't been any lower than it is now, and as such, I really think it's more of a business decision to limit DLSS 3 to the 40 series. They're really looking to entice or pressure people to upgrade.

I'm still pretty happy with my RTX 2060 Super, though I'll consider upgrading it if AMD's event in November has something for me
I concur wholeheartedly. I can't quite put my finger on it though. Too much 30 series surplus saturating the market? The recent crypto changes adding to the surplus? The 40 series needing to 'separate' itself from general stock? Greed? I'm not in a position to even have an opinion; this is just what I've seen from reviews and the like.

I’m in the same boat as you! My 2070 SUPER is awesome, even to this day. It does struggle if I overload OBS whilst I game and stream at the same time, but 4K@60 is totally fine for me and my poor Tarkov skills :D
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
More bad press regarding the 40 series

First, the laughable news that Nvidia is "unlaunching" the 12GB RTX 4080:

Nvidia will “unlaunch” the 12GB RTX 4080

While Nvidia states that the name was poorly chosen (yes, it was), I think there are other factors, such as AMD's imminent launch, where you can bet they would have used the lower performing, poorly named RTX 4080 as an example of how much better their cards are. This screws over the AIB makers, since they were readying their inventory for the sale of this board. Now it's just going to sit there until the dust settles and Nvidia launches a 4070.

Now it seems there's more bad news.
https://www.reddit.com/r/nvidia/comments/yc6g3u
This Reddit thread is starting to hit the news sites, and while one thread doesn't indicate a product-wide issue, it certainly raises a number of red flags.

Buildzoid makes a few good points: this adapter has fewer pins but is meant to carry up to 600 watts, though I guess these cards are limited to 450 watts. Using I = P/V, we get about 37 amps going through smaller, fewer, less robust pins. Talk about a recipe for disaster.
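(For reference, the math as a quick sketch in plain Python; the 450 W and 600 W figures are from the post above, and the 12 V supply rail is the standard assumption:)

```python
# Back-of-envelope current math for the 12VHPWR adapter (plain Python).
# I = P / V on the connector's 12 V rail.
power_w = 450   # the 4090's power limit, per the post
rated_w = 600   # the connector's rated maximum
volts = 12.0

print(f"Current at {power_w} W: {power_w / volts:.1f} A")  # ~37.5 A
print(f"Current at {rated_w} W: {rated_w / volts:.1f} A")  # 50 A
```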

 

diamond.g

macrumors G4
Mar 20, 2007
11,455
2,685
OBX
More bad press regarding the 40 series

First, the laughable news that Nvidia is "unlaunching" the 12GB RTX 4080:

Nvidia will “unlaunch” the 12GB RTX 4080

While Nvidia states that the name was poorly chosen (yes, it was), I think there are other factors, such as AMD's imminent launch, where you can bet they would have used the lower performing, poorly named RTX 4080 as an example of how much better their cards are. This screws over the AIB makers, since they were readying their inventory for the sale of this board. Now it's just going to sit there until the dust settles and Nvidia launches a 4070.

Now it seems there's more bad news.
https://www.reddit.com/r/nvidia/comments/yc6g3u
This Reddit thread is starting to hit the news sites, and while one thread doesn't indicate a product-wide issue, it certainly raises a number of red flags.

Buildzoid makes a few good points: this adapter has fewer pins but is meant to carry up to 600 watts, though I guess these cards are limited to 450 watts. Using I = P/V, we get about 37 amps going through smaller, fewer, less robust pins. Talk about a recipe for disaster.

I saw that. I wonder if the adapter is at fault, especially with having to bend the cabling to fit in cases.
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
I saw that. I wonder if the adapter is at fault, especially with having to bend the cabling to fit in cases.
There are fewer pins, 6 vs 9 I think, so each pin has to carry more amperage. Bending them, particularly horizontally as shown in the YT video, seems to cause poor connections and maybe arcing, or maybe overloading the remaining pins with power they're not really designed for.
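(Putting numbers on the per-pin load; a rough sketch that assumes the current splits evenly across pins, which is exactly what a bent or poorly seated connector defeats:)

```python
# Per-pin load comparison (plain Python). Assumes current splits evenly
# across the power pins; pin counts per the post above.
power_w, volts = 450.0, 12.0
total_a = power_w / volts  # 37.5 A total

for pins, label in [(6, "12VHPWR adapter"), (9, "3x 8-pin PCIe")]:
    print(f"{label}: {total_a / pins:.2f} A per pin")
# If one pin loses contact, its share shifts onto the remaining pins,
# which is the overload/arcing scenario described above.
```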
 

diamond.g

macrumors G4
Mar 20, 2007
11,455
2,685
OBX
There are fewer pins, 6 vs 9 I think, so each pin has to carry more amperage. Bending them, particularly horizontally as shown in the YT video, seems to cause poor connections and maybe arcing, or maybe overloading the remaining pins with power they're not really designed for.
I suspect right angle adapters will become popular.
 

MacCheetah3

macrumors 68020
Nov 14, 2003
2,287
1,234
Central MN
Quadros are a waste of money down the line. In rare situations where you need loads of VRAM it might work out for you, but those are very specific workflows. If you are into memory, then GPU farming is where you go; I don't know anyone who hangs with a single A6000 in the tower. For the A6000 money you can get at least three 4090s, which would make a killing on pretty much anything you do. Not sure about machine learning on gaming GPUs, but I know for a fact the Titan can pull it off. I used to buy RTX Titans but I won't even go for a Titan anymore, no need.
Not sure but at one point the only difference was the drivers and a soldered jumper on the board that used to act like an ID controller.
If you want to do an Arnold render of an animated movie at once on a single GPU then you might need more VRAM than what RTX offers.
From what I’ve read, the GPU architecture is basically the same. However, the professional RTX cards (no longer branded Quadro) feature ECC GDDR6, much more VRAM capacity, blower-style coolers for easier/better multi-card setups, and vGPU software support via the Studio driver (but only unlocked in conjunction with the professional VBIOS, I believe).
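(A quick way to see the ECC distinction for yourself, assuming the nvidia-smi CLI is on the PATH; sketched in Python for convenience:)

```python
# Quick ECC check (assumes the nvidia-smi CLI is installed and on PATH).
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,ecc.mode.current", "--format=csv"],
    capture_output=True, text=True, check=True,
)
print(out.stdout)
# Professional cards report "Enabled"/"Disabled"; GeForce cards
# typically report "[N/A]" since they lack ECC support.
```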

Beyond price, the performance is actually a little slower. Even though the RTX A6000 includes the full GA102 die like the 3090 Ti (though the GPU might be labeled differently), the VRAM is GDDR6 versus GDDR6X on the high-end GeForce, and the A6000 power limit is notably lower.




Granted, these differences reasonably prioritize stability/reliability, which helps to (at least somewhat) justify the price and allows for certifications with software providers.

Nonetheless, there are workloads/projects in which the high-end GeForce cards can be far more cost effective and provide good results.
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
Looks like AMD isn't adopting that new connector that seems to be melting people's 4090s. I'm guessing this connector will fall out of use pretty quickly.
 

diamond.g

macrumors G4
Mar 20, 2007
11,455
2,685
OBX
Looks like AMD isn't adopting that new connector that seems to be melting people's 4090s. I'm guessing this connector will fall out of use pretty quickly.
Igor says it isn't the connector, it's Nvidia's crap adapter. What isn't known is whether Nvidia knew the adapter design was garbage or not (well, I guess after seeing Igor's report they should know now, lol)

 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
Igor says it isn't the connector, it's Nvidia's crap adapter
I'm no engineer, so I can't say whether it is or isn't. I just go on what I find on the internet :) With that said, however, a brand-new adapter that is showing a propensity to melt, with other vendors choosing not to use it, can only spell doom for its future imo.
 

diamond.g

macrumors G4
Mar 20, 2007
11,455
2,685
OBX
I'm no engineer, so I can't say whether it is or isn't. I just go on what I find on the internet :) With that said, however, a brand-new adapter that is showing a propensity to melt, with other vendors choosing not to use it, can only spell doom for its future imo.
Yeah, it isn't a good look. The stupid part is they used "the same connector*" for the 3090 Ti and we didn't get these reports of melting.

*Technically, the newer connector has 4 more pins, which makes it different (and supposedly smarter) than the old one, but that all matters for naught if the adapter melts. Adapters from PSU makers don't seem to have this issue (looking at the Corsair version shown below) because they don't bridge extra power and ground like the Nvidia version.

[Image: Corsair 12VHPWR PSU cable (Type 4)]



I think AMD is right to skip it this time around, but you also have to figure that means none of their cards (at least from a reference point of view) are going to pull more than 450W (which isn't a bad thing per se).
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
Looks like the issue was mainly folks not plugging in the connector all the way.
Yeah, I saw that, but there seems to be a design flaw nonetheless. Still, while I'm no longer an Nvidia fan, they're doing the right thing by covering the replacement for any cards that had an issue.

A subsequent video on this has Nvidia's response, and it looks like so far only 50 cards, or 0.04% of what was shipped, have been affected.
 

MacCheetah3

macrumors 68020
Nov 14, 2003
2,287
1,234
Central MN
So does anyone have any thoughts on the 4080 12GB, er, 4070 Ti?
If it had an FE version, I'd probably try to snag one. From what I understand, the performance should be around RTX 3090 level but with updated/upgraded RT and Tensor Cores, plus the energy-efficiency gain from the smaller manufacturing process.
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
So does anyone have any thoughts on the 4080 12GB, er, 4070 Ti?
So far I've yet to see any positive reviews; everyone is just crushing Nvidia.

Linus does a good job of explaining what's good about the card but how horrible the business practice is.

GN goes full-bore hate on this card.
 