
velocityg4

macrumors 604
Dec 19, 2004
7,330
4,724
Georgia
Neat, but I'm more in the market for mid-range cards. I'm not sure I'm willing to wait for the mid-range models to come out, since those probably won't arrive for a while. With the 4080 starting at 12GB, I'm concerned the mid-range cards will still be 8GB, which is unacceptable to me.

Right now I'm considering an RX 6700 XT to replace my RX 580 8GB. I'd like a 4060 - 4070 or 7600 - 7700, but those likely won't be around until early next year. Not that it really matters. I should be able to mod GTA V and Skyrim with extreme detail mods and max out the settings with the 6700 XT. I'd just like better ray tracing so I can try it in some games, and the 3060 Ti has too little VRAM.
 

diamond.g

macrumors G4
Mar 20, 2007
11,405
2,638
OBX
Neat, but I'm more in the market for mid-range cards. I'm not sure I'm willing to wait for the mid-range models to come out, since those probably won't arrive for a while. With the 4080 starting at 12GB, I'm concerned the mid-range cards will still be 8GB, which is unacceptable to me.

Right now I'm considering an RX 6700 XT to replace my RX 580 8GB. I'd like a 4060 - 4070 or 7600 - 7700, but those likely won't be around until early next year. Not that it really matters. I should be able to mod GTA V and Skyrim with extreme detail mods and max out the settings with the 6700 XT. I'd just like better ray tracing so I can try it in some games, and the 3060 Ti has too little VRAM.
Yeah, it's a hard decision. AMD cards are basically trash at RT (compared to Nvidia). You could always get a 30 series card.
I'm not sure how the market is going to take the 104 die (which was pushed down the stack to the x070 line with the 30 series) being pushed back up the stack with a price increase (even though performance did increase with respect to TGP). Heck, the 4080 16GB is even using a different die than the 4090 (103 vs 102). I wouldn't be surprised if the 4070 is either a more gimped 104 die, or if they wait till the 106 die is ready and go from there. Really, it's going to depend on what RDNA 3 comes up with.
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
I just started seeing YouTube videos on this, and one of them mentioned that there are two 4080s, a 16GB and a 12GB. It seems the smaller one is there only to make the case that the price point is comparable to the 3080 MSRP, but in reality it's more of a 4070, as it has a lot fewer cores, not just less VRAM.

TL;DR: Nvidia is raising prices and trying to fool consumers into thinking the 40 series cards are a great deal.

I have a lowly 2060 Super, and I may buy a 3070 now, just because I won't want a 40 series and this will do me for a very long time.
 
  • Like
Reactions: Donfor39

belvdr

macrumors 603
Aug 15, 2005
5,945
1,372
We bought a refurb Alienware laptop with a 3080 and it will last us quite a while. I don't game much anymore.
 
  • Like
Reactions: Donfor39

diamond.g

macrumors G4
Mar 20, 2007
11,405
2,638
OBX
I just started seeing YouTube videos on this, and one of them mentioned that there are two 4080s, a 16GB and a 12GB. It seems the smaller one is there only to make the case that the price point is comparable to the 3080 MSRP, but in reality it's more of a 4070, as it has a lot fewer cores, not just less VRAM.

TL;DR: Nvidia is raising prices and trying to fool consumers into thinking the 40 series cards are a great deal.

I have a lowly 2060 Super, and I may buy a 3070 now, just because I won't want a 40 series and this will do me for a very long time.
It's even worse: DLSS 3 won't work on prior cards, so the frame generation tech that can double frame rates is only available on 40 series cards.

And yeah, the 4080 12GB is basically a 4070 if you go by the 30 series die allocation.
 

velocityg4

macrumors 604
Dec 19, 2004
7,330
4,724
Georgia
I just started seeing YouTube videos on this, and one of them mentioned that there are two 4080s, a 16GB and a 12GB. It seems the smaller one is there only to make the case that the price point is comparable to the 3080 MSRP, but in reality it's more of a 4070, as it has a lot fewer cores, not just less VRAM.

TL;DR: Nvidia is raising prices and trying to fool consumers into thinking the 40 series cards are a great deal.

I have a lowly 2060 Super, and I may buy a 3070 now, just because I won't want a 40 series and this will do me for a very long time.

Yes, I don't get that naming. At least make them a 4080 Ti and 4080, or a 4080 and 4070 Ti/4075, because the difference between the two 4080s is quite substantial.

Yeah, it's a hard decision. AMD cards are basically trash at RT (compared to Nvidia). You could always get a 30 series card.
I'm not sure how the market is going to take the 104 die (which was pushed down the stack to the x070 line with the 30 series) being pushed back up the stack with a price increase (even though performance did increase with respect to TGP). Heck, the 4080 16GB is even using a different die than the 4090 (103 vs 102). I wouldn't be surprised if the 4070 is either a more gimped 104 die, or if they wait till the 106 die is ready and go from there. Really, it's going to depend on what RDNA 3 comes up with.

I've looked at the 30 series, and 8GB VRAM is DOA to me; 12GB is my minimum, and the 3060 isn't enough of an upgrade to bother with. The RX 7000 series is purportedly going to get massive ray tracing upgrades, but next year is too long to wait. Nor do I want to risk another crypto cycle. Yeah, Ethereum is dead for mining, but there are a whole bunch of other coins that would become profitable to mine if prices rebound.
 

diamond.g

macrumors G4
Mar 20, 2007
11,405
2,638
OBX
Yes, I don't get that naming. At least make them a 4080 Ti and 4080, or a 4080 and 4070 Ti/4075, because the difference between the two 4080s is quite substantial.



I've looked at the 30 series, and 8GB VRAM is DOA to me; 12GB is my minimum, and the 3060 isn't enough of an upgrade to bother with. The RX 7000 series is purportedly going to get massive ray tracing upgrades, but next year is too long to wait. Nor do I want to risk another crypto cycle. Yeah, Ethereum is dead for mining, but there are a whole bunch of other coins that would become profitable to mine if prices rebound.
How much are you looking to spend? What resolution do you run your games at?

I wouldn't expect AMD to catch Lovelace WRT RT performance. I'm wondering if it would be prudent to wait on the 7950 XT (which will likely drop next year). I don't have many RT games, but there is a noticeable drop (I play at 3440x1440) when I do turn it on. Stray drops from 144 fps to like 20 when DX12 mode is forced.
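
To put that in frame-time terms, here's a quick back-of-the-envelope sketch in Python (the fps figures are just the rough numbers above, nothing measured):

# Convert frames per second to milliseconds spent per frame: ms = 1000 / fps.
for fps in (144, 20):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 144 fps -> 6.9 ms per frame
# 20 fps -> 50.0 ms per frame

So forcing DX12/RT roughly septuples the time the card spends on each frame.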
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
The more Nvidia is in the news, the more greedy they seem to be; the two 4080s are just another example.

Too bad Intel is just unable to get out of their own way and produce a quality competing GPU, and AMD is close - so close I may consider them.

I'm not looking for 40xx performance, but something inexpensive that markedly improves on my 2060. For the games I play the 2060 is fine, which is why I may sit this out and wait for the next generation in a few years.
 

velocityg4

macrumors 604
Dec 19, 2004
7,330
4,724
Georgia
How much are you looking to spend? What resolution do you run your games at?

I wouldn't expect AMD to catch Lovelace WRT RT performance. I'm wondering if it would be prudent to wait on the 7950 XT (which will likely drop next year). I don't have many RT games, but there is a noticeable drop (I play at 3440x1440) when I do turn it on. Stray drops from 144 fps to like 20 when DX12 mode is forced.

$400 is about as far as I'm willing to go, which gets me the 6700 XT. The next step up is too much of a price increase for too modest a performance increase. The rumors I've read say RDNA 3 in the 7700 XT is expected to beat the 6900 XT in ray tracing. It probably won't beat the 40 series, but it's not like I need a ton, as I tend to play games once they've been out four or five years. The only games I really care about this stuff for are ones I can mod a lot in single player.

The only games I'll probably buy new in the future are Sims 5, Elder Scrolls 6, and GTA 6, unless they gimp modding support. But I don't think those are due out for a few years, so I'll be in the market for a new GPU (and computer) by then. I've only waited this long because of how crypto destroyed GPU prices; otherwise I'd have upgraded over a year ago when I built my current computer.

Other story-driven games I can't mod just need to look decent enough, as I blow through them and never play them again. Heck, about 90% of the games in my Steam library I only played for an hour or two before flagging them as 'Sucks' to hide them and never playing again. That's why I mostly buy older games, so I can get them for $2 to $10, since odds are I'll hate them.

There are very few games I buy new. I'm sure a 4080 would be awesome, but not $800-$900 awesome.
 

diamond.g

macrumors G4
Mar 20, 2007
11,405
2,638
OBX
$400 is about as far as I'm willing to go, which gets me the 6700 XT. The next step up is too much of a price increase for too modest a performance increase. The rumors I've read say RDNA 3 in the 7700 XT is expected to beat the 6900 XT in ray tracing. It probably won't beat the 40 series, but it's not like I need a ton, as I tend to play games once they've been out four or five years. The only games I really care about this stuff for are ones I can mod a lot in single player.

The only games I'll probably buy new in the future are Sims 5, Elder Scrolls 6, and GTA 6, unless they gimp modding support. But I don't think those are due out for a few years, so I'll be in the market for a new GPU (and computer) by then. I've only waited this long because of how crypto destroyed GPU prices; otherwise I'd have upgraded over a year ago when I built my current computer.

Other story-driven games I can't mod just need to look decent enough, as I blow through them and never play them again. Heck, about 90% of the games in my Steam library I only played for an hour or two before flagging them as 'Sucks' to hide them and never playing again. That's why I mostly buy older games, so I can get them for $2 to $10, since odds are I'll hate them.

There are very few games I buy new. I'm sure a 4080 would be awesome, but not $800-$900 awesome.
6900 XTs may be around $400 when the 7000 series launches. Maybe. I do wonder if AMD will wait till next year to drop the 7700 and lower lineup, since Nvidia seems to be sticking to the high end this year.
 

velocityg4

macrumors 604
Dec 19, 2004
7,330
4,724
Georgia
6900 XTs may be around $400 when the 7000 series launches. Maybe. I do wonder if AMD will wait till next year to drop the 7700 and lower lineup, since Nvidia seems to be sticking to the high end this year.
Probably. They want to sell as many of the flagship cards as they can at full MSRP. I'm guessing it also takes a while for manufacturing to ramp up on the new architecture before they can get the volume they need for lower-end models.

I'm willing to wait until mid-October to see if the 6800 or 6800 XT gets into my price range. I can't imagine a 3080 Ti would slip in there. I've got an 850W Seasonic PSU that ranks 9.5/10 on jonnyguru, so I'm good to go there, although it's a bit overkill for my i5-11400.

I could also just toss in another 580 8GB I have lying around, but the last time I tried CrossFire it wasn't very stable, so I pulled the card.
 

diamond.g

macrumors G4
Mar 20, 2007
11,405
2,638
OBX
The 3080 Ti and above will probably hover around $800+ for a while. 6900 XTs are around $700 currently, so another $300 drop when the 7900 XT is announced is possible (unless AMD prices the 7900 XT really high).
 

vladi

macrumors 65816
Jan 30, 2010
1,007
616
The 4090 is a workstation dream if you're all about GPU workflows, and I don't know why you wouldn't be.
 

CrpyticComedian

macrumors newbie
Original poster
Nov 8, 2021
5
0
The more Nvidia is in the news, the more greedy they seem to be; the two 4080s are just another example.

Too bad Intel is just unable to get out of their own way and produce a quality competing GPU, and AMD is close - so close I may consider them.

I'm not looking for 40xx performance, but something inexpensive that markedly improves on my 2060. For the games I play the 2060 is fine, which is why I may sit this out and wait for the next generation in a few years.

Huang even tried to justify the price hike in a recent interview. Honestly, Nvidia charges high prices because they know all the crypto miners will buy their cards regardless.

 

c0ppo

macrumors 68000
Feb 11, 2013
1,890
3,268
I still use a 5700 XT. I don't really need to change it, since it's more than enough for me.
But these prices are insane. You can add 50%+ to the price to get the final price in my country.

I will stick with what I have. But when the time comes to upgrade, if prices remain like this, I'm simply not upgrading. Gonna just use my MacBook.
 
  • Like
Reactions: Donfor39

russell_314

macrumors 604
Feb 10, 2019
6,640
10,228
USA

Huang even tried to justify the price hike in a recent interview. Honestly, Nvidia charges high prices because they know all the crypto miners will buy their cards regardless.

Well, since so many cards were scalped and sold for double the price, Nvidia figured out that they can raise their prices. They could sell a 3080-type card for $1,500 and people would buy as many as they could make. There would be no reduction in sales.

I don’t like it, but that’s just reality. About six months ago, or maybe a little longer, I was in the market for a 30 series card, but I couldn’t find one at anything close to MSRP. I ended up buying a prebuilt system. That system will keep me gaming for at least another two years.
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
Honestly, Nvidia charges high prices because they know all the crypto miners will buy their cards regardless.
I think that ship has sailed. With Ethereum on proof of stake, the GPU market has plummeted and the number of GPUs on eBay is ballooning.

I do agree with you regarding Nvidia charging the price because they know people will pay it. For the first time in a long time, I'll consider AMD over Nvidia. I'm not a heavy gamer, and I'm not willing to spend a lot of money.
 

dmr727

macrumors G4
Dec 29, 2007
10,636
5,708
NYC
I do agree with you regarding Nvidia charging the price because they know people will pay it. For the first time in a long time, I'll consider AMD over Nvidia. I'm not a heavy gamer, and I'm not willing to spend a lot of money.

I'm in the same boat, and I think my next card will be from AMD just on general principle. I don't find the RT stuff to be game-changing, and my CUDA programming is merely a hobby that can be refocused elsewhere. I switched to AMD for my CPU because of Intel shenanigans and haven't been disappointed - might as well give the GPU side of it a try. :)
 
  • Like
Reactions: maflynn

velocityg4

macrumors 604
Dec 19, 2004
7,330
4,724
Georgia
Well, since so many cards were scalped and sold for double the price, Nvidia figured out that they can raise their prices. They could sell a 3080-type card for $1,500 and people would buy as many as they could make. There would be no reduction in sales.

I don’t like it, but that’s just reality. About six months ago, or maybe a little longer, I was in the market for a 30 series card, but I couldn’t find one at anything close to MSRP. I ended up buying a prebuilt system. That system will keep me gaming for at least another two years.

What I'd like to know is how much of that even went to Nvidia or AMD. When scalpers started getting insane prices, the actual card makers (ASUS, PowerColor, etc.) started releasing new SKUs with much higher MSRPs. The old SKUs remained, but only on paper; they just started shipping the new ones instead.

Did Nvidia or AMD even get to sell the components at much higher prices, or was it the card makers reaping the higher profit margins? They still did well, because they could sell everything they could make. I know Nvidia also made their miner series and LHR series, which they could jack up component prices on. But I'd think they'd have supply contracts with the card makers that would limit their own price hikes.

I'm just curious: who actually reaped the higher per-card margins?

I think that ship has sailed. With Ethereum on proof of stake, the GPU market has plummeted and the number of GPUs on eBay is ballooning.

I do agree with you regarding Nvidia charging the price because they know people will pay it. For the first time in a long time, I'll consider AMD over Nvidia. I'm not a heavy gamer, and I'm not willing to spend a lot of money.

For now. There are plenty of other crypto coins that were profitable; Ethereum was just generally the most profitable. If prices go through the roof again, people could just mine something else.
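
To make "becomes profitable" concrete, here's a minimal sketch of the usual miner math in Python (every number below is hypothetical, chosen only for illustration; none are real coin or hardware figures): daily profit is coins mined per day times coin price, minus electricity cost.

# Hypothetical profitability check; all figures are made up for illustration.
def daily_profit(coins_per_day, coin_price, card_watts, usd_per_kwh):
    revenue = coins_per_day * coin_price
    electricity = card_watts / 1000 * 24 * usd_per_kwh  # kWh burned per day, priced
    return revenue - electricity

# The same card flips from losing money to profitable purely on a price rebound.
for price in (10, 30):
    print(f"coin at ${price}: ${daily_profit(0.05, price, 220, 0.12):+.2f}/day")
# coin at $10: $-0.13/day
# coin at $30: $+0.87/day

That's the whole mechanism behind the worry: nothing about the hardware changes, only the coin price.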
 

th0masp

macrumors 6502a
Mar 16, 2015
848
514
I need Nvidia cards for work, so they're the only game in town for me anyway. I find the VRAM sizes a real letdown for anything below the 4090. As it is, only the 16 GB 4080 and up would be an upgrade for me, but I'll be waiting for benchmarks, since my main focus for work is raster performance, and those improvements haven't been too impressive these last few generations.

And let's see what prices look like in a few months when all the 3xxx card inventory has cleared. With no crypto boom around this time, I hope these prices will come down swiftly. I found some 3090s being offered for as low as 1,100 euros recently.
 

maflynn

macrumors Haswell
May 3, 2009
73,682
43,740
Did Nvidia or AMD even get to sell the components at much higher prices, or was it the card makers reaping the higher profit margins?
Yes, both in prices and product. For reasons that only make sense if you consider it a money grab, Nvidia these past two years re-released old legacy cards that were overpriced instead of trying to improve inventory of existing GPUs. Many times they stopped even publishing the MSRP, so they could just bump prices up and gouge the consumer (not that the average consumer actually had a chance of buying one).

IMO, they made zero attempt at stopping the scalpers and miners, simply because they were able to sell their products by the pallet for a premium. They would much rather sell four or five pallets of GPUs to miners than send them through the normal distribution channels.

Nvidia pays $5.5 million for allegedly hiding how many gaming GPUs were sold to crypto miners
NVIDIA Allegedly Sold $175 Million Worth of Ampere GeForce RTX 30 GPUs To Crypto Miners, Could Be A Contributing Factor Behind Immense Shortages
 

diamond.g

macrumors G4
Mar 20, 2007
11,405
2,638
OBX
I need Nvidia cards for work, so they're the only game in town for me anyway. I find the VRAM sizes a real letdown for anything below the 4090. As it is, only the 16 GB 4080 and up would be an upgrade for me, but I'll be waiting for benchmarks, since my main focus for work is raster performance, and those improvements haven't been too impressive these last few generations.

And let's see what prices look like in a few months when all the 3xxx card inventory has cleared. With no crypto boom around this time, I hope these prices will come down swiftly. I found some 3090s being offered for as low as 1,100 euros recently.
Why wouldn't you go the Ax000 route? Usually those cards have a larger framebuffer and more cores than their consumer counterparts.
 

th0masp

macrumors 6502a
Mar 16, 2015
848
514
Why wouldn't you go the Ax000 route? Usually those cards have a larger framebuffer and more cores than their consumer counterparts.
I work in video games. Historically, not all of the professional series stuff has been properly supported across all the tools. It's usually best for my use case to stick with the upper-end consumer series.
 

diamond.g

macrumors G4
Mar 20, 2007
11,405
2,638
OBX
I work in video games. Historically, not all of the professional series stuff has been properly supported across all the tools. It's usually best for my use case to stick with the upper-end consumer series.
Ah, in that case, 4090 or bust! It isn't like they come out with a new architecture every year. Besides, this time around the 4080 line is way more gimped than the 3080 line.
 