
Now that the new generation is out, are nVidia and AMD still manufacturing last gen video cards?


c627627

c(n*199780) Senior Member
Joined
Feb 18, 2002
So has production of AMD 6xxx models of video cards stopped now that 7xxx models are out?

Has production of nVidia 3xxx models of video cards stopped now that 4xxx models are out?


Just trying to figure out where this pricing game they're both playing is heading long term....
 
From your last thread on this subject and CPUs...I'll say it again... if you're ready to buy, buy. Prices will only see a notable drop once the mid-life cycle upgrade cards come out, really. No idea when that is as both AMD and NV are still filling out the lineup with the budget-focused cards.

They resold 16xx series cards late last year... so it depends. I don't know if PRODUCTION stopped on any of these or if they are running on leftovers... I'd guess the latter unless there's a special run for a 'reason' (like the 1600 series cards... they were the most popular on Steam).



EDIT: There are things like this around - https://www.pcmag.com/news/nvidia-mulls-continuing-rtx-3000-sales-even-after-next-gen-gpus-launch


... but that was a precursor to the previous discussion and the Q3 we saw last year. Going by the wording of that article, it seems typical for production/sales to stop, but I'm not certain they have. That's also nearly a year ago. I really don't see next gen cards bringing a significant drop/market correction either. It's a 'try to let go of your ankles' situation, but... it feels like our hands are glued unless, like you said, you don't buy or you buy 2-gen-old cards.
 
Last edited:
Do we know whether the high end cards that are supposedly selling out are doing so because of low supply?
That would be market manipulation, rather than them doubling in size as companies due to massive profits.


The scary part is that the 2024 generation, the next generation, will be even faster and not by a little, and the fear is it will accordingly be priced even higher.
So rather than being replacements, the new generation of high end cards basically adds a new category - the super expensive highest tier of cards, a category that did not exist before: 4K cards.


I was away from computer builds for a long time, and still am, but I've just begun to look at comparison reviews, and things come down to performance at the highest-resolution OLED TV settings.
Not even the most expensive cards can do the highest settings today. Even AMD 7900 XTX can't really do 4K. The first card that can even be considered for that is the nVidia 4090, and it is a two thousand dollar card today (!).

I have started to learn about the models, and only the AMD 6800 XT makes sense; it can be had for around $500 nowadays when caught on a super sale. But it is a 1440p card, not one for OLEDs. Anything higher does not make sense. None whatsoever. Price-wise or performance-wise.

It was 20 years ago when this happened:

We could overclock these to make computers finally both affordable and usable. Computers were s-l-o-w before that. They finally became usable twenty years ago.
That has not happened yet for video cards and 4K OLED TV resolution.


So for me it is just not time to buy yet. I am sitting 2023 out.
2024 Cards will be the first to really do 4K OLED properly.
But at what cost... :(
 
Last edited:
There is no problem with availability, but prices stopped dropping a while ago. RTX 3000 is still manufactured, and even the GTX 1650/1660 is back in new batches (Biostar had news about it last month).
RTX 3000 is not selling well anymore, RTX 4000 is selling pretty badly due to its high price, and most gamers didn't want to wait any longer, so they bought last gen GPUs after the price dropped earlier last year (back then the RTX 3080/3090 were selling the best, as I remember). The second-hand market filled the gaps with ridiculously low prices on auctions, but good luck with cards that have been through mining.
AMD looks even worse with the new series. I don't know if they still manufacture the RX 6000 as I can't see them available in most stores like some months ago, but the RX 7000 has already started to collect dust in most stores. It's because of the high price, and the news about overheating and problems with reference coolers didn't help either. It doesn't mean that these GPUs have real problems, but you know how "news" spreads on the web.
 
Somebody posted this and it sums up how gamers feel:

"If you have a 3070 or better, you've got everything you need. Frankly, I'm not sure there's going to be any really good reason to upgrade for a very long time.
For most of us 4K is overkill and demands a costly CPU and ample RAM to keep from bottlenecking your graphics. QHD (2560x1440) is more than adequate for most of us and your mid-range chip will handle it just fine, plus there are a lot of really nice, affordable monitors in that resolution. The vast majority of gamers are way more interested in FPS and smooth loading of world data than near-photorealism."



With the AMD RX 6800 XT beating the NVIDIA GeForce RTX 3070 Ti, that really makes the 6800 XT a good choice today.
 
Last edited:
I agree with most of that sentiment in quotes (4K needs LESS of a processor than 1080p does). Unless you're into high hz/fps gaming or want the latest RTX tech, midrange last gen is plenty for 1080p 60/120 at least. The hardware far outpaces the games at this point (and has for the last few gens, really).

That said, 4k/60 today doesn't turn into 4k/120 next year or beyond. Games ARE getting more difficult to run so 4k/60 now may be 4k/30 on some of the latest titles...


Even AMD 7900 XTX can't really do 4K.
2024 Cards will be the first to really do 4K OLED
What do you mean by this? Or the second thing...OLED or LCD, 4k is 4k...

There have been 4k/60+/Ultra cards out for three generations now.
 
Last edited:
Agreed with your last sentence there, Joe. 4K/60 has been available. As long as you don't plan to use ray tracing. I believe the 4090 is the first card to consistently have over 60fps with 4k ultra + RT
 
If I needed a high-end card today I would be looking at the RX 6950 XT. The original msrp of the RX 6800 XT was $649 (not that you could get one at that price in 21 or most of 22). The original msrp of the RX 6950 XT was $1100, but the reference model is currently available and selling for $699. That seems to be a pretty good deal for the level of performance. Some of the AIB OC versions are also out there for under $800 too.
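For what it's worth, here's a quick back-of-the-envelope check on how far those current prices sit below launch MSRP, using only the numbers quoted above; it's just a sketch, and the "under $800" AIB price is rounded to $800 for the estimate:

[CODE]
# Rough sketch of how far the RX 6950 XT prices above sit below launch MSRP.
msrp = 1100                                   # launch MSRP quoted above
street = {"reference": 699, "AIB OC": 800}    # "under $800" treated as $800 for the estimate

for model, price in street.items():
    pct_off = (msrp - price) / msrp * 100
    print(f"RX 6950 XT {model}: ${price} -> about {pct_off:.0f}% below the ${msrp} MSRP")
[/CODE]

That works out to roughly 36% below MSRP for the reference card and about 27% for the AIB OC versions.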
 
What performance criteria are being used here? I have a 3070, and I'd consider that an entry-level 4k60+ GPU. By that I mean, I don't expect max settings. "High" is usually doable, and the only trouble I have is with some older games that lack newer features to help improve performance. I never got one myself, but I think a 3080 would be fine for 4k60+ "high".

If RT is at all a consideration, be wary of AMD performance, or lack thereof. If I were to upgrade from my 3070 today, I'd be looking at the 4070Ti or 7900XTX as the realistic options.
 
What do you mean by this? Or the second thing...OLED or LCD, 4k is 4k...

There have been 4k/60+/Ultra cards out for three generations now.
First of all this quote has provided an excellent counter-argument against me, and I now fully understand that this is reality, and not what I thought before I read it:

"If you have a 3070 or better, you've got everything you need. Frankly, I'm not sure there's going to be any really good reason to upgrade for a very long time.
For most of us 4K is overkill and demands a costly CPU and ample RAM to keep from bottlenecking your graphics. QHD (2560x1440) is more than adequate for most of us and your mid-range chip will handle it just fine, plus there are a lot of really nice, affordable monitors in that resolution. The vast majority of gamers are way more interested in FPS and smooth loading of world data than near-photorealism."



Having said that, 4K is absolutely not 4K.
LG-made OLED TVs are the most expensive, best, finest display technology readily available to us.
Sony OLED is just licensed LG with better but unnecessary speakers - unnecessary because you would want a full seven-speaker set connected to your OLED; why would you even think about using the internal TV speakers....

microLED TVs are available for like eighty thousand dollars today but later in this decade microLED will replace OLED as the display champ.

There is only one card that can even be talked about as approaching the ability to handle real 4K, but it costs two thousand dollars.
This one ↓
I believe the 4090 is the first card to consistently have over 60fps with 4k ultra + RT

If we are forking over north of half a grand, then the cards had better be able to do real 4K.
Next year, AMD 8xxx models should. Here's hoping.

Remember when computers cost six grand [of those dollars]? That's a reminder of where we are with video cards.
Then $49 Athlon XP Thoroughbred B came out.
Then Mobile Athlon XP Barton came out which we used on desktops too.

Nothing like that has happened yet with the affordability of video cards.
But if we are paying more than five hundred, they better do real 4K.
Only one barely can today.
 
What?

4k is a resolution. If you're implying that ultra settings and RT are required to enjoy a higher end OLED or microLED panel, that's subjective and opinion. It's fine to have that opinion, but it would behoove you to clarify. I am not aware of any technology in between the graphics card and the panel that makes the 4k output from a 4090 more or less suited to those panels than the 4k output from any other card.

If you're implying that DLSS is not real 4k, well, that's an argument, but it has nothing to do with the panel. It's more of a quality setting. Most of these less-than-4090 cards can do 4k high, or 4k ultra with RT and DLSS on. Of course AMD struggles with both DLSS and RT, but really, are they that big of a deal? Forking over half a grand is mid range GPU territory; those cards aren't suited to any 4k, much less "real 4k", whatever that means.

Also, a $130 Ryzen 5600G CPU can play games at 720p with integrated graphics, so what do you mean we're not there on GPUs? The expectation of FPS and resolution keeps going up. Getting the job done simply is cheaper than ever, but now everyone wants more and better. My *$90* Athlon XP ran 1280x1024 at 20 fps (with a GPU, of course).

If you want it cheaper then don't buy it. Yeah nVidia is famous for profiteering. AMD isn't some faithful underdog, they'll hike a price when their product supports it. Intel has skin in the game too, but at the end of the day, yes competition will help but all three companies are looking for profit.
 
If I needed a high-end card today I would be looking at the RX 6950 XT. The original msrp of the RX 6800 XT was $649 (not that you could get one at that price in 21 or most of 22). The original msrp of the RX 6950 XT was $1100, but the reference model is currently available and selling for $699. That seems to be a pretty good deal for the level of performance. Some of the AIB OC versions are also out there for under $800 too.
Yeah but
6800XT sale is around $500
6950XT sale is around $700

40% more cost for the 6950 XT but not as much performance gain?

Then it's that web of paying north of $700 and realizing the 7900 XTX should really be at that price, but it's not - so you look at the 7900 XT, which on sale is only a hundred more than the 6950 XT and is the latest generation; then you realize how much worse the 7900 XT is than the 7900 XTX, and you realize you're going head first toward spending a thousand dollars out of pocket, after tax... on a video card.
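Here's a minimal sketch of that 6800 XT vs 6950 XT trade-off at the sale prices above; the ~12% performance gap is an assumed illustrative figure, not a benchmarked number:

[CODE]
# Back-of-the-envelope cost-per-performance check for the sale prices above.
# NOTE: the 12% performance gap is an assumed illustrative figure, not a benchmark.
price_6800xt, price_6950xt = 500, 700   # sale prices quoted above
perf_6800xt, perf_6950xt = 1.00, 1.12   # relative 4K performance (assumed)

extra_cost = (price_6950xt / price_6800xt - 1) * 100   # ~40% more money
extra_perf = (perf_6950xt / perf_6800xt - 1) * 100     # ~12% more performance (assumed)

print(f"6950 XT costs {extra_cost:.0f}% more for about {extra_perf:.0f}% more performance")
print(f"$ per unit of performance: 6800 XT = {price_6800xt / perf_6800xt:.0f}, "
      f"6950 XT = {price_6950xt / perf_6950xt:.0f}")
[/CODE]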


And that's when sanity kicks in and you get the 6800XT for five hundred, which is a card that is really better than the card marked as "sufficient" in that gamer's quote, the nVidia 3070.

And so with AMD 6800XT being better than nVidia 3070.... That's why 6800XT.
Post magically merged:


This true ultimate comparison details why the RTX 4090 is the only one, but it will be eclipsed next year by cards that will significantly outperform nVidia's two-thousand-dollar flagship of today.

Anyway, proof:

EDIT: Forward to 36 minutes 40 seconds.

And at 37:19:
"Even the 4090 was struggling just to hit that 60 FPS mark without upscaling or turning down settings."

 
Last edited:
Here are the 8 cheapest RX 6800 XT cards currently in stock on Newegg... If you can find the $500 card, yes, it's the better deal.

[Attached screenshot: 8InStock.png]
 
You are correct, if buying today.

To get these expensive cards for less, you've got to sign up for one of those Hot Deal Forums alerts, then pull the trigger immediately after getting the alert about where it's on temporary sale. They will be gone quickly if you wait.
You have a low chance of just randomly stumbling onto a deal like that.
There may be rebates involved, I remember seeing newegg deal on the 6800XT had a rebate.

So - not for everyone.
 
Last edited:
Anyway, proof:

EDIT: Forward to 36 minutes 40 seconds.
[Attached chart: average-fps-3840-2160.png - average FPS at 3840x2160]

This is 4k ultra settings across 20+ titles. I don't believe it includes RT... but that's an option available in only a few titles. Many would say the differences in a lot of titles don't jump out at you (agreed). Most won't, and shouldn't, blindly crank RT up... along those lines, there's also DLSS you can use to speed things up, which goes hand in hand with RT... again you'll have to squint to see a difference in IQ. But if you're just blindly cranking RT in those few titles with no regard for scaling and worth... yes, a title or two falls short.

The bottom line is, 4k/60/Ultra has been around for generations... especially without RT. There's always a GPU killer (Cyberpunk?) that's an exception, lol. Over half of these titles are 4k/120/Ultra.

Not sure what you're saying about 'real' 4k. OLED/micro/CRT... it doesn't matter... a GPU still has to render 3840x2160 pixels. Ray tracing/DLSS is technology in the graphics card and the game that ANY 4k monitor will display... OLED/micro, irrelevant. I'm afraid you're barking up the wrong tree, or we're not understanding what you're saying.
 
Last edited:
Even with the additional posts I'm still unclear what performance criteria are being used. Absent that, it just sounds like complaining that new GPUs are expensive, which isn't something we can really help with. "Next year" cards might appear towards the end of 2024 based on recent cycle lengths. So a next generation that might be out in nearly 2 years is faster than the ones just released? Predicting that must have been difficult. If you need something now, get something now. Don't expect the options to get much better. Just be clear what your performance expectations are. BTW, still generally happy using a 3070 with my LG OLED B9. I would like faster, but not enough to pay for it yet.

Forgot to reply to an earlier point on supply vs demand. The situation now is very different than during the previous shortage. It is difficult to scale production up to meet greater-than-expected demand. However, this economic slowdown was more easily predictable, and they are trying to right-size production to satisfy the market. They won't intentionally over-produce and cause a price crash, but they don't want to be too short either. People can't buy what isn't there, and the competition could get the sale. About the only hope for reduced pricing is if AMD decides to push for market share, but they don't have the product or the pricing to drive that. Feels like they're happy making more profit on lower consumer GPU volumes and putting their resources into other, more profitable areas.
 
The performance criteria today, imho, are how high the FPS is and how smooth the loading of world data is at 1440p.
Meaning on 2560x1440 monitors.
It is just fine using less expensive cards.


What we were talking about is the performance of the more expensive cards at overkill 4K.
People often say "4K is 4K", but no, my $200 LG 4K TV is not the same as my $2,000 LG OLED TV; there are additional features and settings beyond 4K that make the picture "better."

So the point was that under those extreme settings, those overkill, unnecessary settings, only one card can maybe perform, maybe. Then Janus67 said he believes that nVidia's 4090 card is the first card to consistently have over 60FPS under some of those higher settings. [That's why if Janus67 were in a boy band, he'd be "the smart one."]

Then EarthDog pulled a graph out that attempts to argue that 4K is 4K and that it has been around for generations, missing the point that 4K is not 4K... Only one card can (maybe) handle the extreme settings, maybe. The 4090. It's the first. We're not arguing whether it's worth it. It's just that if it costs two grand, it should be able to do it.
 
Remember when computers cost six grand [of those dollars]? That's a reminder of where we are with video cards.
Then $49 Athlon XP Thoroughbred B came out.
Then Mobile Athlon XP Barton came out which we used on desktops too.
I don't believe AMD Athlon XP Thoroughbred B CPUs were that cheap when they first came out. However, they were a couple years later. I bought an AMD Athlon XP 2800+ Thoroughbred B CPU at launch in 2002 and paid $400 for it.
Forgot to reply to an earlier point on supply vs demand. The situation now is very different than during the previous shortage. It is difficult to scale production up to meet greater-than-expected demand. However, this economic slowdown was more easily predictable, and they are trying to right-size production to satisfy the market. They won't intentionally over-produce and cause a price crash, but they don't want to be too short either. People can't buy what isn't there, and the competition could get the sale. About the only hope for reduced pricing is if AMD decides to push for market share, but they don't have the product or the pricing to drive that. Feels like they're happy making more profit on lower consumer GPU volumes and putting their resources into other, more profitable areas.
For the last couple of years chip manufacturing has been hampered by insufficient capacity compared to demand. I don't see that changing for at least five years. It doesn't make any difference what quantities AMD and Nvidia want to produce or what customers want to buy. Until enough new fabs come online to meet manufacturing needs, supplies will remain tight.
 
I bought an AMD Athlon XP 2800+ Thoroughbred B CPU at launch in 2002 and paid $400 for it.
Foolish decision. A mistake of gigantic proportions. LoL :)
What you should have done is come to the AMD CPU section of ocforums, where I was 8 hours a day in 2003.

I would have told you to under no circumstances buy the 2800+.

I am looking through my receipts; it seems I bought a Thoroughbred B 2100+ for $62, proof below. I remember the $49 sale.

The 2800+ had next to no overclocking headroom, whereas the CPU below, seven times cheaper, routinely overclocked to 2.3 GHz and beyond, EASILY bypassing the 2250 MHz frequency of the "top of the line" 2800+, the most expensive Thoroughbred B ever released. It had no overclocking headroom. This was such a waste of money, MisterEd! :)
LoL

[Attached image: CPU Athlon XP 2100+ Thoroughbred B.jpg]
 
Last edited:
Originally posted by c627627, 20 years ago
Overclockers exploit the manufacturing process, knowing the goal of the manufacturer was to have all CPUs be capable of running as fast as the line's fastest processor, plus additional headroom. Only those CPUs that pass rigorous tests at default voltages are given labels toward the end of the line. Others are labeled not just according to tests, but according to marketing plans.

So as you can see:

Thoroughbred Bs scale to 2800+.

We noticed that the greatest overclockers were the 1700+ and 2100+ T-Bred Bs, capable of reaching those end-of-the-line speeds of the 2800+ "plus additional headroom."

That's why it's best to buy those CPUs and not higher-labeled CPUs, since $ is another factor in overclocking.

...and that's why a 1700+ T-Bred B should scale to not far below any 2x00+ T-Bred B using the same equipment.
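For anyone who wasn't around back then, here is a minimal sketch of that math using the clocks mentioned in this thread (2250 MHz for the 2800+, ~2.3 GHz overclocks for the 2100+) and the prices above; the 1733 MHz stock clock of the 2100+ is from memory, so treat the exact figures as illustrative:

[CODE]
# Sketch of why the cheap T-Bred B was the better buy: headroom and MHz per dollar.
chips = {
    "Athlon XP 2800+ T-Bred B": {"price": 400, "stock_mhz": 2250, "oc_mhz": 2250},  # "next to no headroom"
    "Athlon XP 2100+ T-Bred B": {"price": 62,  "stock_mhz": 1733, "oc_mhz": 2300},  # routine 2.3 GHz overclocks
}

for name, c in chips.items():
    headroom = (c["oc_mhz"] / c["stock_mhz"] - 1) * 100
    mhz_per_dollar = c["oc_mhz"] / c["price"]
    print(f"{name}: {headroom:.0f}% headroom, {mhz_per_dollar:.1f} MHz per dollar")
[/CODE]

That comes out to roughly 33% headroom and about 37 MHz per dollar for the 2100+, versus essentially no headroom and under 6 MHz per dollar for the 2800+.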
 