
Now that the new generation is out, are nVidia and AMD still manufacturing last gen video cards?

Foolish decision. A mistake of gigantic proportions. LoL :)
What you should have done is gone to the AMD CPU section of ocforums, where I was on 8 hours a day in 2003.

I would have told you to under no circumstances buy the 2800+.

Looking through my receipts, it seems I bought a Thoroughbred B 2100+ for $62; proof below. I remember the $49 sale.

The 2800+ had next to no overclocking headroom, whereas the CPU below, seven times cheaper, overclocked routinely to 2.3 GHz and beyond, EASILY bypassing the 2250 MHz frequency of the "top of the line" 2800+, the most expensive Thoroughbred B ever released. This was such a waste of money, MisterEd! :)
LoL

View attachment 360610
Worse than that. Look at my invoice. I don't have a screenshot of the original invoice; unfortunately, Newegg doesn't keep invoices more than ten years old. However, back then I copied the information from the invoice into a Word document. Here is a screenshot (with personal information removed) of that document.


You said "What you should have done is gone to the AMD CPU section of ocforums where I was on, 8 hours a day in 2003."

There are two problems with that.
1. I bought the Athlon XP 2800+ in 2002, the previous year. Since I was one of the first to receive it, there wasn't anyone around to say not to get it. Maybe if I had waited six months I wouldn't have bought it.

2. I didn't join this forum until 2004. By that time the XP 2800+ no longer mattered.

BTW, as you can see below, I also bought an ASUS A7N8X Deluxe motherboard. That was probably one of the first AMD dual-channel memory motherboards. A lot of people at the time had problems with dual channel, which is why I was overly cautious and paid so much for the Corsair XMS RAM; I had read it should work in dual channel. Note that dual-channel memory was so new that dual-channel memory kits did not exist yet. Until they did, some sellers were individually testing pairs of RAM so they could sell them together as matched pairs.

Invoice.jpg
 
People are often "4K is 4K", but no, my $200 LG 4K TV is not the same as my $2,000 LG OLED TV, there are additional features and settings to 4K that make the picture "better."

So the point was that under those extreme settings, those overkill, unnecessary settings, only one card can maybe perform, maybe. Then Janus67 said he believes that nVidia's 4090 card is the first card to consistently have over 60FPS under some of those higher settings. [That's why if Janus67 were in a boy band, he'd be "the smart one."]

Then EarthDog pulled a graph out that attempts to argue that 4K is 4K and that it has been around for generations, missing the point that 4K is not 4K... Only one card can (maybe) handle the extreme settings, maybe. The 4090. It's the first. We're not arguing whether it's worth it. It's just that if it costs two grand, it should be able to do it.
I'm still struggling to understand your viewpoint. Would I be correct in saying that because you have an expensive 4K display, you'd expect better output from the system driving it? I'm with EarthDog, I guess. 4K is 4K to me. I had a £200 4K monitor. I have a £1000 OLED TV. The picture is better on the TV for sure, but my demands of the hardware driving it don't change between them.

For the last couple of years chip manufacturing has been hampered by insufficient capacity compared to demand. I don't see that changing for at least five years. It doesn't make any difference what quantities AMD and Nvidia want to produce or what customers want to buy: until enough new fabs come online to meet manufacturing needs, supplies will remain tight.
Without digging them up again, I think it was around the end of last year that there were multiple reports of Apple, AMD, and Nvidia looking to reduce their production orders from TSMC as they were expecting the economic slowdown. I don't recall exactly who got what, but I recall AMD paid off TSMC and Nvidia got a delay to their allocated production but no reduction. TSMC's position was that if it could get someone else to fill the reduced capacity, the customer wouldn't be penalised as much; I didn't hear that that happened. If the demand were there, I doubt TSMC would take such a hard line.

The shortages in other areas are due to the mix of nodes available; I vaguely recall it is some of the older nodes that are constrained.

Of course this is only looking at the near term. Fab construction is a longer term optimisation.
 
MisterEd,

That Asus board was faster than most and had a PCI Lock.

Increasing the FSB changed the speed of the whole motherboard and everything connected to it unless the mobo had a PCI lock.
The following nForce2 boards had a confirmed PCI lock:
Epox, Abit, Asus, Chaintech, Soltek, MSI, Gigabyte.
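
To make the PCI-lock point concrete, here's a minimal sketch of the arithmetic; the /5 divider and ~33 MHz PCI spec used below are the usual nForce2-era values, stated here as assumptions:

```python
# Minimal sketch: why a PCI lock mattered when raising the FSB.
# Without a lock, the PCI clock was derived from the FSB through a fixed
# divider, so pushing the FSB also pushed PCI/AGP cards out of spec.

def pci_clock(fsb_mhz: float, divider: int) -> float:
    """PCI clock derived from the front-side bus through a fixed divider."""
    return fsb_mhz / divider

print(pci_clock(166, 5))  # ~33.2 MHz -- stock 166 FSB, in spec (~33 MHz)
print(pci_clock(220, 5))  # 44.0 MHz  -- 220 FSB with the same divider, way out of spec

# A board with a PCI lock simply holds PCI at ~33 MHz regardless of FSB,
# which is why the boards listed above were prized by overclockers.
```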

Asus was fine; however, the Soltek NV400-64 was a low-cost single-channel nForce2 board that outperformed the almost twice as expensive dual-channel ASUS A7N8X.


Heatsink: an all-copper Thermalright, either the $10-$15 SK-6+ or the $20-$30 SK-7.

But then svcompucycle.com had the best deal: the all-copper SK-7 for $15.99.

The best deal on RAM was TwinMOS PC3200 with Winbond CH-5 chips.

It allowed you to go to 220 FSB, which was more important than raw MHz.


You should have gone with the $36 Fortron for a power supply. Model was FSP350-60PN.

I mean, what were you thinking, man!?! :)
LoL
 
Sorry for the completely OT, but I forgot how insanely expensive DDR RAM was back in the day.
I never had any BH-5 or CH-5 sticks, but I still have a TCCD OCZ DDR400 kit & two New-BH OCZ DDR400 kits.
 
People are often "4K is 4K", but no, my $200 LG 4K TV is not the same as my $2,000 LG OLED TV, there are additional features and settings to 4K that make the picture "better."
But 4K is 4K to a video card is our point. It doesn't matter if it's OLED or Micro... it still has to render the exact same number of pixels. What each TV does to make the picture better is a function of the TV, not the graphics card. In-game settings (as you seem to be alluding to?) don't change that fact. So, performance on a $500 4K UHD TV will be the same as on a $5000 4K UHD TV.
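
Just to put numbers on the pixel-count point (plain arithmetic, nothing vendor-specific):

```python
# The GPU renders the same pixel grid no matter which 4K panel is attached.
width, height = 3840, 2160
pixels_per_frame = width * height              # 8,294,400 pixels
pixels_per_second = pixels_per_frame * 60      # at 60 fps

print(f"{pixels_per_frame:,} pixels/frame, {pixels_per_second:,} pixels/s at 60 fps")
# A $200 4K TV and a $2,000 4K OLED both hand the card exactly this workload;
# the picture-quality extras happen inside the display, after rendering.
```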

So much this...
The picture is better on the TV for sure, but my demands of the hardware driving it don't change between them.



So the point was that under those extreme settings, those overkill, unnecessary settings, only one card can maybe perform, maybe. Then Janus67 said he believes that nVidia's 4090 card is the first card to consistently have over 60FPS under some of those higher settings. [That's why if Janus67 were in a boy band, he'd be "the smart one."]
Those extreme settings. So, PAST Ultra's default settings and adding RT in the (few) titles that support it. Is that correct? I concede, under those circumstances, there is only one. A pellet from our shotgun blast hit!

But since we're adding features, can we use DLSS to bring those FPS right back up in those games? Same with 3090 Ti? Both of these would easily be 4K/60/U/RT cards.

Then EarthDog pulled a graph out that attempts to argue that 4K is 4K and that it has been around for generations, missing the point that 4K is not 4K...
But it is... Your definition of what makes 4K "4K" seems to be different from others' in the thread, so the ships passed in the night. But I think I understand you now...saying in-game settings make it different... ;).
 
The performance criterion today, imho, is how high the FPS are and how smoothly world data loads at 1440p, meaning on 2560x1440 monitors.
Less expensive cards handle that just fine.


What we were talking about is performance of more expensive cards on overkill 4K.
People are often "4K is 4K", but no, my $200 LG 4K TV is not the same as my $2,000 LG OLED TV, there are additional features and settings to 4K that make the picture "better."

So the point was that under those extreme settings, those overkill, unnecessary settings, only one card can maybe perform, maybe. Then Janus67 said he believes that nVidia's 4090 card is the first card to consistently have over 60FPS under some of those higher settings. [That's why if Janus67 were in a boy band, he'd be "the smart one."]

Then EarthDog pulled a graph out that attempts to argue that 4K is 4K and that it has been around for generations, missing the point that 4K is not 4K... Only one card can (maybe) handle the extreme settings, maybe. The 4090. It's the first. We're not arguing whether it's worth it. It's just that if it costs two grand, it should be able to do it.

Even though we are somewhat in agreement, you still are not listing the settings and criteria.

3840x2160 is (for all intents and purposes) 4K resolution. Whether your TV offers post-processing or G-Sync (like my C1 does), it doesn't impact the FPS you are displaying unless you are frame-limited by your monitor. The newer LG OLED screens offer a 120Hz refresh rate as well as variable refresh rate for G-Sync or VRR, but that is outside the conversation/not the point.

My RTX 3080 can play Battlefield 2042 at around 90-100 fps at 4K with settings pretty high and DLSS Balanced enabled, but without ray tracing turned on (why take any performance hit when playing a fast-paced multiplayer game?).
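
For reference, here's a rough sketch of what DLSS is doing under the hood at 4K; the per-axis scale factors below are the commonly cited ones and should be treated as approximate:

```python
# DLSS renders internally at a lower resolution and upscales to the output,
# which is where the FPS headroom comes from.
output = (3840, 2160)
scale = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}

for mode, s in scale.items():
    w, h = round(output[0] * s), round(output[1] * s)
    print(f"{mode}: renders ~{w}x{h}, then upscales to 3840x2160")

# Balanced at 4K therefore renders roughly 2227x1253 internally, which is a
# big part of why the card can hold 90-100 fps with otherwise high settings.
```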

But in your post you also stated that the baseline criterion is 1440p, which mid-range and higher cards have been able to do consistently for at least a few generations.
 
Even though we are somewhat in agreement, you still are not listing the settings and criteria.
The thread is more of a journey of discovery.
After starting it, I read that gamer's quote, which made me feel like I had eyes but was blind.
He said 'who cares about photo-realism... as long as the world data is loading and FPS are high.' He really made a good point that cheaper cards can easily accomplish this at 1440p, pointed to what really matters, and lowered the criteria to: what is the cheapest, best card that can do that today?


But I think I understand you now...saying in-game settings make it different... ;).

Continuing our journey of discovery, here's an educational story of why HDMI 2.1 is not HDMI 2.1, much like 4K is not 4K:

In the United States we have a chain of stores called Costco. You can't just walk in; you have to pay a yearly membership to shop there. I bought my receiver there: the Onkyo TX-NR6050.
Costco also had the Yamaha TSR-700. Direct competitors; they appear to be the same, and both are HDMI 2.1.

But they are NOT both real HDMI 2.1.

Yamaha TSR-700 has HDMI 2.1 bandwidth capped at 24gbps, which is half of HDMI 2.1's full bandwidth.


The PS5 currently tops out at 32gbps when displaying 4k/120 HDR 4:2:2
The Xbox Series X tops out at 40gbps when displaying 4k/120 HDR 4:4:4


The Yamaha is not able to handle the top bandwidth of these new consoles!
So if you have a fancy OLED and just bought a new console and want the best picture quality available, Yamaha TSR-700 will be a weak link and you'll be getting a (slightly) worse picture than what the new systems are capable of outputting.

To get the best picture you should be looking for a 40gbps receiver.

Onkyo TX-NR6050 is a 40gbps receiver, not capped at 24gbps like Yamaha.
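
Here's a back-of-the-envelope sketch of where those numbers come from. It counts active pixel data only; blanking intervals and HDMI 2.1's FRL link coding add overhead on top, which is what pushes a full 4K/120 HDR 4:4:4 signal up toward the 40 Gbps mark:

```python
# Raw active-pixel data rate for a given video format, in Gbps.
def raw_pixel_rate_gbps(width, height, fps, bits_per_channel, samples_per_pixel=3):
    return width * height * fps * bits_per_channel * samples_per_pixel / 1e9

print(raw_pixel_rate_gbps(3840, 2160, 120, 10))   # ~29.9 Gbps, 4K/120 10-bit 4:4:4
print(raw_pixel_rate_gbps(3840, 2160, 120, 12))   # ~35.8 Gbps, 4K/120 12-bit 4:4:4

# Add blanking and link-coding overhead and you land in the 32-40 Gbps range
# the consoles quote -- comfortably beyond a receiver whose HDMI 2.1 ports
# are capped at 24 Gbps, but within reach of a 40 Gbps port like the Onkyo's.
```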


But you would only care about this if you have a two thousand dollar OLED 4K TV.
If you have a $200 4K TV, none of this really applies.

Even if your two-thousand-dollar TV is from 2018 it doesn't apply, because the 2018 C8s had HDMI 2.0, not HDMI 2.1; so even though they can display a spectacularly beautiful picture, 2018 OLED TVs have no HDMI 2.1 inputs.


Back to the Costco story.

The Onkyo TX-NR6100 costs 60% more than the Onkyo TX-NR6050 [Costco exclusive model]: $800 instead of $500.

Onkyo says that the difference between the TX-NR6050 [Costco exclusive model] and the TX-NR6100 [available everywhere else] is THX certification and a 10-watt power difference, 90 watts vs. 100 watts.


But the certification is just a label on the TX-NR6100; both units are capable of it. Same components inside.
And in reality, the 10 W difference is completely arbitrary; the internal components are 100% identical. Yet the TX-NR6100 costs 60% more.

Likewise Yamaha TSR-700 [Costco exclusive model] is just a rebadged Yamaha RX-V6A [available everywhere else], which costs tons of $ more. Same components inside.
 
None of that has to do with the video card and FPS, though.

Those are image-quality changes that depend on hardware external to the video card: the cable, the receiver (if you're using one), and the TV's limitations. Given a 4K screen that offers a high enough refresh rate (or just disable vsync and accept screen tearing above 60, for example), the benchmark results for a given card should be the same/within margin of error.

Now, will the HDR pop as much on a cheap LCD vs an OLED? No. Will colors be as accurate on a cheaper receiver? Not necessarily. But that doesn't change the FPS benchmark results between a 2080ti, 3090, 4090 assuming they have the port to offer 4K resolution and high enough fps/refresh rate.
 
I deleted a bunch of stuff because I didn't see the second page.

Okay, all of the old hardware stuff is super interesting and a blast from the past. Do you have all that from memory, or just super detailed notes? Or from perusing old threads? I believe those Athlon XPs could be multiplier-unlocked with a graphite pencil, or maybe that was only the 1000 series. I got in a year or two later with the Barton-core 2500+, and the Abit NF7-S let you run a higher FSB and use the system divider specified for the faster 3200+ (IIRC), so you could FSB overclock without any of the problems for everything else.
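
For anyone who wasn't there, the arithmetic behind that Barton trick looked roughly like this; the stock multiplier and FSB figures are from memory, so treat them as approximate:

```python
# Same 11x multiplier, higher FSB: a 2500+ at 3200+ speeds.
multiplier = 11
stock_fsb, oc_fsb = 166, 200   # MHz (double-pumped to "333"/"400" effective)

print(multiplier * stock_fsb)  # ~1826 -> sold as the Athlon XP 2500+ (~1833 MHz actual)
print(multiplier * oc_fsb)     # 2200  -> the Athlon XP 3200+'s stock speed

# With the NF7-S holding PCI/AGP at spec while the FSB climbed,
# the rest of the system never noticed the difference.
```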

Back on the topic at hand. I said earlier I wasn't aware of a technology between the GPU and the panel and you rightly pointed out that the cables do matter, as do the connectors being used, but I don't believe that pertains to the choice of GPU.

From the nVidia website, this is the output of the 4090
Up to 4K 12-bit HDR at 240Hz with DP 1.4a + DSC or HDMI 2.1a + DSC. Up to 8K 12-bit HDR at 60Hz with DP 1.4a + DSC or HDMI 2.1a + DSC.

and the 3080Ti
Up to 4k 12-bit HDR at 240Hz with DP1.4a+DSC. Up to 8k 12-bit HDR at 60Hz with DP 1.4a+DSC or HDMI2.1+DSC. With dual DP1.4a+DSC, up to 8K HDR at 120Hz

aaaand the 3060Ti
1 - Up to 4k 12-bit HDR at 240Hz with DP1.4a+DSC. Up to 8k 12-bit HDR at 60Hz with DP 1.4a+DSC or HDMI2.1+DSC. With dual DP1.4a+DSC, up to 8K HDR at 120Hz

Since the thread was really about the last two generations, I didn't look at the 2000 series. It also wasn't nVidia vs AMD, and most of the attention has been on nVidia. There are also other specifications listed, but I chose the highest level (DisplayPort); if you must use HDMI, that is what I would consider a niche use case. So, not a really thorough explanation, but it doesn't seem to matter here what GPU you choose: whether it's 40 or 30 series, whether it's a 90 or a 60, they all give you the full spec! Sure, it might matter for AV equipment, but that's a bit of a non sequitur to throw in on page 2 that you were referring to AV equipment the whole time.
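
As a rough sketch of why DSC appears in every one of those spec lines: the DP 1.4a lane rates and 8b/10b overhead below are the standard figures, and the ~3:1 DSC ratio is approximate.

```python
# Uncompressed 4K 240 Hz 12-bit pixel data vs. what a DP 1.4a link can carry.
def raw_gbps(w, h, hz, bpc, samples=3):
    return w * h * hz * bpc * samples / 1e9

uncompressed = raw_gbps(3840, 2160, 240, 12)   # ~71.7 Gbps of pixel data
dp14a_payload = 4 * 8.1 * 8 / 10               # ~25.9 Gbps: 4 lanes of HBR3 after 8b/10b

print(uncompressed, dp14a_payload)
# DSC's roughly 3:1 visually lossless compression closes that gap, which is
# why a 3060 Ti quotes the same "4K 240Hz" ceiling as a 4090: the limit is
# the link spec plus DSC, not the GPU tier.
```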

Regarding the topic of game settings, I would argue (and this is based not on experience but opinion, so if you have done the comparison, I think most of the posters would have preferred you come out with it rather than this "journey" of a thread) that a higher-quality panel will enhance the appearance of a game at all but the most stripped-down settings, rather than, as you seem to imply, reveal the flaws of any but the highest settings. I could be wrong though. Your textures and such definitely need enough resolution to go with 4K, but I don't think a worse panel will make them look better than a nice panel would.
 
It's just common sense, cause and effect: who, by definition, would crank the in-game settings so high that only one card, nVidia's current flagship 4090, can handle them, and even then while struggling?

Owners of high end displays.
 
That's the goal for most: to use the ultra settings (or better) and to match/beat your desired refresh rate. There are also competitive gamers, and competitive games, where people want the advantage that more fps and lower settings bring. So it depends on the user and the game, but outside of esports the ultimate goal has always been to use ultra settings. You go lower if you have to (to increase fps) or have a need to (esports/comp).

1080p or 4K OLED, the vast majority of gamers want it to look as good as it can, monitor type be damned. Refer back to janus' post. It just depends on the user and game.
 
Look, what do you see in this picture, what's the difference? These are both multi-thousand-dollar 4K TVs:


C9vsC8.jpg

However, C9 has HDMI 2.1 and C8 HDMI 2.0.

HDMI 2.1 represents a huge leap over HDMI 2.0, with 48Gbps in bandwidth compared to 18Gbps.

If you own a C8, what does it matter to you that

The PS5 currently tops out at 32gbps when displaying 4k/120 HDR 4:2:2
The Xbox Series X tops out at 40gbps when displaying 4k/120 HDR 4:4:4

C8 is not able to take advantage of these settings!
Even if you run your Xbox on a C9, you better not use that Yamaha HDMI 2.1 receiver, because Yamaha had the HDMI 2.1 bandwidth capped at 24gbps.

So you have to be using a high-end display and a high-end receiver to take advantage of 40gbps when displaying 4K/120 HDR 4:4:4.
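
And the same arithmetic explains the C8 vs. C9 gap; the 18 Gbps TMDS limit and 8b/10b overhead used below are HDMI 2.0's standard figures:

```python
# HDMI 2.0 tops out at 18 Gbps of TMDS bandwidth; 8b/10b coding leaves
# roughly 14.4 Gbps for actual video data.
hdmi20_usable_gbps = 18 * 8 / 10

# 4K/120 needs ~23.9 Gbps of raw pixel data even at plain 8-bit 4:4:4:
needed_gbps = 3840 * 2160 * 120 * 8 * 3 / 1e9

print(hdmi20_usable_gbps, needed_gbps)
# So a 2018 C8 simply has no input that can accept the consoles' 4K/120 modes,
# no matter how good its panel is -- that only arrived with HDMI 2.1 on the C9.
```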


As for video cards, why would you even mess with those high end in-game settings unless you have the equipment that can display what the card is putting out?
 
As for video cards, why would you even mess with those high end in-game settings unless you have the equipment that can display what the card is putting out?
Sorry, but what output, specifically, are you talking about here??? What high-end in-game settings, specifically, are you talking about???

Refer to janus' previous post (last paragraph)...
 
What current settings is he talking about at 37:19 when he says:

"Even the 4090 was struggling just to hit that 60 FPS mark without upscaling or turning down settings."


 
What current settings is he talking about at 37:19 when he says:
You don't know, but are huffing and puffing?! It's literally only RT... which is currently only available in a small percentage of titles (but still dozens, with more quarterly). Like was also mentioned previously, you can enable DLSS or FidelityFX and gain FPS back with a typically negligible (read: hardly noticeable, especially in fast action) loss of IQ. RT is an option above and beyond 'ultra' settings.

I again refer back to one of janus' previous posts where he points out why you would want to use that in certain titles that lend themselves to being 'better' with higher FPS. Full circle: it depends on the user and title as to what settings they are going to use. Most outside of esports are looking to get the highest IQ they can while matching/beating their refresh rate.

The point here is there's no magic setting you can't run. A 1080p potato shows off RT just as a 4k oled does. Again, IQ is different because of the panel/TV quality, but they can all run/display everything/the effect the developer intended.
 
I hate to ask this here as it's off topic.

"Even if you run your Xbox on a C9, you better not use that Yamaha HDMI 2.1 receiver,"

What if the receiver is using the ARC output from the TV?
 
For those gaming systems to output max settings, yes, you can always plug them directly into the C9 and use eARC to send uncompressed sound to the receiver.

Yes, eARC eliminates the receiver as a bottleneck.


Kind of like what we're doing here, those discussions always branch off to "I seriously doubt anyone will enjoy a video game more because it is running through a 40 gbps instead of a 24 gbps HDMI port."
As if that's the point.
 
When I was reading the Yamaha thread, some guy complained about Yamaha: "whenever I drop that cash the damn thing needs to work."
Another gamer used the words "useless for gaming" to describe the current Yamaha.
But the current-generation Onkyo HDMI 2.1 does work fully; it's just that Yamaha's does not.

When Yamaha's does not, some people offer explanations like "I seriously doubt anyone will enjoy a video game more because it is running through a 40 gbps instead of a 24 gbps HDMI port."
They don't care so they think nobody else should care either.
 