
Now that the new generation is out, are nVidia and AMD still manufacturing last gen video cards?

First of all, do those consoles actually output that much bandwidth, or do they just support the spec? We're simultaneously having a discussion about why "no card can hold a steady 4K 60Hz/fps" (HDMI 2.0) and about a certain brand of AV equipment with limited bandwidth and an inadequate 4K 120Hz/fps (HDMI 2.1) implementation. If the first claim is true, the second doesn't matter. Unless you're trying to imply the consoles outperform PCs based only on output spec. Remember, all cards since the 30 series at least support the same output spec even when they can't hit those frame rates. To my knowledge, the amount of eye candy rendered into a frame doesn't impact its size (i.e., its bandwidth) as it's transmitted to the display; just the resolution and refresh rate matter at that point.
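To put rough numbers on that last point, here's a minimal back-of-envelope sketch (my own illustration, not from any spec document). It only counts active pixels, so it ignores blanking intervals and HDMI link-layer overhead, which is why the real HDMI 2.0 and 2.1 limits (18 and 48 Gbps) sit above these figures. The takeaway: nothing about in-game settings appears in the formula, only resolution, refresh rate, and bit depth.

```python
# Rough uncompressed video payload (active pixels only, full RGB / 4:4:4).
# Ignores blanking intervals and HDMI link-layer overhead, so the real cable
# requirement is somewhat higher -- the point is what the number depends on:
# resolution, refresh rate, and bit depth. In-game settings don't appear here.

def payload_gbps(width, height, refresh_hz, bits_per_component):
    bits_per_pixel = bits_per_component * 3  # three components per pixel (R, G, B)
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"4K 60Hz,  8-bit: ~{payload_gbps(3840, 2160, 60, 8):.1f} Gbps")     # ~11.9 Gbps, within HDMI 2.0's 18 Gbps
print(f"4K 120Hz, 10-bit: ~{payload_gbps(3840, 2160, 120, 10):.1f} Gbps")  # ~29.9 Gbps, HDMI 2.1 territory
```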

Honestly, this whole thread just feels like a string of non sequiturs for the sake of being argumentative. You never really address a point or question; you just say "now you get it" if someone does the work of making your point for you. Otherwise you bring up new things that are tangentially relevant at best. We started at video cards being too expensive and now we're talking about AV receivers fibbing about HDMI standards.
 
The PS5 currently tops out at 32 Gbps when displaying 4K/120 HDR 4:2:2
The Xbox Series X tops out at 40 Gbps when displaying 4K/120 HDR 4:4:4
The typical graphically demanding PS5/XSX game struggles to hit 60 fps in quality mode; many run closer to 30 fps. You have to drop down to performance mode for 60 fps. The only way you might see 120 fps out of current-gen consoles is by running basic games that aren't particularly graphically demanding.
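As a rough sanity check on the 32 vs 40 Gbps figures above (again my own sketch, assuming 4K/120 at 10 bits per component): the gap comes largely from chroma subsampling. 4:4:4 carries three samples per pixel while 4:2:2 averages two, so the raw payload drops by a third. The consoles' quoted numbers are the negotiated HDMI link rates, which also have to cover blanking intervals and protocol overhead, so they sit above these raw active-pixel figures.

```python
# Rough active-pixel payload at 4K 120Hz with 10 bits per component,
# comparing full chroma (4:4:4) against horizontally subsampled chroma (4:2:2).
# The consoles' quoted 40/32 Gbps are HDMI link rates, which also carry
# blanking and protocol overhead, so they sit above these raw figures.

WIDTH, HEIGHT, REFRESH_HZ, BITS_PER_COMPONENT = 3840, 2160, 120, 10

def raw_gbps(samples_per_pixel):
    # 4:4:4 = 3 samples per pixel (Y, Cb, Cr); 4:2:2 averages 2 (chroma halved horizontally)
    return WIDTH * HEIGHT * REFRESH_HZ * BITS_PER_COMPONENT * samples_per_pixel / 1e9

print(f"4:4:4: ~{raw_gbps(3.0):.1f} Gbps raw payload")  # ~29.9 Gbps
print(f"4:2:2: ~{raw_gbps(2.0):.1f} Gbps raw payload")  # ~19.9 Gbps
```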

As for video cards, why would you even mess with those high end in-game settings unless you have the equipment that can display what the card is putting out?
You could argue any half-reasonable GPU can put out high-end graphics. The difference is how fast. That's the balance that needs to be struck, through hardware or wallet.
 
As for video cards, why would you even mess with those high end in-game settings unless you have the equipment that can display what the card is putting out?
As far as I've seen, most people think that the highest settings are how the game is supposed to be played; they have no idea that those settings are way over what is intended. Normal/Average (or High if you have performance to spare) is what you're supposed to use, depending on the game. RT is a separate setting that, again, you add on top of the other settings to make the game "prettier". One good example is World of Warcraft: normal/average is 5, high settings is 7, and anything over that was made for (and I'm quoting the devs) "very high-end systems or SLI", but everyone simply cranks it to 10 and then complains about low FPS.
 
As far as I've seen, most people think that the highest settings are how the game is supposed to be played; they have no idea that those settings are way over what is intended.
I have to disagree here. Anything ABOVE default Ultra settings is what I would consider 'over what's intended'. But Ultra (or w/e they name the highest default setting) is what the devs intended. They have lower settings for systems that can't run adequate FPS at the given canned setting. Some have a "crazy" setting above ultra that I also feel fits in the 'over what's intended' category.

If you take away money (and esports people/titles), the goal is to have the game look its best while achieving w/e adequate FPS is for that user. That's the game people play, right? The only users who intentionally lower settings do so because they have to... either to get more FPS or to have a competitive advantage. Are there people who lower settings for other reasons? Like, why would I lower IQ settings if I'm getting the FPS I want/need? If so, I don't understand that thinking/logic.
 
I think both sides have a point. In some titles I have turned down the settings and hardly noticed a difference; in other titles certain settings seem to make a huge difference. I think there is definitely stuff that gets put in there because the GPU manufacturer wants to show off their new features (RT is the most obvious example, but I think it happens with settings that don't garner as much attention). If you're a dev, you're going to be pressured by your partnership with a GPU manufacturer to put stuff in there just to make the people who bought the most expensive cards feel that it's useful. To qualify, I haven't played a lot of titles, but I think Cyberpunk is an excellent example of this: designed to be a GPU killer and motivate people to buy GPUs, but from what I've heard it wasn't a really fun game once you got over the eye candy. On the other side (I know it's an aging refresh of an even more aged game), Doom Eternal was never difficult to run and still looks pretty darn good (maybe I just have horribly low standards, but if so, it only helps my wallet).

All of this to say, I'm still gaming on a Vega64 with a mid-range 1440p panel. Maybe if I had a 4090 and an LG OLED I would have a different experience. :unsure::LOL:
 
@EarthDog I see your point, but what they want to show and what the player can actually play are usually two completely different things :-/

@Zerileous Exactly my point: what do you call it, diminishing returns? There's a certain quality level (wholly dependent on the user) above which you're just stressing your GPU (+watts/+temp) for no reason :shrug: There are tons of videos on YouTube showing that most games of the "eye candy" sort have settings that can be turned all the way down to low with next to zero difference in image quality while gaining some FPS. Also, as you stated, some companies like to make GPU killers, others like to make games that everyone can play (Doom Eternal is an amazing example), and to be fair, some of those don't look that much different from the "eye candy" type.

I got my hands on Days Gone a few days ago, and boy oh boy, this game runs hot, even with my 3070 undervolted. But there is next to zero visual difference between High and Very High (to me), even though High runs on average 20-40 fps faster and ~5°C cooler :p
 
Right, adding in 'feels' or what settings do or don't do to IQ and performance changes things. At a high level, (most) people want the best IQ they can get at (what they deem to be) acceptable FPS.... that's my point. If someone is enthusiastic enough to play with settings and find what looks (significantly/worthwhile) better or not, or what yields more performance, good on them, but I'd guess most aren't comparing in that much detail. I fully believe devs start at Ultra and work their way down. It's easier to remove features, or dial back the intensity of one, than it is to start with nothing and add them later. I can't support that (I'll ask a buddy who's an indie dev but has worked with larger shops), but it makes sense considering how people act/set their settings (info sourced from my hunches and confirmed in a TPU poll).

That said, even integrated graphics can run a slew of titles @ 1080p/Ultra/60. With that, it stands to reason that few users with a discrete solution would be unable to run 1080p/60/Ultra (where 1080p is, BY FAR, still the most used resolution; even the 1660 or w/e can do 1080p/60/U). When you start getting to 1440p/60/U or, even more so, 4K/60/U (to be clear, we're not adding RT onto this.......), I lean more towards agreeing with that sentiment. But for 1080p, what the devs want to show and what the player can play are typically the same when shooting for 60FPS/Ultra. :)

But yeah, what changing the settings does and looks like, while a part of the convo, adds another variable to it all.
 
I'm kinda interested to see how "The Last of Us" is gonna turn out on PC; it's coming out in a couple of months, I think. The devs said they were using all the latest tech, including RT and DLSS/FSR, and considering how good it was when it came out some... 8-9 years ago(?), it will likely be another GPU killer... It's supposed to be visually much better than the 2022 PS5 remake. Here's to hoping; I've been waiting for that game for a loooong time now :rain:
 