
older rig: worth upgrading to Nvidia 4xxx series?



ROFLMAO! joys of nerddom....

When I talk about performance, I keep it simple... "ROUGHLY" is a very good word to use. When I upgrade, I like to upgrade to ROUGHLY twice the framerate (on the same CPU) as the current model (the exact percentage is hypothetical, not measured). So, if I'm getting 100 fps ±10%, then I want to go to 200 fps ±10%.
Besides, being exact would require per-game performance data for the games I specifically play, and the results vary from game to game.
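That "roughly double, ±10%" rule can be sketched as a tiny check (the fps numbers here are hypothetical, just illustrating the tolerance math):

```python
def is_worthy_upgrade(current_fps, new_fps, factor=2.0, tolerance=0.10):
    """Return True if new_fps is roughly `factor` times current_fps,
    within +/- `tolerance` of the target (so 100 fps -> 180..220 fps
    still counts as "roughly double")."""
    target = current_fps * factor
    return abs(new_fps - target) <= target * tolerance

print(is_worthy_upgrade(100, 195))  # True: 195 is within 200 +/- 20
print(is_worthy_upgrade(100, 150))  # False: only a 1.5x gain
```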

my past upgrades:
Phenom II X4 965 Black Edition -------> i7 4930K, GPU stayed the same (2x GTX 670 SLI) = roughly double the performance
2x GTX 670 SLI --------> 2x RTX 2070 Super SLI NVLink (CPU stayed the same) = roughly double the performance
i7 4930K ---------> Ryzen 9 5900X (GPU stayed the same) = roughly double the performance

So, if I can afford a 4090 upgrade, that should give me roughly double the performance yet again :)

I don't need to get into specific percentage improvements, because if I did, I would need data points for my specific games, which means *I* would have to do the testing myself. No reviewer keeps logs of the games *I* play over decades; they basically use the "current" popular games in their reviews, which rotate every few years (like 3-4).

Sure, I could use 3DMark as a baseline reference, but it's a synthetic benchmark, not an actual performance indicator for *my* specific games, and it will give different estimates depending on which test you run. i.e.: Port Royal estimates I'll get 90 fps in Battlefield X at 1440p, while Fire Strike estimates I'll get 110 fps in the same game, on the same hardware...

MMOs are hard to gauge, because they're constantly updating their game engines and adding features. LOTRO added DX10 shadows and "frill" (grass affected by wind and character interaction), RIFT added multicore utilization, and a lot of single-player games have even added ray tracing.

Not to mention that case, airflow, ambient temperature, and water cooling can all make significant differences that can result in throttling... and then CPU/mobo/RAM will affect the percentage increase as well...

tl;dr: take all the reported testing with a grain of salt, and use vague terms like "roughly" when upgrading...

PS: there wasn't any real "gaming" reason to go from 9th-gen Intel to 10th or 11th gen; the performance didn't seem to change much until 12th gen came out with DDR5 etc...

In actual answer to the OP's question: yes, a 40x0 card will be an improvement, probably twice the framerate of your 1080 Ti in games. Your processor would be a much bigger bottleneck gaming at 1080p, and almost a non-issue at 4K. At 1440p it will still be just fine, and it allows you a couple more years for DDR5 boards to mature and PCIe 5.0 devices to start coming out... If you store your games on a SATA SSD, you're good to go; NVMe drives don't really improve gaming over SATA SSDs yet.

a 40x0 series card will offer you more gaming improvement than a core system overhaul, but will cost more, too.

ALSO: a 40x0 card offers ray tracing, while a 1080 does not... which means 1) there are more games available to play, and 2) some games will perform differently than expected because you can enable ray tracing in them.

"I game at 1440p. I'd like to upgrade to an Nvidia card that has 200% of the perf. of my 1080ti, but I don't know if that's realistic or possible considering the rest of the computer would be holding it back."
This quote is for reference... The OP said 200% of current, NOT +200% of current. So, correctly: a +100% increase, or double the current performance.
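A quick sanity check of that percentage arithmetic (the 100 fps baseline is hypothetical, just to illustrate the difference):

```python
# Hypothetical baseline framerate for the OP's 1080 Ti.
baseline_fps = 100.0

# "200% of current" means 2x the baseline, i.e. a +100% increase.
target_200_percent_of = 2.0 * baseline_fps

# "+200% of current" would mean the baseline plus twice more: 3x total.
target_plus_200_percent = baseline_fps + 2.0 * baseline_fps

print(target_200_percent_of)    # 200.0
print(target_plus_200_percent)  # 300.0
```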