
older rig: worth upgrading to Nvidia 4xxx series?


magellan

Member
Joined
Jul 20, 2002
I have an old 9700K @ 5.1 GHz paired with 32 GiB of DDR4-3980; would it be worth it to upgrade my aging 1080 Ti to the 4xxx series, or would too much performance be lost due to my older hardware? Please note that upgrading both my system and video card is financially impossible for me.
 
I would wait and upgrade to something like a new i5 or i7/Ryzen 5 or 7 plus something like an RTX 4070 when it is released. The balance would be better than pushing one component only.
 
Yeah, wait until the more budget-oriented cards come out (that will still cost an arm and a leg). What resolution do you game at? If you're at 1080p, then that CPU would likely hold things back enough to where I wouldn't do it. If you're playing at 4K or 2560x1440 (and not getting a 4090, lol) any losses would likely be negligible.

Consider updating your signature so we know exactly what you're working with (monitor), and a budget.

Niku has a great idea too of upgrading both by getting a last-gen card. A 3070Ti/3080 (anything less than that is too close to the 1080Ti IMO) is a significant upgrade (30%+ at 1080p, slightly higher with higher resos). With the money you saved, you can look into an updated AMD/Intel system (maybe Intel as you can still use DDR4 with the latest processors).
 
I game at 1440p. I'd like to upgrade to an Nvidia card that has 200% of the perf. of my 1080ti, but I don't know if that's realistic or possible considering the rest of the computer would be holding it back.
 
eh maybe bottleneck?
stock clock was 3.6 with a boost of 4.9 and it sounds like you've nailed it to 5.1 all core all the time. at that rate i might be looking at cpu, ram and mobo before gpu. and i consider efficiency per clock as well as raw horsepower too. if you're getting 10% more done per GHz, OCed or not, then that's again more reason to go this route.
but at the same time it feels like intel's new gens are baby steps and the 9 series doesn't feel that long ago.

either way you spin it you're looking at like $1200 USD or more; that's a nice gpu, or a mobo, cpu and ram.
at 1440 i'd probably go cpu ram mobo if i were in your position

or wait and see what else happens.
we've seen intel cpu/gpu, nvidia gpu, amd cpu.
we've got amd gpu coming up soon and supposedly amd cpus with 3d vcache in the spring
 
I game at 1440p. I'd like to upgrade to an Nvidia card that has 200% of the perf. of my 1080ti, but I don't know if that's realistic or possible considering the rest of the computer would be holding it back.
...aaaaand..... budget?

Not sure a 4090 is 2x faster than a 1080ti in the first place. Did you read any reviews? That said, you'll obviously need a 4090 to come close to that goal. Either way it will be a significant improvement even with your current system.
 
Not sure a 4090 is 2x faster than a 1080ti in the first place.
It's way more than 2x, depending on how you measure it. 2x faster might not be 2x more fps, since fps limits can occur for many different other reasons. For example, if you can run a given fps at 1440p on one, and the other allows you to run the same settings and fps but at 4k, you're doing roughly 2x the work.

Techpowerup's rating puts the 4090 at 3x the 1080 Ti although I have no idea how they get that number. On their scale, 2x a 1080 Ti would be 3080 Ti/3090 territory. If you look at compute that ratio is 8x for 4090.
 
It's way more than 2x, depending on how you measure it. 2x faster might not be 2x more fps, since fps limits can occur for many different other reasons. For example, if you can run a given fps at 1440p on one, and the other allows you to run the same settings and fps but at 4k, you're doing roughly 2x the work.

Techpowerup's rating puts the 4090 at 3x the 1080 Ti although I have no idea how they get that number. If you look at compute that ratio is 8x.
Mack, I just went by 1440p results. Perhaps my extrapolation was off?

45% faster than a 2080ti - https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-founders-edition/32.html

A 2080 Ti is 25% faster than a 1080 Ti - https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/33.html

I don't see 200% from that ( I know it's not directly additive)...but I haven't had my coffee either.

200% faster than 100 FPS is 300 FPS (right... or should I go get that morning caffeine?) :chair: :escape:

Techpowerup's rating puts the 4090 at 3x the 1080 Ti
Where's that?

Is there a review that has a 1080Ti in there so we can see h2h results instead of extrapolations? I'm talking raw FPS here like the OP, not upping the res and settings. He's not upgrading the monitor so I'm not sure why that's a point at this time. At a more GPU-bound res (4K), the new cards do pull away, indeed.


EDIT: I see here that it's almost 100% faster than a 2080Ti - https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-founders-edition/31.html

... but a 2080Ti isn't close to 100% faster than a 1080Ti so.......... help me understand, lol!!!
 
When you check the FPS comparison on TPU then 4090 is more than twice as fast as 1080Ti (it's about 2x faster average than 2080).

@ed 200% faster or by 200%... After 7h of work already so starting to get brain hiccups, but in my understanding first would be 200FPS and the second one 300FPS ;)
 
100% faster = 2x as fast as the original number. So if I was running 10 FPS, 100% faster adds another 10 FPS: 10+10 = 20. 200% faster is 20 FPS on top of the 10 we started with, for 30.

When you check the FPS comparison on TPU then 4090 is more than twice as fast as 1080Ti (it's about 2x faster average than 2080).
I already linked above, it's not 100% faster than a 2080 Ti........ look at the last link I provided and check the average FPS for 2560x1440. A 2080 Ti is ~104 FPS while the 4090 is ~200. 200-104 = 96. 96 of 104 isn't 100%/2x.

Sorry if I'm math-challenged this morning.... :rofl:
 
100% faster = 2x as fast as the original number. So if I was running 10 FPS, 100% faster adds another 10 FPS: 10+10 = 20.

I already linked above, it's not 100% faster than a 2080 Ti........ look at the last link I provided and check the average FPS for 2560x1440. A 2080 Ti is ~104 FPS while the 4090 is ~200. 200-104 = 96. 96 of 104 isn't 100%.

Sorry if I'm math challenged this morning. OP asked for 200% better so.... :rofl:
If ~100FPS = 100% then ~200FPS = 200% ;) ... and we talk about 1080Ti, not 2080Ti for the 2x higher performance. Looking at TPU's average FPS, 1080Ti would be somewhere around ~80FPS compared to ~200FPS of RTX4090. So RTX4090 is over 200% performance of the GTX1080Ti (average looking at TPU tests).
 
Mack, I just went by 1440p results. Perhaps my extrapolation was off?

45% faster than a 2080ti - https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-founders-edition/32.html

A 2080 Ti is 25% faster than a 1080 Ti - https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/33.html

At the first link it says 2080 Ti is 54% of 4090, or if you flip it around, 4090 is 185% of 2080 Ti, or 85% faster than 2080 Ti. (100/54)

At 2nd link, similarly 1080 Ti is 75% of a 2080 Ti. Or, 2080 Ti is (100/75) = 33% faster than 1080 Ti.

If you combine the two, it is 1.33 * 1.85 = 2.46x the speed of 1080 Ti.
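That two-step combination is easy to sanity-check in a few lines of Python (a quick sketch; the percentages are the rounded TPU summary figures quoted above, so the result is an estimate, not a measurement):

```python
# Chain two relative-performance figures from separate reviews to
# estimate an overall speedup. Figures are the rounded TPU numbers
# quoted above.

def times_faster(slower_pct_of_faster: float) -> float:
    """If the slower card scores X% of the faster card,
    the faster card is 100/X times as fast."""
    return 100.0 / slower_pct_of_faster

r_4090_vs_2080ti = times_faster(54)    # 2080 Ti = 54% of a 4090 -> ~1.85x
r_2080ti_vs_1080ti = times_faster(75)  # 1080 Ti = 75% of a 2080 Ti -> ~1.33x

combined = r_4090_vs_2080ti * r_2080ti_vs_1080ti
print(f"4090 is ~{combined:.2f}x a 1080 Ti")  # ~2.47x with unrounded ratios
```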

In this type of scenario where the performance is so different, I'd be hesitant to use raw fps, especially as CPUs become the limit as GPUs get faster and under-represent their performance.
https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-founders-edition/33.html
Where's that?

That's the page for 1080 Ti. Note the relative performance section. Scroll down to see where 3080/4090 lie.
 
For me this morning Bart, that's not how math works (LOL)...100% of 100 is 100. So if something is 100% FASTER it's 2x the base value. 100% performance increase = 2x performance. What the OP actually wants, no idea....


If this type of scenario where the performance is so different, I'd be hesitant to use raw fps
One link, same data and CPU (below). I can't help that the 4090 would be CPU limited if he buys it. We have to run with what we have and not make concessions for higher resolution, different settings, and faster CPUs.............
If you combine the two, it is 1.33 * 1.85 = 2.46x the speed of 1080 Ti.
Ok... then what about this page 2560x1440?

2080 Ti averages ~105 FPS.... the 4090 averages ~200 FPS. 95 FPS faster. 95 against 105 is not 100%/2x, right?

EDIT: IF that's true above, and we know a 2080Ti isn't 100%/2x faster than a 1080Ti, how does that math work????

EDIT2: Did TPU mix data... what's this??
"Based on TPU review data: "Performance Summary" at 1920x1080, 4K for 2080 Ti and faster."

So, that chart that shows the 4090 is running at 4K, yet the 1080Ti is 1080p? Whaaaaaaaaaaaaa?
 
2080 Ti averages ~105 FPS.... the 4090 averages ~200 FPS. 95 FPS faster. 95 against 105 is not 100%/2x, right?
199.9 / 104.7 fps = 1.91x, or 91% faster. So no, not 2x. Also, that's comparing 4090 to 2080 Ti. OP has 1080 Ti. That'll probably push it over. I'd argue 1.9x is close enough to 2x it doesn't matter even though its the wrong GPU.

On the other point about looking at fps, keep in mind these testers would pick some setting. Results could vary up or down depending on what actual settings were used. The more demanding the settings, the more likely you are to realise potential performance differences.
 
199.9 / 104.7 fps = 1.91x, or 91% faster. So no, not 2x. Also, that's comparing 4090 to 2080 Ti. OP has 1080 Ti. That'll probably push it over. I'd argue 1.9x is close enough to 2x it doesn't matter even though its the wrong GPU.
I know.. we don't have a dataset that compares the 1080Ti directly to a 4090. Do we both agree that a 2080Ti wasn't close to 2x the performance of the 1080Ti? I disagree that it will push it over because of how close the 1080Ti and 2080Ti are in performance. 2080Ti = 25% faster than 1080Ti which makes the 1080Ti 33% slower. Sadly, we don't have the average FPS from that old review or we could find out.

Why in your math are you multiplying the two cards % difference together? Wouldn't you add that??? Try that math with FPS and see if it makes sense (still sipping caffeine..........).

On the other point about looking at fps, keep in mind these testers would pick some setting. Results could vary up or down depending on what actual settings were used. The more demanding the settings, the more likely you are to realise potential performance differences.
This is absolutely true.... and perhaps worth a mention for the OP. However, for our purposes in data mining, that's pointless information.... we have what we have for this purpose. External factors can't be involved in the data portion since we have no data for the other settings, etc.


Please see my edits above.............




EDIT: Ok, so a RTX 2070 is slower than a 1080Ti by ~10% (1440, according to the RTX 2070 FE review at TPU). If we go by that with the chart we have (average FPS), the 2070 runs at 70 FPS. 4090 at 200. 130 FPS against 70 is 185%. So, I guess you can round up 15% to make it 2x if you want... but the 2070 is also 10% slower so I surely wouldn't round up. But using those values, it's not even 2x so what's closer to the truth? Is it "way more than 2x" or "not sure a 4090 is 2x faster than a 1080Ti"?

Please answer my edit here too... apologies for all the edits.........
 
I know.. we don't have a dataset that compares the 1080Ti directly to a 4090. Do we both agree that a 2080Ti wasn't close to 2x the performance of the 1080Ti? I disagree that it will push it over because of how close the 1080Ti and 2080Ti are in performance. 2080Ti = 25% faster than 1080Ti which makes the 1080Ti 33% slower. Sadly, we don't have the average FPS from that old review or we could find out.
Agree 2080 Ti isn't 2x 1080 Ti but that's totally irrelevant. The link you gave yourself earlier showed 4090 is 1.9x 2080 Ti. That's near enough 2x in my book. The difference between 1080 Ti and 2080 Ti (or similarly 3070) is big enough I'm confident it will more than make up that difference between 1.9x and 2x.

You got the 2080 Ti vs 1080 Ti numbers a bit off. 2080 Ti is 33% faster than 1080 Ti, or 1080 Ti is 25% slower than 2080 Ti. You may need a refresher when it comes to using percentages.
 
Forget the %... use the average FPS values I found... https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-founders-edition/31.html

Do the math for the 2080/2080Ti and 2070 and show me how they compare to a 4090. Here's my math......


2070 = 70 FPS
2080 = 85 FPS
2080 Ti = 105 FPS
4090 = 200 FPS

200-70 = 130. 130 is 185% faster than a 2070.
200-85 = 115. 115 is 135% faster than a 2080.
200-105 = 95. 95 is 90% faster than a 2080Ti
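Those three subtractions can be run through a quick sketch (a minimal example; the FPS values are the rounded averages read off the TPU chart):

```python
# Percent-faster = (difference) / (slower card's average FPS) * 100.
# FPS values are the rounded TPU 1440p averages quoted above.
fps = {"2070": 70, "2080": 85, "2080 Ti": 105, "4090": 200}

for card in ("2070", "2080", "2080 Ti"):
    gain = fps["4090"] - fps[card]
    pct = 100 * gain / fps[card]
    print(f"4090 vs {card}: +{gain} FPS -> {pct:.0f}% faster")
```

which comes out to roughly 186%, 135%, and 90% respectively.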

Full circle adding in the 1080 Ti........it sits right between the 2070 and 2080 (according to THIS at 1440/same res). So if my math is right (lulz) the 1080Ti is somewhere between the 2070 and 2080 %-wise too?


Thanks again for humoring me and spelling things out. :)



EDIT: @mackerel F me... I see where I made this go wrong. OP asked for a 200% improvement. In my reply (post #7), I mistakenly said 2x. Grrrrrrrrrrrrrr

2x ≠ 200% faster. 2x = 100% faster. I know this (I even mentioned it in post #11)

Then in my head/posts I used 200% = 2x like a dumbarse, knowing full well that's not how it is. So, in the end, my replies were based on the 200% difference the OP is looking for and NOT the 2x/100% I mistakenly said. Big oof here. Apologies! The good news, we're both right, lol. The 4090 is NOT 200% faster than a 1080Ti. It IS over 2x faster, however.

:facepalm::chair: :escape:
 
200% of 2 is 4 right?
Yes.

Now peep this...
An increase of 100% in a quantity means that the final amount is 200% of the initial amount (100% of initial + 100% of increase = 200% of initial). In other words, the quantity has doubled. An increase of 800% means the final amount is 9 times the original (100% + 800% = 900% = 9 times as large).

So when we're talking in the context of faster than X (comparing) like we are here.....200% faster than 2 = 6, you can't forget about the base/original number (see the quote above).


(According to the TPU links above...)
The 2070 averaged 70 FPS. 200% faster than 70 = 210. The 4090 averaged 200 FPS, so it's not quite 200% faster than the 2070.

The 2080 averaged 85 FPS. 200% faster than 85 = 255. The 4090 is even further away from 200% faster.

1080 Ti performance is somewhere between the 2070 and 2080. If the 4090 isn't 200% faster than those, it's not 200% faster than the 1080Ti. It IS 2x faster, or 100% faster than all of those, yes.
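The whole back-and-forth boils down to one conversion, which a pair of throwaway helpers (the function names are mine) makes explicit:

```python
def pct_faster_to_multiplier(pct: float) -> float:
    """'pct% faster' means the final value is (1 + pct/100) times the base."""
    return 1 + pct / 100

def multiplier_to_pct_faster(mult: float) -> float:
    """'mult x as fast' means (mult - 1) * 100 percent faster."""
    return (mult - 1) * 100

print(pct_faster_to_multiplier(100))  # 2.0   -> 100% faster is 2x as fast
print(pct_faster_to_multiplier(200))  # 3.0   -> 200% faster is 3x as fast
print(multiplier_to_pct_faster(2))    # 100.0 -> 2x as fast is 100% faster
```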
 