
rtx 40x0 talk

Right now, I wouldn't spend money on the 3k series. It's already old, and you can wait another 1-2 months. There are sales because Nvidia manufactured too many chips, and they're trying to sell as many as possible before the premiere of the 4k series. That's also one of the reasons the new GPUs are delayed: at first it was planned for September, but it was officially announced as Q4 2022, so they can slip it 1-2 months. AMD isn't in a hurry either, as they release their graphics cards at about the same time as the direct competition and adjust prices based on Nvidia's.
 
Got that right! I have $$$ saved for my 4090... but the price of the 3090 Ti is looking real good. My 2080 Super SLI is long in the tooth, and has been for a while. When I play Elden Ring on high settings it lags, with a noticeable dip when I swing a sword or try to evade an enemy. But I just have to hold out a bit longer. Sept. 20th Nvidia is having some type of press event for the 40 series. Totally fantasizing about how Starfield will look and flow with a new RTX 4090!!
 
$750 for 3080Tis and $1100 for 3090Tis isn't bad at all. Way under MSRP. I think it actually is a good time to pick up a GPU if you want one and have the $.
 
I still wouldn't. Nvidia put themselves in this position, and they're looking for us to bail them out before the next-gen cards come out. Frankly, the fewer 3xxx cards we buy now, the more effect it will have on 4xxx prices when those launch. Not only that, but if there's still a glut of 3xxx cards when the 4xxx cards come out, you'll see even more price drops.
 
I wanted an RTX 3090 Ti for this new build, but the price was way too much at the time. I'm on the fence now, but I want to stay with the theme of new technology. I paid out the booty for this 3060, so...
 
At this point in time I'd rather buy an A770 over any 3xxx card...

and I might, if they don't get scalped to hell
 
Same here, but the thing holding me back is how OpenGL/Vulkan performance will be. I have some older Wolfenstein games I'd like to play without issues; they're slow for me on the 1060 6GB even at 1080p. Other than waiting on some kind of DX9/DX10(?) wrapper to work on top of DX11/12 for some games, it really is a matter of Intel getting the drivers, game support, and UI done. I really think Intel should take all the different driver/UI teams and put them on Arc to speed things along. An A770 priced right for 3060 Ti performance or slightly higher would be nice; I know we have to wait. Anyone know if the A770 TDP is set in stone? Kind of hoping it's lower than the 3060 Ti's.

On the other hand, I wonder how these would do for FAH or SETI?
 
I don't think it is. I think the 250W figure was a high estimate from what I've read online, maybe an OC.

As for DX9, DX10, and maybe even DX11: those games should run, but since there's been no native development for them in Intel's drivers, they won't run as well as on cards whose drivers have been around much longer. That makes sense to me; it would take eons to go back and write driver fixes for 25(?) years' worth of games that the majority won't be running anyway. At that point it sounds like it's relying on DX12 at its base, running some sort of compatibility layer to render them out, and let's face it, DirectX is getting a bit old. It still works well, and while I'm not any sort of programmer or developer, I have to wonder if it's the most efficient renderer to make the most of these GPUs in general.

Or maybe it's time to cut backwards compatibility and start anew. It might suck, but now would be the time to do it...
 
I hate to double post, but...

I think they are totally out of touch with their pricing. Do they not realize you can't mine ETH with these cards any more? They must have missed that memo.

I would have thought whatever the higher-end 4080 turned out to be would start at a price point of $900, which is still high mind you, but if the rumors about the performance increase over 3xxx cards are true, then understandable. But $1200? They can go **** themselves.

Coupled with the price, these power-hungry cards have just turned me off this time around.
 
I totally agree with you Niku, I'm skipping the 40 series. Pricing is getting out of hand, power consumption and temps are getting out of hand, the market manipulation is getting out of hand, and EVGA backing out makes me wary of purchasing a new card from another brand ATM. I'm sticking with 1440p for the foreseeable future. CPUs have my attention at this point, as most of what I play is CPU-bound anyway.
 
Agreed, these are going to be expensive, yikes.

But keep in mind that the new generation is going to be a SHED LOAD faster for the same power: 3090 Ti performance at around 285W (4070). So an $800 card performs like a $1400+ (original MSRP) 450W card from the current gen. Yes, power use is up there, but the performance uptick and efficiency gains are going to be huge.

What's being held back by a 9900k? 8c/16t is plenty these days (get off the AMD hype train of cores/threads, lol)
 
The price announcement for the real 4080 (16GB) was way higher than I expected. I doubt we will ever have a GPU shortage again like the one we experienced in 2020-2021, so perhaps we will see sales below MSRP on occasion?
 
Pricing will go up as long as people pay it. Only demand destruction can cause prices to go down, and I don't see that happening this round. As said above, the ETH mining boom is over, so demand will probably drop a good bit, but the scalpers will still scoop up all the cards at first so they can upcharge the silly people who must have one first. Other than that, the free government money isn't there this round, savings account balances are coming down according to the banks, and CC debt is going up. Basically people are running out of money and many are not working to replace it. On top of that, GPUs are currently outpacing video game tech, and that won't change until the AAA game developers get their act together. Combine those things and I see prices dropping, or at least staying where they are, for a couple of generations after the 4x, though not so much this one. I figure nVidia knows this is the last hurrah for a while, so they are cashing in while they can.

Personally I have no plans to get a 4x-series card, and I figure anyone with better than a 2x-series card has no need. Especially the 3x-series people like me.
 
And I love the performance gains that come with each new Nvidia generation (usually), but they are slowly pricing a lot of their target audience out of the game.

The 9900K is just fine for me right now; even a 12900K isn't enough to warrant a CPU upgrade yet. CSGO doesn't need to average above 400 fps, it just needs to stay above my monitor's refresh rate. Battlefield 2042 barely uses any GPU but occasionally maxes out the CPU, and I rarely ever see over 110 fps (for smoothness' sake).
 
Yea, they are.

Has anyone noticed whether the DLSS 3.0 stuff is locked to 4xxx cards, or is it like DLSS 1.0-2.0, which just worked on whatever card had tensor cores?
 
From what I gathered, DLSS 3.0 is part of the selling point for the 40 series, and I wouldn't be surprised if they make it exclusive to the 40 series, at least for a year or so.
 
The answer might be more nuanced. Nvidia offers the Optical Flow SDK on all RTX cards back to Turing, and it's part of the software used in DLSS 3. If it's a software feature and not a new hardware unit, it seems possible to run DLSS 3 on older RTX GPUs from a software perspective, should Nvidia decide to offer such support. The question then becomes whether there's enough performance for it to be useful in practice. It may be like the GTX cards supporting RT: the performance is so low that no one would seriously use it. Given that DLSS 3's main difference over 2 is the extra frame generation, it remains to be seen how that really works when actually playing games.
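For anyone curious what "frame generation" means mechanically: the rough idea is that you take a rendered frame plus a per-pixel motion (optical flow) field and warp the frame partway along that motion to synthesize an in-between frame. Here's a toy numpy sketch of that warp, purely illustrative and nothing to do with Nvidia's actual Optical Flow SDK or DLSS internals; the `generate_midframe` function and the constant flow field are made up for the example:

```python
import numpy as np

def generate_midframe(frame, flow, t=0.5):
    """Splat `frame` forward by t * flow to synthesize an intermediate frame.

    frame: (H, W) intensity image
    flow:  (H, W, 2) per-pixel displacement, flow[..., 0] = x, flow[..., 1] = y
    t:     fraction of the way toward the next frame (0.5 = halfway)
    """
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Destination coordinates after moving each pixel t of the way along its flow
    xd = np.clip(np.round(xs + t * flow[..., 0]).astype(int), 0, w - 1)
    yd = np.clip(np.round(ys + t * flow[..., 1]).astype(int), 0, h - 1)
    mid = np.zeros_like(frame)
    mid[yd, xd] = frame[ys, xs]  # nearest-neighbor splat; collisions: last write wins
    return mid

# Example: a single bright pixel moving 4 px right and 2 px down per frame
frame = np.zeros((8, 8), dtype=np.uint8)
frame[0, 0] = 255
flow = np.zeros((8, 8, 2))
flow[..., 0] = 4.0
flow[..., 1] = 2.0
mid = generate_midframe(frame, flow, t=0.5)
# The generated half-frame shows the pixel 2 px right and 1 px down
```

Real frame interpolation also has to handle occlusions and holes (pixels nothing maps to) and typically blends warps from both neighboring frames, which is part of why doing it fast enough to be worth it is the hard bit.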
 
Speaking of DLSS v3, any techies in the crowd who can check whether this is true or false when they come out?
