
rtx 40x0 talk

Putting pricing to one side for now, it's comparable to a 3090 in raster performance, and its RT performance beats almost everything from AMD, with the XTX only clawing a win at 4K. This is no slow GPU.

Of course pricing does matter, but "it's complicated". I'm seeing higher-end Ampere disappear from shops, presumably as stocks are finally exhausted. And they never got the fire-sale pricing everyone hoped for when GPU mining died. For my personal uses, if I had to buy a new higher-end GPU right now, it would probably be this. That doesn't mean I don't wish for lower pricing, but moaning about it isn't going to change much.
 
i wasn't moaning about the price... i was moaning about how the performance gain, or practical lack thereof, over my current setup isn't really worth it for me. if i wasn't flat broke, the only card i would consider right now for an upgrade would be the 4090 (or a 2x 3090 Ti NVLink setup, but that would be older tech lacking the newer features).
my current build notwithstanding, for someone building a new PC, it looks to be a great card for a more value/budget-oriented build than mine.
 
man, it should have just been a 4070 tbh. the biggest problem for this card is lack of memory bandwidth IMO, even if it has fast memory ICs. it's like NV went out of their way to gimp the cards to be worse than the 3000s, all except the 4090/4090 Ti. if the memory bus were wider it would do way better at 4K, no doubt about it.
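To put rough numbers on the bandwidth complaint: peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch, using the commonly quoted specs for these cards (assumptions worth double-checking against the official spec sheets):

```python
# Peak memory bandwidth = (bus width in bits / 8 bytes) * data rate in Gbps.
# The bus widths and data rates below are the commonly cited specs, not measurements.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RTX 4070 Ti (192-bit, 21 Gbps)": (192, 21.0),
    "RTX 3080 (320-bit, 19 Gbps)":    (320, 19.0),
    "RTX 3090 Ti (384-bit, 21 Gbps)": (384, 21.0),
}

for name, (width, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(width, rate):.0f} GB/s")
```

The narrow 192-bit bus leaves the new card well behind even a 3080 in raw bandwidth, which is consistent with its relative weakness at 4K, where bandwidth matters most.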
 
Many manufacturers are still releasing refreshed RTX 3000 series cards. In the last few days, Biostar announced a new series of RTX 3000 and GTX 1600 cards. I could understand the last gen, but why bring back the GTX 1600? Maybe they have a specific market for that, but it for sure won't sell well in US/CAN/EU.
 
They are going to bring back the 1600 series to fill the price range that should belong to discounted lower-range 3000 cards. That's my opinion anyway.
 
It's a bit frustrating for sure. Like, the 3000 series was all lip service with their low MSRPs that the high-demand market wouldn't support. But it makes me want to go back and look at what really changed with this architecture. I think reviewers are going for the throat because it was originally going to be a 4080 but is so cut down. I don't know that the average person buying cards really knows much about it; they just have money and spend it. The high end is still more of a play for reputation IMHO.

I think the sad part is such a disregard for the customers. I don't blame anyone for making money, and I'm sure in a vacuum it's a fine card, but it really seems like a poor deal for a generational improvement. I think as much as the supply chain has improved on the consumer end, it's still costing too much at the fab to get lower-end cards out. On the other hand, nVidia specifically stated (privately) that the plan was to hold back any Ada silicon that would compete with 3000 cards. So maybe it's positioned specifically to be a bad deal, for those who absolutely must have the newest thing but can't pay more than 1k, basically taxing the name.
 
Marketing is strong with Nvidia. Really, no matter what they release, it will sell well, as there are still people who think that new is always better. Most gamers are clueless, and they're like 90% of all who buy new and expensive graphics cards. I've played online games for years, and when I see the BS flood in the game chat or Discord or wherever, I just close it, as it's hard to find words and not offend everyone around.
For most users, when the PC gets slow at anything, that's the moment they want to buy a new PC. No matter what the actual reason is, it's "my PC is old and slow, I need a new one". The new one can have about the same performance but a fresh OS, and they are happy.
 
Why not just introduce the 4030/4050/4060 and sell them instead (newer/faster)?
 
Generous opinion: Manufacturing the 40 series still costs too much to make profits at those price levels.
Critical opinion: nVidia is holding 40 series silicon back, except for the high end, to promote sales of 30 series GPUs without having to lower their prices. The supply was so limited for so long that once manufacturing caught up they had too many, and now they are manipulating the market to make money off of them. (This is real; I don't remember the source, but it was legit leaked from nVidia.)

Reality is probably a combination of the two. Also for reference, the lowest 30 series card was the 3050, with the 1650 series taking up the range below that. Basically, if it's too slow for ray tracing, nVidia doesn't want to waste new silicon on it.

Also note the 4070 Ti was initially supposed to be a 4080 12GB, marketed alongside the 4080 16GB, except the card has a completely different and more cut-down die. In response to outrage at the deceptive marketing ("well, I don't need 16GB, so I might as well get the 12GB and save $200", without realizing it's a different GPU), it was unlaunched and renamed. That's why the Ti came out before the 4070.

Yes, it can beat a 3090 Ti in a lot of titles. But in a lot of titles its lead over the 30 series cards shrinks to a fraction of what a 4090 manages; in other words, this thing performs more like a 30 series card in a lot of ways. Sure, it's a good card (whether it's good for the price is up to you), but it's not what was promised.
 
I could understand the last gen, but why bring back the GTX1600? Maybe they have a specific market for that, but it for sure won't sell well in US/CAN/EU.
1650 became the most common GPU on Steam Hardware Survey a month or two ago. Someone is buying them, or has bought them. The previous most popular 1060 is declining. I got a 1650 when it came out years ago as it was the highest performing GPU at the time that didn't need a power connector. I'm not sure if anything has taken over that spot.

Marketing is strong with Nvidia. Really, no matter what they release, it will sell good as there are still people who think that new is always better.
Another factor is what is the alternative? AMD? If you limit it to raster only they might offer slightly better perf/cost but in general they cost less because they offer less. Intel might take another generation or two to really get going. Doesn't feel like AMD are rushing to offer mainstream GPUs either. If people can't afford gaming PCs they might get consoles, which AMD get a cut from.

Why not just introduce the 4030/4050/4060 and sell them instead (newer/faster)?
I'd really love to see what a modern lower-end GPU could do. Both sides' 5-tier cards are still previous gen, and it could be some time before the current gen reaches that segment.

On the point of sticking to the higher end while they shift Ampere stock, I wonder if we're close to that point yet. At least in the UK, higher-end cards (3080+) are few and far between now. More mainstream cards like the 3060 still have options.
 
1650 became the most common GPU on Steam Hardware Survey a month or two ago. Someone is buying them, or has bought them. The previous most popular 1060 is declining. I got a 1650 when it came out years ago as it was the highest performing GPU at the time that didn't need a power connector. I'm not sure if anything has taken over that spot.

In earlier years (5-7 years ago), the most common GPU on Steam was an Intel IGP. These stats are based on the GPUs used across all games, and most people play things like The Sims or Football Manager. Even the popular sports series are not very demanding.
In the last 2 years, the laptop market was flooded with mobile GTX 1650 and GTX 1660 chips in the cheap gaming laptop lines. This could be one reason there are so many more of those in all the rankings, as the GTX 1660/Ti is also popular, and the mentioned GTX 1060 is still high on the list.
 
i think it would be more about mining and cost for 16xx cards.

they were a popular option to mine with for a while (as were many other cards), and their price-to-performance in both MH/s and FPS is pretty good.
I bought my 1660 Super for a small emulation system and it works well.

not everyone wants ray tracing everything; some people just want decent power at a decent price. I grabbed it at the beginning of the zombie apocalypse for ~$200 for exactly that reason: decent FPS, decent price, Vulkan renderer, NVENC. this was before the run on mining cards.

i'd still be using it if i wasn't now using that emulation system daily, since the rest of my life... i mean, my main system is still packed away in a box somewhere. i stuck a 3070 Ti in it since i got it cheap and wanted to make sure it was kosher for, again, when i unpack my life. the 1660 will return to this system.
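The price-to-performance point above can be sketched as simple arithmetic. All numbers here are illustrative assumptions for a GTX 1660 Super (~$230 street price, ~30 MH/s ethash, ~100 FPS in a typical 1080p title), not quotes from benchmarks:

```python
# Rough price-to-performance for a GTX 1660 Super.
# price_usd, mhs, and fps_1080p are illustrative assumptions, not measured data.
price_usd = 230
mhs = 30.0         # assumed ethash hashrate
fps_1080p = 100.0  # assumed average FPS in a typical 1080p title

print(f"Mining: {mhs / price_usd:.3f} MH/s per dollar")
print(f"Gaming: {fps_1080p / price_usd:.2f} FPS per dollar")
```

With numbers in that ballpark, the card scored well on both metrics at once, which is why it got pulled in two directions during the mining boom.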
 
i would really like to see some LP cards from both NV and AMD tbh, mainly for older machines where you just want the decoding on the chip more than anything.
 
well, i still have this first-gen mobile i5 chip on an industrial board. it has an LP GT 1030 in it; would be nice to have something a bit newer.
 
LP = low profile? Probably not much choice there other than the 1650 on the nvidia side and the 6400 on AMD's. You'll have to check for specific features, but at least the AMD has two generations on nvidia's offering.
 

"It seems the issue of melting RTX 4090 cables is back in the spotlight. This time, the problem is affecting a custom cable designed and manufactured by CableMod.

The company wasted no time in announcing the completion of its proprietary 12VHPWR cables, which were meant to be direct replacements for the stock NVIDIA connectors. However, it appears that the melting issues seen with the original NVIDIA cables are also present with the replacement cables. NVIDIA has said the melting problem is related to users not plugging the cable in correctly; the new 16-pin cables are harder to push all the way down, so many users were not seating them fully. Since NVIDIA made this statement, the number of reported cases has dropped significantly, which suggests that this might have been the cause of the problem all along."
 
So I have joined the 40 series ranks; I purchased a CableMod 600W cable, which was delivered in early December and is now being put to use on an MSI 4090 X Trio. I'm not TOO worried about connector failure, as I quadruple-checked that it was seated completely and inspected the contact of the connectors with a flashlight just to be sure it could not be seated any further.

I will say, it didn't take nearly as much pressure to seat it as the reports have been stating. I'm assuming that with the influx of new people entering the DIY PC scene, they may not be familiar with how connectors seat.

At the same time, I'm not keen on the thought of "it *should* be fine", as the cable does have some good bowing to it where it exits the card and turns down from the side panel of the Lian Li O11 Dynamic XL. I really hope the 90-degree and 180-degree adapters are finalized and made available sooner rather than later.

Once CableMod releases the angle adapters, I think the best orientation will be to have the cable lay against the back of the card, or exit downwards, eliminating any strain on the connector and offering gentler bends to prevent cable failure.
 

Attachments: 20230120_095819_1.jpg
I also got an MSI 4090 Gaming X Trio today. I already had a 3-way CableMod cable set aside for my 4080 that I never hooked up, but lo and behold, the 4090 X Trio comes with a 3-way connector in the box, so I went ahead and used what I had. If MSI thinks it's enough, then I trust it. Seems to be fine so far. Only thing is, my big case with its top-mounted power supply wouldn't let the CableMod cable reach under the card through the closest hole, so I just came through overhead and zip-tied it to the motherboard power cable to keep it from cooking on the back of the video card. It's a slight eyesore, but better than having all of those bulky connectors from the adapter hanging off the card.

PS: Ran Guardians of the Galaxy in 4K with everything cranked and no DLSS until I was pretty sure it was at 100% usage. So far the 850W power supply seems to be doing fine, but I just got it, so I'll eat my words if anything changes. HWMonitor reported 430W usage for the card.
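A back-of-envelope headroom check supports the "850W seems fine" read. The GPU figure is the HWMonitor reading above; the CPU and rest-of-system figures are assumptions for a typical high-end gaming build, not measurements of this exact system:

```python
# Rough PSU headroom estimate: GPU draw is from HWMonitor above;
# CPU and rest-of-system figures are assumed, not measured.
psu_watts = 850
gpu_watts = 430   # reported card usage under load
cpu_watts = 150   # assumed high-end CPU under gaming load
rest_watts = 75   # assumed fans, drives, RAM, motherboard, USB

load = gpu_watts + cpu_watts + rest_watts
headroom = psu_watts - load
print(f"Estimated load: {load} W, headroom: {headroom} W "
      f"({load / psu_watts:.0%} of PSU capacity)")
```

Under those assumptions the system sits around three-quarters of PSU capacity at full tilt, which leaves margin for transient spikes, though a 4090's power excursions are exactly the thing worth watching on a smaller PSU.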
 