
So, who's getting a 5XXX series card?

TL;DW -- woof, don't even attempt to recommend the 8GB card. It has significantly worse performance than the 16GB model.

It is okay at 1080p medium and low settings, but who is buying a $400 card to target 1080p/medium settings in 2025? It's too expensive for 1080p and too underpowered to run at 1440p/4K. What a step down: two generations ago there was at least a 12GB 3060 Ti.

Which was weird enough! The whole "10GB/12GB" era... I talked about it earlier in this thread. I know there's some weirdo NVIDIA egghead efficiency to it... but the simple matter is that 8GB is barely enough for anything these days.

There simply shouldn't BE 8GB cards anymore.

Everything I do... from 3D scanning, to video editing, to gaming... would benefit from having more than 8GB of memory.

All of it would benefit GREATLY from 16GB or higher.

That should be the new minimum in 2025.

8GB should be an uncomfortable memory, like the 3GB cards...
 
Ooh, waiting for the monthly Steam Hardware Survey refresh, but very roughly from memory, in recent months it was about 1/3 below 8GB, 1/3 at 8GB, and 1/3 above 8GB. Games are not going to drop 8GB support any time soon. That doesn't mean they'll run at max settings, but it is far from unusable.

Some people may push the limits with what they do, but the vast majority don't. I've not personally been limited by 8GB for video editing. I have hit it for gaming, but it was at 4k which was already a stretch for the GPU.

I laugh at the many rants online about how unusable 8GB is, when the same people often then say that if it were cheaper it would be OK!

IMO I see it as:
8GB - lowest cost gaming
12GB - mainstream value
16GB - mainstream performance
>16GB - "the best"
 
For sure there's still a place for 8GB cards today. As mack said, most are still gaming at 1080p in the first place. And most of the cards out there are 8GB or less!

At 1080p there are only a few games today that can eclipse 8GB of VRAM use with ultra settings. And all it takes is turning some settings down if your gameplay is suffering (how a game responds to running out of VRAM also varies).
 
There is still a place for 8GB cards, but not at the current price. Somehow Intel figured it out, Nvidia can't, and soon we can expect new 8GB cards from AMD too; AMD always adjusts its prices to match Nvidia's price/performance or be slightly cheaper. If you buy one for games, then the whole PC's price should be at least somewhat comparable to consoles, yet new 8GB cards alone cost as much as an Xbox, and you still have to buy all the other components. If you don't play games, then an iGPU is more than enough for most tasks.
 
If you buy one for games, then the whole PC's price should be at least somewhat comparable to consoles, yet new 8GB cards alone cost as much as an Xbox, and you still have to buy all the other components.
Not really a fair comparison, since a new 8GB-card-based system will outperform the mainstream consoles, plus PCs can do more beyond gaming. It has rarely been the case that a new PC could come close to console pricing, but consoles are also lower performing, and going console means accepting that tradeoff along with more limited game choice and game pricing.
 
but the simple matter is that 8GB is barely enough for anything these days.

There simply shouldn't BE 8GB cards anymore.
...if only it was that simple (it's not remotely that simple)...

That said, in some respects you're right in that more vRAM can help those users who are 3D scanning, video editing, and gaming. But to be fair, at least to me, those who are 3D scanning (using a device that costs AT LEAST twice as much as the GPU needed to utilize it) or video editing heavily probably shouldn't get these cards in the first place... use your head and buy the right tool for the job(s) you need done. Sometimes you have to pay to play properly. If you want to compete on a race track, a stock Honda Fit won't cut it.

To me, THAT'S the argument, not that cards with lower vRAM capacities shouldn't exist. If you look at TPU game reviews, an overwhelming majority of titles don't eclipse the 8GB mark at 1080p/ultra... and again, you can lower settings to help if your gaming experience isn't optimal. Most users only have 8GB OR LESS and run 1080p.

HUB isn't wrong, it's too damn expensive for what it is.... a $300 1080p card with 8GB in 2025... sure (9060 XT, anyone?). To me, it's the pricing that's out of line, not the vRAM capacity.
 
To me, it's the pricing that's out of line, not the vRAM capacity.

That's where I sit. If it was half the cost, that's one thing as a low-end card. But for $500 to have issues at 1080p with high settings in 2025 is unacceptable, especially with the popularity of 1440p+ screens, and knowing that your limitation is VRAM makes it even worse.
 
I'm imagining that things are looking up with US stock of NV cards.....

Before launch we (Micro Center employees) were told we could not buy any GPUs until further notice (5K and 9K series). As of today, we are allowed to purchase NV cards (except a 5090) and can use the employee discount. That wasn't much at MSRP, but is a ton at current prices.

EDIT: It's not a ton with the employee discount... looks like that went up with the cards, too... :(
 

TL;DW -- woof, don't even attempt to recommend the 8GB card. It has significantly worse performance than the 16GB model.

It is okay at 1080p medium and low settings, but who is buying a $400 card to target 1080p/medium settings in 2025? It's too expensive for 1080p and too underpowered to run at 1440p/4K. What a step down: two generations ago there was at least a 12GB 3060 Ti.
The 3060 Ti is 8GB; the 3060 is 12GB... On a 3060 Ti I play lots of tower defense type games at 4K, and most AAA games work at 1440p with 2K textures... stuff like The Ascent and Cyberpunk just holds 30fps at 1080p.

That said, I would love a new card, but there seem to be problems with the 50 series in some games such as Control, as well as "unstable drivers" since December. The new 5080 Super card might be 24GB according to today's rumors... maybe the drivers will be stable by then?
 
I wouldn't worry about the drivers unless you know there are problems in the games you play. It's not like it's predominant. :)
 
Steam hardware survey numbers are out:

5080: 0.38%
5070 Ti: 0.28%
5070: 0.38%
Just posted that earlier in a new thread! https://www.overclockers.com/forums/threads/steam-hardware-survey-april-2025.806573/

Anyone know what the population size is? Would be interesting to work backwards from percentages to units to get some idea about how many have been sold.
Given it is a monthly survey, to be counted someone would have to have logged in at least once during that month. We don't have that number. In the distant past there have been estimates of monthly active users somewhere above 100M, but that's very old now. I'd guess it is even higher now. Sample size doesn't need to be the entire population, and someone better at statistics can work out the sample size required for a given confidence level.
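
For a ballpark, the standard sample-size formula for a proportion gets you close enough. A quick Python sketch; the 95% confidence level and the +/-0.1 percentage point margin are purely my assumptions, just to show the scale:

# n = z^2 * p * (1 - p) / e^2, the usual proportion sample-size formula
z = 1.96        # z-score for 95% confidence (assumed)
p = 0.0038      # observed share, e.g. the 5080
e = 0.001       # +/-0.1 percentage point margin of error (assumed)
n = z ** 2 * p * (1 - p) / e ** 2
print(round(n)) # ~14,543 respondents would already pin that share down

So even a survey of ~15k users per month would put these sub-1% shares in the right ballpark; Valve almost certainly samples far more than that.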

Best I can offer is the online user count, which is shown officially.
Recent peak is 36.6M. Even if we use that, 0.38% (like the 5080 and 5070) would be around 140,000 units each recently online. This does assume a perfectly even spread of units and login behaviour, which isn't guaranteed. The 5070 Ti would be around 100,000 by the same method. Since not everyone is online all the time, the real numbers would be somewhere far above this.
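
Spelled out, the back-calculation is just share times population; treating the 36.6M peak as the whole population is the big assumption here:

population = 36_600_000   # recent peak concurrent users, as a stand-in
shares = {"5080": 0.0038, "5070 Ti": 0.0028, "5070": 0.0038}
for card, share in shares.items():
    print(f"{card}: ~{population * share:,.0f} recently-online units")
# 5080: ~139,080 | 5070 Ti: ~102,480 | 5070: ~139,080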
 
...if only it was that simple (it's not remotely that simple)...

It is.


That said, in some respects you're right in that more vRAM can help those users who are 3D scanning, video editing, and gaming. But to be fair, at least to me, those who are 3D scanning (using a device that costs AT LEAST twice as much as the GPU needed to utilize it)

You've had a 3D printer for a while now... You should know better. If not... then I'm telling you now: Black Friday's the time to buy. Einstar is probably the most respected name in 3D scanning. I got their original scanner for 700 bucks on Black Friday. A Creality scanner would've cost me 600. That's not twice the price of any graphics card we're talking about. (At least it's not twice what I paid FOR my card... but that was kind of a ripoff in retrospect. Maybe I got mine too early.)

use your head and buy the right tool for the job(s) you need done. Sometimes you have to pay to play properly.


...and yet sometimes you pay and STILL can't play properly. There's no reason why I shouldn't be able to do everything I need to do at the X060 level. I'm not trying to run the universe in real time for NASA... I'm just trying to scan a couple of car parts and edit some independent films.

If AMD can get something like a 9070 going at 16GB... and if there's already a 16GB Ti option for the 4060 (and I think the 3060 had a 16GB option), then your excuse for keeping 8GB around is full of holes. Doesn't hold water.

FLAT is what I'm saying... :)

The onus isn't on the buyer... we don't get a say in the manufacturing. The onus is on the SELLER. It's not hard... when you've got a few hundred billion dollars lying around... to read the tea leaves and know that it's been "X" number of years and games and applications are starting to max out 8GB.

Time to bump it up.
 
(and I think the 3060 had a 16GB option)
3060 was 12GB because Nvidia decided at the time 6GB wasn't cutting it.

Not everyone has the same needs, and especially on the really low end, cost is a more sensitive factor. Balancing costs is something that has to be carefully considered, and adding more VRAM that buyers don't need or want is extra unnecessary cost. Before there is any misunderstanding, the 5060 probably could do with going 12GB, perhaps as a Super refresh. Nvidia would have projected all this years ago, but there are many things outside their control. If 3GB modules weren't available in volume in time, 8GB is what they have to go with until that changes.
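
For anyone wondering why the options are 8GB or 12GB with nothing in between: capacity is chip count times chip density, and the chip count is fixed by the bus width. A rough sketch, assuming the 5060's 128-bit bus and the usual one-GDDR7-chip-per-32-bit-channel layout:

bus_width_bits = 128           # RTX 5060-class memory bus
chips = bus_width_bits // 32   # one GDDR7 chip per 32-bit channel
for density_gb in (2, 3):      # 2GB chips ship in volume today; 3GB later
    print(f"{chips} x {density_gb}GB = {chips * density_gb}GB total VRAM")
# 4 x 2GB = 8GB (launch config); 4 x 3GB = 12GB (a possible Super refresh)

The only other way to more VRAM on the same chips is a wider bus or clamshell mounting, both of which cost real money on a budget board.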
 
I wish it was as black and white as you're making it out to be... but it isn't, and there are reasons listed for that already.

You've had a 3D printer for a while now... You should know better. If not... then I'm telling you now: Black Friday's the time to buy. Einstar is probably the most respected name in 3D scanning. I got their original scanner for 700 bucks on Black Friday. A Creality scanner would've cost me 600. That's not twice the price of any graphics card we're talking about. (At least it's not twice what I paid FOR my card... but that was kind of a ripoff in retrospect. Maybe I got mine too early.)
lol @ the admonishment...wtfbbq? I don't want a 3D scanner.....I have no need for one and don't have the extra cash for a $700++++ toy I don't need.

Well aware of Einstar. I just brought in one of their handheld scanners to the store, in fact ($850). Often, these scanners are well over $1,000...

...and yet sometimes you pay and STILL can't play properly. There's no reason why I shouldn't be able to do everything I need to do at the X060 level.
Then you didn't pay enough to play. You, quite literally, have the lowest card of the product stack for that generation, bub. There's nothing slower or cheaper in that generation. Don't get me wrong, I would like to see the bottom budget card do well for everyone in any space, but that's just not the reality, and hasn't been for a while now. I think your expectations are out of whack... or wishful thinking.

The onus isn't on the buyer... we don't get a say in the manufacturing. The onus is on the SELLER. It's not hard... when you've got a few hundred billion dollars lying around... to read the tea leaves and know that it's been "X" number of years and games and applications are starting to max out 8GB.
Considering a majority of users are still using 8GB (or less... and 1080p), I'd say they're doing a good job of 'reading the tea leaves' for the majority... you're just not the majority. You're a power user on a budget... I don't envy that position.

Time to bump it up.
Again, there's a place for 8GB cards... and that's 1080p budget gaming.


I guess we'll agree to disagree. I understand where you're coming from, but it's not cut and dried, or black and white, or easy to please everyone. I'm sorry that you're trying to get blood from a stone. :(


If 3GB modules weren't available in volume in time, 8GB is what they have to go with until that changes.
Exactly. And it's why we're now seeing rumors of 3GB ICs that will mix in and bump up the vRAM of the Supers.

 
Ultimately, it would have been much more reasonable for NVIDIA to set the minimum VRAM for the 5060 series at 12GB. As I’ve mentioned several times, if you’re spending around $500 on a brand-new graphics card in 2025 and still playing at 1080p, that card should absolutely be able to handle high or ultra settings without stability or performance concerns.


Higher-tier cards are intended for 1440p and 4K gaming, so it’s frustrating to see a midrange product in this price bracket struggle at 1080p simply due to limited VRAM. It feels counterintuitive to spend that much and then be told you'll need to lower settings to avoid performance issues.


Unfortunately, the current market seems willing to accept that tradeoff. Maybe if the 8GB variants don’t sell well and sit on shelves, we’ll see their price drop to something closer to $350—which would be a bit more compelling. Still, even at that price point, the limitations are hard to ignore. Many of NVIDIA’s promoted technologies like DLSS, Frame Generation, and Ray Tracing actually increase VRAM demands, which shortens the effective performance window unless you're willing to compromise on quality settings.


At this point, I’d argue that most users would be better off buying a used 3080 Ti, 4070, or 3090 for under $400-500. You’d get significantly more VRAM and much better performance headroom across the board, especially if you're trying to play modern titles with enhanced features.
 
As I’ve mentioned several times, if you’re spending around $500 on a brand-new graphics card in 2025 and still playing at 1080p, that card should absolutely be able to handle high or ultra settings without stability or performance concerns.
I think part of the problem, at least in my eyes, is the currently inflated pricing. MSRP on the 8GB 5060 Ti card is $380. I'd imagine the 5060 8GB will be closer to $300... now, whether you can buy them at that price is a different story, so it is what it is, I get that. The answer is to wait or, as you said, get an older, higher-tier card with more vRAM to serve your purpose. But the answer, to me, ISN'T to pour more vRAM on these budget models and make them more expensive for the minority who want more RAM.
 
Oh, 100% agreed, the absolute largest issue is the value proposition due to the inflated prices.

The 5060 Ti on Newegg [with 8GB] has a few cards in stock ranging from $420 [nice] to $470, with some out of stock at over $500.

Ultimately, I think it can be summed up that I'm tired of all of the #winning.
 
3060 was 12GB because Nvidia decided at the time 6GB wasn't cutting it.

Not everyone has the same needs, and especially on the really low end, cost is a more sensitive factor. Balancing costs is something that has to be carefully considered, and adding more VRAM that buyers don't need or want is extra unnecessary cost. Before there is any misunderstanding, the 5060 probably could do with going 12GB, perhaps as a Super refresh. Nvidia would have projected all this years ago, but there are many things outside their control. If 3GB modules weren't available in volume in time, 8GB is what they have to go with until that changes.

Yeah but it's been a loooong time at 8GB. And I kinda think... with Unreal Engine 5 (and whatever comes after it...) we DO kinda all have the same needs at this point.

Monster Hunter Wilds and Robocop (of all things...) were the first games I'd run into in a LOOOONG time (possibly ever) that were unplayable for me.

And I don't even CARE about framerates or anything. They would literally freeze up or crash.

I've heard similar things about Oblivion Remastered and some other recent games.

We're FAST approaching that limit where 8GB just won't be enough.
 