
Does the RTX 3000 Series Have Too-Little VRAM?


rainless

https://www.overclock3d.net/news/so...t_4k_ultra_settings_-_the_rtx_3080_has_10gb/1

This would bother the hell out of me if I plunked down 800 bucks for a top-of-the-line card... only to discover I couldn't play some top shelf games at their highest settings.

Complicating matters is the fact that the Radeon 6000 series cards will all have 16GB of VRAM, which will be their new standard. (And they'll probably be cheaper.)

Obviously I'm never buying another Radeon card again (mainly due to compatibility issues with the various editing software I use) but it does make me want to sit on my 2060 Super a little longer than I otherwise would've.

And how much VRAM do the next-gen consoles have if they're going to be able to handle something like Godfall?

The argument seems to be that the 3000 series could be outdated before it is even widely available. (Besides the 3090 of course...)
 
I would be more worried that Nvidia is planning on releasing the 3000 Ti series of cards as soon as AMD releases their 6000 series cards. These cards will have more memory, run faster, and have a better price than the already-announced 3000 series cards. These Ti cards are made specifically to compete against AMD's offering and are the real Nvidia cards.
 
The lower amount of memory on the Nvidia cards isn't supposed to be a problem. This is the reason Nvidia has spent a lot of $$$ to develop DLSS 2.0. It takes a lower-res frame (1080p or 1440p) and upscales it to 4K, so that 12GB, 14GB, or 16GB of memory isn't needed.
IIRC AMD is working on their own version. Currently AMD renders at a higher res (5K or 6K) and then downscales it to 4K to do its AA. This is all about who can do MSAA/FSAA without a big performance hit.
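To put some rough numbers on why a lower internal resolution helps, here's a back-of-the-envelope sketch (the buffer format is my own assumption, not anything from Nvidia, and remember render targets are only one slice of VRAM next to textures and geometry):

```python
# Rough size of a single render target at different internal resolutions.
# bytes_per_pixel=8 assumes a 16-bit-per-channel RGBA buffer (a common HDR
# format); real engines juggle many buffers in different formats, so treat
# this purely as an illustration.

def render_target_mib(width, height, bytes_per_pixel=8):
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name}: {render_target_mib(w, h):.0f} MiB per buffer")

# 4K has 2.25x the pixels of 1440p, so every per-pixel buffer shrinks by
# that factor when DLSS renders internally at 1440p and upscales to 4K.
```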
 
The lower amount of memory on the Nvidia cards isn't supposed to be a problem. This is the reason Nvidia has spent a lot of $$$ to develop DLSS 2.0. It takes a lower-res frame (1080p or 1440p) and upscales it to 4K, so that 12GB, 14GB, or 16GB of memory isn't needed.
IIRC AMD is working on their own version. Currently AMD renders at a higher res (5K or 6K) and then downscales it to 4K to do its AA. This is all about who can do MSAA/FSAA without a big performance hit.

Well, that's precisely how camera manufacturers are doing hi-res video... they take a 6K sensor and downscale it to 4K for image quality... or an 8K sensor and downscale it to 6K. I can tell you... from a video point-of-view... it works EXTREMELY well.

But I do believe we're nearing the point where video card manufacturers need to stop cheaping out on us with VRAM.

My editing software... DaVinci Resolve... will take as much VRAM as you have. And the more you have, the more layers of effects (nodes) you can add to your video.

If we look at VRAM historically... the amount usually doubled each generation. It hit 1GB of VRAM... 2GB... 4GB... then, for some reason, they released a 1060 with only 3GB of VRAM and a higher-end version with 6GB... which seemed like a step backwards to me. It so confused me I skipped the entire "1000" series.

Should've been 8GB. And the 3000 series, in my opinion, should already be at 16GB. (It's the next logical thing.) And the 3090 series... instead of 24GB (which... again... is just a weird number...) should be 32GB.

I'm actually happy that Radeon stuck to doubling the amount of RAM because it makes... sense.

I just don't want a Radeon card. I want NVIDIA to do that.
 
The other thing that skews things slightly is that Nvidia uses GDDR6X memory. How much of an effect does that have? I don't think we really know at the moment. Is it the equivalent of 12GB of GDDR6? It will take a game like Godfall being benchmarked on a 3080 and a 6800 XT to see where both cards are at with VRAM usage.
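For a rough sense of what GDDR6X actually changes (bandwidth rather than capacity), the usual back-of-the-envelope formula is bus width times effective data rate. Taking the published figures as assumptions (320-bit / 19 Gbps GDDR6X on the 3080, 256-bit / 16 Gbps GDDR6 on the 6800 XT), and noting that AMD's Infinity Cache isn't captured by this simple formula:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
# The specs below are the published figures, used here as assumptions.

def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(f"RTX 3080, 320-bit GDDR6X @ 19 Gbps: {bandwidth_gbs(320, 19):.0f} GB/s")
print(f"RX 6800 XT, 256-bit GDDR6 @ 16 Gbps: {bandwidth_gbs(256, 16):.0f} GB/s")
```

Faster memory lets the card stream assets in and out more quickly, but it doesn't change how much fits at once, so it can only partly offset a smaller pool.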

I would say this as well: the thing that makes me say the 3080 is a little short on memory is that you can already see games like Doom using 9GB of VRAM when maxed out at 4K. VRAM usage was also highlighted when Nvidia used Doom to show the 3080 being over 100% better than the 2080, as that only had 8GB. However, if the resolution was lowered slightly, the difference was about a 70% performance increase. Doom is an OK-looking game, but there are better-looking games out there with better textures, and I would assume higher VRAM usage.


 
I would be more worried that Nvidia is planning on releasing the 3000 Ti series of cards as soon as AMD releases their 6000 series cards. These cards will have more memory, run faster, and have a better price than the already-announced 3000 series cards. These Ti cards are made specifically to compete against AMD's offering and are the real Nvidia cards.

There was already news that the 16GB/20GB versions of the 3070/3080 were canceled, but also that Nvidia is working on higher Ti models like the 3090. Considering the current production problems, even if they release anything like this, it will take a couple of months. AFAIK, AMD also has production problems. I just received info that review samples are limited because AMD can't deliver enough chips on time.
 
To answer the OP, it depends on the use case. Typically 10GB is fine for 4K UHD. If you mod textures and still pour on the AA at 4K (you can get away with a lot less due to the pixel density of 4K screens), that could become a problem in a few titles over the years. From your own article...
Thankfully, gamers can counter these VRAM limitations with lowered texture settings in most games. With Microsoft's Xbox Series S offering users 10GB of combined system memory (CPU+GPU), it is unlikely that 8GB or 10GB graphics cards will become unable to play modern PC games anytime soon.

Also, there is a difference between memory allocation (memory set aside) and memory use (memory actually being used). Just because this title says it needs 12GB doesn't necessarily mean the gaming experience will suffer if the card only has 10.

Lastly, there are memory compression algorithms. Both AMD and NV use them, so that helps.

I wouldn't worry about the 10GB of RAM thing unless you're a modder and play games at 4K UHD.
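If you want to see the allocation-versus-use difference for yourself, here's a minimal sketch that polls VRAM consumption once a second while you play. It assumes an Nvidia card with the driver's nvidia-smi tool on the PATH; compare what it reports against the number the game's settings menu claims to need.

```python
# Poll the VRAM figure reported by nvidia-smi (ships with the Nvidia driver).
# Note the caveat above: this is memory the driver has handed out to running
# processes, not the working set a game actually touches each frame, so even
# this number can overstate what the game truly needs.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(out.stdout.strip())
    time.sleep(1)
```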

bigtallanddopey said:
I would say this as well: the thing that makes me say the 3080 is a little short on memory is that you can already see games like Doom using 9GB of VRAM when maxed out at 4K. VRAM usage was also highlighted when Nvidia used Doom to show the 3080 being over 100% better than the 2080, as that only had 8GB. However, if the resolution was lowered slightly, the difference was about a 70% performance increase. Doom is an OK-looking game, but there are better-looking games out there with better textures, and I would assume higher VRAM usage.
Doom with its Nightmare settings is one of the most VRAM-intensive games we have out there today. Their testing was at 4K, so it wasn't a lower resolution. Just because a game looks better doesn't necessarily mean it needs more VRAM. There are better-looking games than Doom that use less VRAM. Don't let that be your barometer.

I just received info that review samples are limited because AMD can't deliver enough chips on time.
Unless they told me the same lie, I can confirm this sentiment as well, sadly. When reaching out to partners for samples, two mentioned they wouldn't have their cards ready until 'late Dec or even 2021'... I think we'll see reference AMD cards available out of the gate, but no or very few partner cards initially.
 
There was already news that the 16GB/20GB versions of the 3070/3080 were canceled, but also that Nvidia is working on higher Ti models like the 3090.

I read that whatever was going to be called the 3080 20GB was "cancelled", not that there won't be a higher-VRAM card similar to a 3080 but with a different name, be it Ti, Super, or something else. Obviously the cancellation is from an Nvidia/AIB perspective; you can't cancel what never existed from a consumer viewpoint.

I think the latest rumour is a 3080 Ti matching the core configuration of the 3090 but with slightly lower VRAM at 20GB, so also lower bandwidth compared to the 3090. This can then go directly against the 6900 XT on specs and possibly ballpark pricing.


Unless they told me the same lie, I can confirm this sentiment as well, sadly. When reaching out to partners for samples, two mentioned they wouldn't have their cards ready until 'late Dec or even 2021'... I think we'll see reference AMD cards available out of the gate, but no or very few partner cards initially.

So it's not looking good for people wanting to buy ANY "latest gen" GPU this year? When I saw AMD's guidance about avoiding an Nvidia-like scenario, I wasn't hopeful their supply would be much better.




Back on the eternal AMD vs Nvidia debate, choices are never simple. Who knows exactly what will or will not be important for forward-facing games. I wrote on another forum a while back that I don't think being a GPU benchmarker/reviewer is going to be as simple as it has been. Fanboys on either side are already taking up positions. The AMD side is saying only raw throughput should be compared. Obviously, that negates DLSS, which can provide a good boost, and at potentially higher perceived visual quality than without. I'd hope any reviews will cover both scenarios, so we have a good idea of what the underlying power is like, but also how much the more advanced features can give on top of that. A visual quality comparison will also be required in the case of DLSS. The VRAM quantity argument I find particularly amusing given we're also getting DirectStorage, which may help to offset that.
 
I think the latest rumour is a 3080 Ti matching the core configuration of the 3090 but with slightly lower VRAM at 20GB, so also lower bandwidth compared to the 3090. This can then go directly against the 6900 XT on specs and possibly ballpark pricing.
Correct.

So it's not looking good for people wanting to buy ANY "latest gen" GPU this year? When I saw AMD's guidance about avoiding an Nvidia-like scenario, I wasn't hopeful their supply would be much better.
For AIB cards, that's what it seems to be. :(


RE: the difficulty of being a GPU reviewer... no doubt. LOL. With the advent of RT, that required additional testing. Not a huge deal, but more time. Now with AMD getting involved, that's going to take even more time: non-RT, raster-based testing along with RT.

Now with AMD introducing SAM, this presents another challenge. The performance attributed to SAM is only available on 5000 series CPUs with B550/X570 boards. There are literally zero people with that setup (until launch day). So how do you benchmark it? Why should we? You have to test with both an Intel CPU and an AMD 5000 series CPU; otherwise, you give AMD (and the growing user base with 5000 series and B550/X570) the shaft and pump up Intel. At the same time, very few will realize this performance uptick from using this specific AMD ecosystem... and even over time, the vast majority will be running Intel, non-5000 series AMD, or 5000 series with X470/B450 boards that don't support it. They won't take a notable market share loss and eat it up within a year, or even two. If they priced these lower, perhaps, but you're right in the ballpark of Intel pricing at this point, so although these chips will take the performance crown, the same complaints about being too expensive will likely hold those same people back.
 
RE: the difficulty of being a GPU reviewer... no doubt. LOL. With the advent of RT, that required additional testing. Not a huge deal, but more time. Now with AMD getting involved, that's going to take even more time: non-RT, raster-based testing along with RT.

This is almost getting me interested in PC tech again, but only as an observer, not as a tester myself this round. All these new features will add complexity that we need to understand.

Even the basic permutations are going to be numerous (see the sketch after this list):
Zen 3 CPU + RX6000, SAM on and off difference
Zen 2 CPU + RX6000
Intel CPU + RX6000
RX6000 Rage Mode
RX6000 PCIe 3.0 vs 4.0 difference (don't expect much, but just to confirm)
Various CPUs + Ampere, DLSS on and off difference
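Just to put a number on it, here's a hypothetical little sketch (the names and groupings are mine, not any benchmark suite's) that enumerates those combinations and filters out the ones that don't exist, e.g. SAM needing Zen 3 plus RX 6000, DLSS being Ampere-only:

```python
# Hypothetical enumeration of a review test matrix. Toggles a platform
# doesn't support are filtered out: SAM needs Zen 3 + RX 6000, Rage Mode
# and the PCIe 3.0 vs 4.0 check are listed for RX 6000, DLSS is Ampere-only.
from itertools import product

cpus = ["Zen 3", "Zen 2", "Intel"]
gpus = ["RX 6000", "Ampere"]
toggles = ["baseline", "SAM", "Rage Mode", "PCIe 3.0", "DLSS"]

def valid(cpu, gpu, toggle):
    if toggle == "SAM":
        return cpu == "Zen 3" and gpu == "RX 6000"
    if toggle in ("Rage Mode", "PCIe 3.0"):
        return gpu == "RX 6000"
    if toggle == "DLSS":
        return gpu == "Ampere"
    return True  # baseline runs on everything

runs = [combo for combo in product(cpus, gpus, toggles) if valid(*combo)]
for cpu, gpu, toggle in runs:
    print(f"{cpu} + {gpu} [{toggle}]")
print(f"{len(runs)} configurations, before games and resolutions multiply it further")
```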

We've not had much on the Navi RT performance, so that will be something to watch when the time comes.

Zen 3 is about to go on sale very soon, so presumably the reviews will hit then. It'll be interesting to see how that goes in gaming. My gut feeling is the 5800X will be the best pure gaming CPU, with the higher models only making sense if you need those extra cores for other use cases. The 5600X doesn't make a lot of sense to me, unless you're biased towards older games that don't use so many cores, but we'll see how that goes soon enough. Nearly 2 weeks until we find out what the GPUs do too.
 
To answer the OP, it depends on the use case. Typically 10GB is fine for 4K UHD. If you mod textures and still pour on the AA at 4K (you can get away with a lot less due to the pixel density of 4K screens), that could become a problem in a few titles over the years. From your own article...

Oh yes... I've recently become aware of this. I didn't think I would have any chance of playing Watch Dogs: Legion in 4K with ray tracing enabled. All of the articles I read seemed to indicate that this was impossible. Even the game itself threw up BIG... RED warning labels that seemed to indicate that both CPU and GPU usage would be "VERY HIGH."

...they weren't. Not with most of the settings at "Medium." Not even close.

I really didn't think I'd be able to pull that off with a 2060 Super... but here we are.
 
On the AMD GPU shortage: IIRC I read somewhere that the PS5 has sold more in pre-orders than the PS4 did in its first 3 months 😱
I'm not sure how this will affect their CPUs, as the PS5 is based on Zen 2. IIRC Zen 3 is also on the 7nm node, so the extra sales may affect it.
 
On the AMD GPU shortage: IIRC I read somewhere that the PS5 has sold more in pre-orders than the PS4 did in its first 3 months 😱
I'm not sure how this will affect their CPUs, as the PS5 is based on Zen 2. IIRC Zen 3 is also on the 7nm node, so the extra sales may affect it.

Sony and Microsoft would have had to place their chip orders long ago, so that is a known demand. If they negotiate any increase in those numbers, it would still take time to work through the system, since it isn't something you do on short notice.

Zen 2 at least is more within AMD's own control. Presumably they would have planned a phase-out of production of those, and a phase-in of Zen 3, to manage continued supply of existing and new demand. The unknowns would mostly come from not knowing exactly what the demand is. There is more uncertainty in the world than ever, so what might have held in the past may no longer be the case, and we'll see shortages in some areas if they were not predicted when the products were being prepared.
 