
Memory??

Can someone explain to me how come the RAM in your computer can't somehow be used by the GPU? Also, in 2024 I think any GPU :unsure: over $500 should have at least 16 GB of VRAM.
 
Because accessing system RAM is a lot slower, in both latency and speed, than accessing the VRAM integrated on the GPU (GDDR6X, for example).

Also, in 2024 I think any GPU :unsure: over $500 should have at least 16 GB of VRAM.
Random take...fair enough, lol.

The ones with 12GB aren't choking at any resolution they're appropriate for. So you'd have more than you may ever use, and pay for it. I'd be plenty happy buying a 12GB card today for 2560x1440 and most 4K games. So long as you aren't modding games, 12GB is plenty... remember most users are at 1080p (~60%), with 4K representing a mere 4% (2560x1440 is around 17%) according to Steam stats. So it's cool to wish for more, but most users just plain don't need it. And by the time they do, the GPU can be too slow to utilize it for modern titles in the first place. ;)
 
OK, I understand now, but IMO... I play at 1440p 144 Hz (1 ms GtG) and I do have a few games that come close to 12 gigs. And games like Alan Wake 2 and Cyberpunk 2077 I might not be able to play on the highest settings. I have the first Alan Wake on highest settings and it's like 50-60 FPS. My two 1080 Ti SC2s don't play Crysis on max settings even using both cards. I know we all want the best when it comes to gaming, but I have learned over time that sometimes you just can't max out the settings.
 
I have the first Alan Wake on highest settings and it's like 50-60 FPS.
Does that have anything to do with vRAM limits, though? If you're using 12GB it could... otherwise, it's just the game/coding.

Remember, just because a game allocates more VRAM on a card that has more doesn't mean it translates directly to better FPS. Allocated and in use are different things. It's why in some titles vRAM looks 'full' yet there aren't any performance penalties.
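If you want to check whether you're actually bumping into the limit, one quick way is to poll the driver-reported VRAM figure while the game runs. A minimal sketch below, assuming Python with the nvidia-ml-py (pynvml) package and an NVIDIA card; keep in mind the number it prints is what the driver has allocated/reserved, not what the game is actively touching each frame, which is exactly the allocated-vs-in-use distinction.

```python
# Minimal sketch: poll driver-reported VRAM usage while a game runs.
# Assumes an NVIDIA GPU and the nvidia-ml-py (pynvml) package installed.
# Note: the figure reflects allocated/reserved memory, not what the game
# is actively touching each frame.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GiB")
        time.sleep(2)  # poll every couple of seconds
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```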

In the end, 16GB is better, but most of the time it isn't the limiting factor is all I'm getting at. I'd rather have the more powerful card with less video RAM for how I play my games (2560x1440, basic ultra settings, turn down to reach FPS targets if needed). However, if you play 4K now and have a longer GPU lifecycle (read, like 5 years), it can be beneficial. Would I buy a 4070 Super for 4K? No... I would get 4070 Ti Super for 4K.

My two 1080 Ti SC2s don't play Crysis on max settings even using both cards.
SLI doesn't add up the memory. It's mirrored. ;)

but I have learned over time sometimes you just can't max out the settings.
That's true but it isn't always attributed (I'd say more rarely) to a lack of vRAM. If you're running out of vRAM, you have too little card for the resolution you're trying to play.
 
I play at 4K 60 with my 12GB card, and I am fairly content. Sure there are some games that go out of their way to make things difficult, but it is usually an easy fix to get back up to 60.
 
I've only seen one game use > 12 GiB with my 4090: Deus Ex: Mankind Divided, but that was at 4K resolution. The Last of Us Part 1 comes close to using 12 GiB.

I've managed to get Terminator: Resistance to use even more VRAM (18825MiB) and system RAM (13725MiB) by stupidly modifying the config files.
 
Can someone explain to me how come the RAM in your computer can't somehow be used by the GPU? Also, in 2024 I think any GPU :unsure: over $500 should have at least 16 GB of VRAM.
GPUs can use system RAM as spill-over. The thing is, you usually don't want them to, because it is so much slower to access. Latency has been mentioned, but IMO it doesn't matter here; when you're transferring GBs of data, bandwidth is king. Picking a 4070 as an example, that has just over 500GB/s of bandwidth on card. Say you have a dual-channel DDR5 system running at 6000 MT/s. That's just under 94GB/s. But that isn't the only problem: the data has to go over PCIe between the GPU and system RAM, and PCIe 4.0 x16 is only about 32GB/s of bandwidth. Even doubling that for PCIe 5.0, when it eventually arrives on GPUs, you're still quite a bit short. NVLink provided an additional connection outside of PCIe; on GA102 that offered 56GB/s in each direction. It helps, but it's still far below what the GPU's local VRAM is capable of.
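To put those figures side by side, here's a rough back-of-the-envelope sketch in Python using the approximate numbers above (PCIe per-lane throughput rounded to 2 GB/s); it's illustrative only, not exact spec values.

```python
# Back-of-the-envelope bandwidth comparison using the approximate figures above.
rtx4070_vram = 504.0                     # GB/s, on-card GDDR6X (approx.)
ddr5_6000_dual = 6000e6 * 8 * 2 / 1e9    # 6000 MT/s x 8 bytes x 2 channels = 96 GB/s
pcie4_x16 = 32.0                         # GB/s, roughly 2 GB/s per lane x 16 lanes
pcie5_x16 = 2 * pcie4_x16                # GB/s, double PCIe 4.0
nvlink_ga102 = 56.0                      # GB/s per direction (approx.)

for name, bw in [
    ("RTX 4070 VRAM", rtx4070_vram),
    ("DDR5-6000 dual channel", ddr5_6000_dual),
    ("PCIe 4.0 x16", pcie4_x16),
    ("PCIe 5.0 x16", pcie5_x16),
    ("NVLink on GA102", nvlink_ga102),
]:
    print(f"{name:<24} ~{bw:5.0f} GB/s  ({bw / rtx4070_vram:.0%} of on-card VRAM)")
```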

IMO the perceived problem of VRAM quantity is more about some gamers having unrealistic expectations of applicable settings. While I still fall into the same trap myself when using these terms, "high" and "ultra" are not standardised across game developers. Appropriate settings can be picked for the hardware; there is no guarantee you can use the highest presets at high resolutions with any GPU, although a 4090 certainly would help. For cross-platform games, the better current-gen consoles on their "performance" setting are roughly equivalent to low-to-medium settings on PC. Anything over that is a nice bonus for PC gamers.
 
I'm running 1440p 165Hz (120fps+ realistically on most games), and my 8GB is more than enough. The only game so far where I've had to cherry-pick settings because of VRAM was The Last of Us, and with the latest patches even that runs at mostly ultra settings without hitting the limit. Side note though, I don't usually use RT 🤷🏻‍♂️
 
Thank you for that great explanation :).
 
I hear you, man. One thing that annoys the piss out of me about armchair reviewers is that for eons they were complaining 8GB was not enough in 202x. While those guys were complaining, I was playing at 4K with my 3070 Ti. And a bunch of those same guys are crying about 12GB cards when they don't even own the damned hardware lol.

Ugh :blah:
 
How about we go back to when you could buy RAM to add onto your video card? I know the RAM was a lot slower back then, but still. Matrox Millennium II with 4MB, plus a 4MB RAM add-on card. Even older ISA video cards you could just pop a chip into, granted ISA cards had socketed RAM spaces for adding more. The Matrox card had pin-like sockets, think like hooking up your LED/HDD LED/power button type headers. Though if NV/AMD started doing that, RAM add-on cards are going to cost. Then NV/AMD will pass along the cost of the board redesign and the money lost on you not buying the higher-end card with more RAM. One way or the other they are going to make their money off you, be it a $500 8GB card or a $500 12GB card.
 
The area in red is the onboard RAM, the area in yellow is the add-on RAM. VL-Bus video card :)

[Attached image: Picture 003a.jpg]
 
All I know is that I got suckered into the whole double-the-RAM thing with the NVIDIA 5200. This was back in the day when RAM was measured in MB, not GB. I spent extra on two cards with double the RAM thinking that I was doing well. Turns out the whole platform (5200) was junk. That was then, this is now.
 