
Does AMD need more VRAM to do the same thing?


Kenrou
Member · Joined Aug 14, 2014
Do AMD GPUs use more VRAM than Nvidia GPUs to do the same thing? My recent benchmarking shows AMD GPUs allocating more VRAM when available, and now I am seeing the 8GB 7600 underperforming vs the 8GB 4060 in games that seem to go beyond the 8GB allocation on AMD hardware but not Nvidia hardware. I'd like to see more testing on all of this, as this video is really just a byproduct of my UE5 benchmarking, and the testing wasn't really designed with isolating this variable in mind.
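
For anyone who wants to poke at this themselves, here's a minimal sketch of how you could log device-level VRAM use during the same benchmark pass on both cards. The Nvidia side uses NVML via the pynvml package and the AMD side reads the amdgpu driver's sysfs counter on Linux; the card index, duration, interval, and output filename are just placeholders for your setup, and keep in mind this reports allocated VRAM for the whole device (desktop and background apps included), not what the game strictly needs.

```python
# Minimal VRAM-allocation logger for comparing two cards on the same run.
import csv
import time

def make_nvidia_sampler(index=0):
    """Sampler for Nvidia cards via NVML (pip install pynvml)."""
    import pynvml
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(index)
    # NVML reports bytes; convert to MiB.
    return lambda: pynvml.nvmlDeviceGetMemoryInfo(handle).used / (1024 ** 2)

def make_amd_sampler(card=0):
    """Sampler for AMD cards via the amdgpu sysfs counter (Linux only)."""
    path = f"/sys/class/drm/card{card}/device/mem_info_vram_used"
    def sample():
        with open(path) as f:
            return int(f.read()) / (1024 ** 2)  # sysfs value is in bytes
    return sample

def log_vram(sample, seconds=120, interval=1.0, out="vram_log.csv"):
    """Poll the sampler once per interval and write timestamped MiB values."""
    with open(out, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t_seconds", "vram_used_mib"])
        start = time.time()
        while time.time() - start < seconds:
            writer.writerow([round(time.time() - start, 1), round(sample(), 1)])
            time.sleep(interval)

if __name__ == "__main__":
    # Pick the sampler that matches the card in the test bench.
    log_vram(make_nvidia_sampler())   # or: log_vram(make_amd_sampler())
```

Running the same scripted pass on the 7600 and the 4060 and overlaying the two CSVs would at least show whether the AMD card allocates more when it has headroom, which is the variable the video wasn't designed to isolate.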

 
Haven't come across this guy before. The question he raises is certainly interesting. I'm not halfway through the video yet, but I'll also check out more of their stuff.
 
Nvidia has had better texture compression for several generations now, so it should use less VRAM. The data request probably ends up as a "smaller data package", but the predicted allocation shouldn't be any different.
In my direct comparison between the RX 7600 8GB and the RTX 4060 8GB in benchmarks and games, I wasn't comparing VRAM usage, but both cards were equally useless in some more demanding titles or at 4K with larger textures. It wasn't even about VRAM size, but the architecture. I mean, who cares whether a card gets 5 or 15 FPS when neither is playable? Neither card is designed for 1440p or higher at max details. The RTX 4060 is generally a bit faster, too.
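
For a sense of scale on the "smaller data package" point, here's a rough footprint calculation for ordinary block-compressed (BC) texture formats. The bits-per-texel values are the standard BC figures, the 4/3 factor approximates a full mip chain, and the 4096x4096 resolution is just an example; this says nothing about either vendor's additional in-hardware compression.

```python
# Back-of-the-envelope VRAM footprint per texture for a few common formats.
BITS_PER_TEXEL = {"RGBA8 (uncompressed)": 32, "BC7": 8, "BC1": 4}

def texture_mib(width, height, bits_per_texel, mips=True):
    """Approximate footprint of one texture in MiB, optionally with mip chain."""
    base_bytes = width * height * bits_per_texel / 8
    return base_bytes * (4 / 3 if mips else 1) / (1024 ** 2)

for fmt, bpt in BITS_PER_TEXEL.items():
    print(f"4096x4096 {fmt}: {texture_mib(4096, 4096, bpt):.1f} MiB")
# Roughly 85 MiB uncompressed vs ~21 MiB (BC7) or ~11 MiB (BC1) -- a
# "smaller data package" adds up fast across a few hundred textures.
```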
 
It's crazy... but just by knowing your username that's EXACTLY what I thought you'd look like...

...but not AT ALL what I thought you'd sound like! hahahaha :D
 