
Video cards and GDDR memory cooling


magellan

Member
Joined
Jul 20, 2002
Considering how close components are together on video cards, would cooling the VRAM have a beneficial cooling effect on other components, like the GPU itself or the VRMs? I'm just thinking the GPU has many copper traces connected to the memory (more than 256 traces to each and every memory chip), and copper is a good conductor of both heat and electricity.
 
Well, I believe most video cards cool the VRAM and VRMs with the stock air cooler anyway. Not sure I understand what you mean. Do you have a water block on the GPU, with the VRAM and VRMs now exposed?
 
The cooler the better, as always, but it will have no real-world effect, since the memory doesn't get hot in the first place... much like system memory. Keeping the VRMs cool is key (but not in the way I think you're getting at).
 
Like ED said, I know that memory chips don't overheat (well, maybe if you push a lot of voltage through them they do), but I was thinking that if you cooled the memory, it might help keep the VRMs and GPU cooler, since they're attached to the VRAM through many copper traces on the PCB. Even on a 256-bit DDR3 or GDDR4 memory bus there are more than 512 traces connecting the GPU to the memory chips, and all of them are probably copper as well.
 
But if the vRAM chips aren't getting hot how do you propose keeping them cool?

Cooling happens by a temperature differential. The lower the differential the less effective (and less necessary) the additional cooling.
 
But if the vRAM chips aren't getting hot how do you propose keeping them cool?

Cooling happens by a temperature differential. The lower the differential the less effective (and less necessary) the additional cooling.

+1, as well as what rainman33 said. Most coolers on mid- to high-end cards (as in the ones where you'd want added cooling) will also cover the memory. And like system RAM, as long as you have any sort of airflow around the area, the chips may get a touch warm, but not to the point where it's worth the cash to get heatsinks for them.
 
Well yes, by Newton's law of cooling, cooling is proportional to the temperature gradient -- as long as the gradient isn't large. The RAM chips are cool, they are connected to the GPU by many fine copper traces, the GPU is hotter. Copper conducts heat and electricity easily.
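Just to put rough numbers on the "cooling by proxy" idea, here's a quick Fourier's-law estimate of how much heat a single copper trace can actually move. All the dimensions and temperatures below are assumptions for a generic 1 oz copper PCB trace, not measurements from any particular card:

```python
# Rough estimate of heat conducted along one GPU-to-VRAM copper trace,
# using Fourier's law of heat conduction: Q = k * A * dT / L.
# All dimensions/temperatures are assumed values, not measured ones.

K_COPPER = 385.0      # thermal conductivity of copper, W/(m*K)
WIDTH = 0.1e-3        # trace width: 0.1 mm (assumed)
THICKNESS = 35e-6     # 1 oz copper foil: ~35 micrometres
LENGTH = 25e-3        # GPU-to-memory-chip run: 25 mm (assumed)
DELTA_T = 40.0        # e.g. GPU die ~80 C vs VRAM ~40 C (assumed)

area = WIDTH * THICKNESS                          # cross-section, m^2
q_per_trace = K_COPPER * area * DELTA_T / LENGTH  # watts per trace

print(f"Heat conducted per trace: {q_per_trace * 1e3:.2f} mW")
```

With these numbers it works out to a couple of milliwatts per trace, so each individual trace moves a trivial amount of heat.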
 
You are welcome to keep spinning your tires, or you can test it yourself, since it's your theory...
 
Surface is everything.
I don't know how to explain this without going into horrendous detail, or oversimplifying so much that it becomes just plain wrong... but I'll try.

If you start digging into the properties of copper, you'll find that its electrical and thermal conductivity are not quite the same...

Wikipedia, the Engineering ToolBox, dedicated engineering & physics sites...

... so it comes down to this: electricity will "move" faster than heat in the same conduit. Or: in order to "transport" 1 W of heat, you need a much wider trace than to "transport" 1 W of electricity.

So while your "cooling by proxy" will actually work, its efficiency will be "below expectations".



NOTE: Yes, I know I should have left the watts out of it, because it makes the statement not strictly accurate, but the other option would be to drag Archimedes & funnels & pipes into it :)
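A back-of-the-envelope sketch of RnRollie's point: comparing how much copper cross-section it takes to move 1 W thermally versus electrically over the same run. All dimensions, voltages, and temperatures below are assumed values for illustration:

```python
# Compare the copper cross-section needed to move 1 W of heat vs deliver
# 1 W of electrical power along the same 25 mm run. All figures assumed.

RHO_CU = 1.7e-8    # electrical resistivity of copper, ohm*m
K_CU = 385.0       # thermal conductivity of copper, W/(m*K)
LENGTH = 25e-3     # run length, m
DELTA_T = 40.0     # available temperature difference, K (assumed)

# Thermal: Fourier's law, Q = k*A*dT/L  ->  A = Q*L/(k*dT)
a_thermal = 1.0 * LENGTH / (K_CU * DELTA_T)  # m^2 to conduct 1 W of heat

# Electrical: a 0.1 mm x 35 um trace delivering 1 W at 3.3 V (~0.3 A)
a_trace = 0.1e-3 * 35e-6                     # actual trace cross-section
r_trace = RHO_CU * LENGTH / a_trace          # trace resistance, ohms
i = 1.0 / 3.3                                # current for 1 W at 3.3 V
loss = i**2 * r_trace                        # watts wasted in the trace

print(f"Cross-section to conduct 1 W of heat: {a_thermal * 1e6:.2f} mm^2")
print(f"Ratio vs the real trace: {a_thermal / a_trace:.0f}x")
print(f"I^2*R loss while delivering 1 W electrically: {loss * 1e3:.1f} mW")
```

Under these assumptions, the trace would need to be hundreds of times thicker to conduct 1 W of heat than it needs to be to deliver 1 W of power, which is RnRollie's point in numbers.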
 
Copper wire, even small-gauge stuff, seems to heat up pretty quickly when a soldering iron is applied.
 
Copper wire, even small-gauge stuff, seems to heat up pretty quickly when a soldering iron is applied.

That is because you are applying direct heat to it, and a lot of it. Copper has good heat transfer capabilities, but it moves electricity faster, per RnRollie's statement.
 
No one is saying copper isn't a good conductor of heat. It should be obvious, as any half decent heatsink will have at least some copper in it.

What we're saying is that it's not like the MOSFETs on a low-end motherboard that rely on the traces to conduct heat away. Any heat conducted back to the RAM isn't going to be significant enough to justify quality heatsinks for the chips. The cost of a set of small heatsinks will be about $15-$20. Assuming you're going to be getting an aftermarket heatsink for the GPU itself anyway, it'd probably be better to put that money toward the heatsink or maybe better fans. :shrug:
 
That is because you are applying direct heat to it, and a lot of it. Copper has good heat transfer capabilities, but it moves electricity faster, per RnRollie's statement.

The thermal conductivity of a substance doesn't change with respect to the amount of heat being applied. 500 watts or 5 watts, its thermal conductance is the same.

There are a lot of traces connecting the GPU to each memory chip, especially if it's GDDR3 or GDDR5 using a differential data transmission scheme, since that requires two separate lines for each bit. Then there are the address bus lines. All these lines are connected directly to the die of the GPU, the hottest part of the IC.
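As a sanity check on the aggregate effect, here's a rough upper bound on the heat all those traces together could conduct, compared with the heat the GPU itself dumps. The trace geometry, temperatures, and the 200 W board-power figure are all assumptions:

```python
# Upper-bound estimate: total heat conducted through every GPU-to-memory
# trace (Fourier's law per trace, times the trace count), compared with a
# typical GPU board power. Geometry and power figures are assumptions.

K_CU = 385.0                 # thermal conductivity of copper, W/(m*K)
AREA = 0.1e-3 * 35e-6        # 0.1 mm x 35 um trace cross-section (assumed)
LENGTH = 25e-3               # 25 mm GPU-to-chip run (assumed)
DELTA_T = 40.0               # die-to-VRAM temperature difference (assumed)
N_TRACES = 512               # per the bus-width figure quoted in the thread
GPU_POWER = 200.0            # typical high-end GPU board power, W (assumed)

q_total = N_TRACES * K_CU * AREA * DELTA_T / LENGTH
print(f"Total heat via traces: {q_total:.2f} W "
      f"({100 * q_total / GPU_POWER:.1f}% of GPU heat)")
```

So even with all 512 traces counted, the heat moved this way comes out to around a watt, a fraction of a percent of what the GPU dissipates, which is why the "cooling by proxy" effect is real but tiny.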
 
Then cool them and see what happens. We're all trying to give you logical responses, but you keep coming back with something to negate them. Just do it, then, and let us know how it works out.
 