Just throwing this one out there for discussion.
Now, for a perfectly insulated heat die whose only path of heat dispersal is through the waterblock, the opening statement is going to be false: watts in equals watts out through the waterblock, so no matter what, the water will always receive the same amount of heat and therefore will always settle at the same temperature, given a water-cooling source of fixed performance and a system at equilibrium.
What the opening question relates to is real CPU systems. I believe it has been fairly conclusively established that the CPU dissipates a not-insignificant amount of heat through the CPU pins and into the area of the motherboard surrounding the socket. This is most easily seen when watercooling a CPU and watching the on-die thermistor: when the case is well ventilated and a fan is introduced to blow over the socket area, substantially lower on-die temperatures result (2-5C for modern CPUs). This holds even when the water-cooling radiator is far removed from the case, where it cannot be affected in any significant way by the socket airflow.
So clearly there are two cooling paths here: the primary cooling path through the waterblock, and a secondary cooling path through the motherboard.
Therefore in a computer system we now have a tied relationship between the waterblock cooling performance, the motherboard cooling effect, and the resultant water temperature.
Let's say, for example, that the motherboard has a C/W of 1 and the waterblock has a C/W of 0.25, figures that I believe are not too distant from reality. This would mean the cooling system has a total C/W of 0.20. I could be wrong here, but I'm treating the thermal resistances like parallel electrical resistances, using Rt = (R1 × R2) / (R1 + R2) to arrive at that figure.
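Worked out in code, the parallel-resistance figure looks like this (just a minimal sketch; the C/W values are the example figures above, not measurements):

```python
def parallel_cw(r1: float, r2: float) -> float:
    """Combine two thermal resistances (C/W) sharing one heat source,
    treated like parallel electrical resistors."""
    return (r1 * r2) / (r1 + r2)

# Example figures from above: motherboard ~1 C/W, waterblock ~0.25 C/W
print(parallel_cw(1.0, 0.25))  # 0.2 C/W total
```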
Let's simplistically assume that the water is a constant ambient temperature for now.
If the CPU is emitting 60W of heat, then the CPU will reach 12C above ambient (60W × 0.20 C/W), of which 48W is being dissipated by the waterblock (12 / 0.25) and 12W by the motherboard's secondary heat paths (12 / 1). i.e. 48W of heat is entering the waterblock, and therefore the water.
Now let's assume we plug in a block with a C/W of 0.20. The total thermal resistance of the system is now 0.1667 C/W: (1 × 0.2) / (1 + 0.2).
For the same 60W CPU, the CPU now reaches 10C above ambient, and the waterblock is dissipating 50W of that 60W total (10 / 0.2).
i.e. 2W more heat is entering the water with the more efficient waterblock.
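To sanity-check that 2W figure, here's the same arithmetic as a quick sketch (assuming, as above, a 60W CPU, water held at ambient, and the parallel-resistance model):

```python
def heat_split(power_w: float, cw_motherboard: float, cw_block: float):
    """Return (die temp above ambient, watts into block, watts into board)
    for a heat source feeding two parallel thermal paths."""
    cw_total = (cw_motherboard * cw_block) / (cw_motherboard + cw_block)
    dt = power_w * cw_total        # die rise above ambient (C)
    q_block = dt / cw_block        # heat through the waterblock (W)
    q_board = dt / cw_motherboard  # heat through the motherboard (W)
    return dt, q_block, q_board

dt1, q1, _ = heat_split(60, 1.0, 0.25)  # 12.0 C, 48.0 W into the water
dt2, q2, _ = heat_split(60, 1.0, 0.20)  # 10.0 C, 50.0 W into the water
print(q2 - q1)                          # ~2.0 W extra into the loop
```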
Is there something wrong with my logic? 2W is not a lot, and it's certainly not large enough for anyone to pick up on by watching their water temps (re: the HardOCP comments about more efficient waterblocks overloading a radiator), but if the theory is sound, the effect is still there. It's not zero, and it's not large, but it is real. Yes?