We had an interesting discussion here about the Storm and the Apogee. Let me summarize:
1. SystemCooling ran tests showing that the Storm is clearly the better waterblock, with significantly lower thermal resistance than the Apogee. The tests were conducted first with a 14mm x 14mm die simulator and later with a 36mm x 36mm one (a quick note on what that thermal resistance figure means follows the links):
14mm
36mm
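For reference, the C/W figure those simulator tests report is just the die-to-water temperature difference divided by the applied heat load. A minimal sketch in Python; every number here is made up for illustration and is not from the SystemCooling data:

```python
# Waterblock thermal resistance: C/W = (T_die - T_water) / P.
# All readings below are hypothetical, not the SystemCooling results.

def thermal_resistance(t_die_c: float, t_water_c: float, power_w: float) -> float:
    """Die-to-water thermal resistance in C/W."""
    return (t_die_c - t_water_c) / power_w

# Hypothetical readings from a 14mm x 14mm die simulator at 70 W:
storm = thermal_resistance(t_die_c=38.5, t_water_c=30.0, power_w=70.0)
apogee = thermal_resistance(t_die_c=40.0, t_water_c=30.0, power_w=70.0)

print(f"Storm:  {storm:.3f} C/W")   # 0.121 C/W
print(f"Apogee: {apogee:.3f} C/W")  # 0.143 C/W
```

With numbers like these the gap looks big, but note the absolute scale: hundredths of a degree per watt.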
2. When the two waterblocks were tested in real setups, with real processors and a real WC rig, the differences disappeared, and the Apogee even got slightly better results (lower CPU temperatures) than the Storm. A few tests:
SystemCooling
Cooling-Masters
OverClock Intelligence Agency
The question is: why the big difference, and what should we believe? A few explanations were offered by Otter and me:
"The thermal diode inside the CPU might not be properly placed."
I looked but found no evidence or information about this for modern processors. The thermal diode is definitely located within the core, but is it located in the hottest part? If not, the Storm could be cooling the hottest spots better than the Apogee while still producing the same temperature reading. On the other hand, the Cooling-Masters test indicates that the Apogee was able to move heat into the water more effectively (the CPU-Eau column) with those two Intel Pentiums, and diode placement should not affect that: since the CPU-to-water temperature difference was smaller at the same heat load, the effective thermal resistance was lower, i.e. heat was transferred into the water more effectively.
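To put rough numbers on that reasoning: at the same heat load, a smaller CPU-to-water delta translates directly into a lower effective thermal resistance. A small sketch; the deltas and the CPU power are assumptions for illustration, not the published Cooling-Masters figures:

```python
# Effective CPU-to-water resistance from a "CPU-Eau" style delta.
# Unlike the bare-die simulator figure, this lumps in the IHS and TIM.
# Deltas and power are assumed for illustration.

cpu_power_w = 100.0  # assumed Pentium heat load

for block, cpu_eau_delta_c in [("Apogee", 11.0), ("Storm", 12.0)]:
    r_eff = cpu_eau_delta_c / cpu_power_w  # C/W
    print(f"{block}: {cpu_eau_delta_c:.1f} C delta -> {r_eff:.3f} C/W effective")
```

Whatever the diode placement, the block with the smaller delta is moving heat into the water at a lower resistance.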
"The Intel TTV simulator might not simulate reality very well."
Simulators most likely provide very repeatable results. But a copper block standing in for a real CPU obviously involves some compromises. Can they affect the test results, and by how much? I found this very recent article at Overclockers:
The Evolution of Aftermarket Heat Sink / Waterblock Testing
According to Bill Adams there are some definite problems when testing waterblocks with a die simulator: measurement location differences, secondary heat path losses, wrong copper slug dimensions, and poor flatness of the slug face. He also mentions this Apogee/Storm dilemma and points out that the integrated heat spreader and the thermal interface materials also contribute to the thermal characteristics of a CPU; all of this should be taken into account when using a simulator. Very interesting reading: "Today, the copper slug with a thermocouple is no longer capable of yielding test results predictive of the cooling device's performance on a specific CPU."
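The measurement-location problem is the easiest one to put a number on. If the thermocouple junction sits some distance below the slug face, the conduction drop through that copper is added to every reading. A rough estimate; the 1 mm depth and 70 W load are my assumptions, only the 14mm x 14mm face comes from the test above:

```python
# Error from a thermocouple embedded below the slug face:
# the copper layer adds dT = P * L / (k * A) to the reading.
# Depth and power are assumptions for illustration.

k_copper = 390.0       # W/(m*K), thermal conductivity of copper
slug_area = 0.014**2   # m^2, 14mm x 14mm die simulator face
depth = 0.001          # m, assumed 1 mm junction depth below the face
power = 70.0           # W, assumed heat load

dt_offset = power * depth / (k_copper * slug_area)
r_offset = depth / (k_copper * slug_area)

print(f"Temperature offset: {dt_offset:.2f} C")   # ~0.92 C
print(f"Resistance offset:  {r_offset:.4f} C/W")  # ~0.0131 C/W
```

An offset of roughly 0.013 C/W is not negligible next to the differences between good waterblocks, so whether and how a tester corrects for it really matters.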
So which one is better? In my opinion, tests made with real systems give more meaningful data than tests made with simulators, despite the potential flaws in that kind of testing. It is also very common that testing components in isolation makes the differences look bigger than they really are: for most users, the Apogee and the Storm have very similar performance. The same held true in the D5/DDC comparison in another thread.
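One way to see why isolated component differences shrink in a full system: the CPU temperature a user sees is the heat load times the sum of all the resistances in the chain, not just the block's. A sketch; every resistance value below is an assumption for illustration:

```python
# CPU temperature over ambient = P * (sum of series thermal resistances).
# All values are assumptions for illustration.

power_w = 100.0
chain = {
    "IHS + TIM":       0.10,  # C/W
    "waterblock":      0.13,  # C/W, varied below
    "radiator + fans": 0.10,  # C/W at a fixed airflow
}

for waterblock_r in (0.13, 0.11):  # two blocks 0.02 C/W apart
    chain["waterblock"] = waterblock_r
    delta_t = power_w * sum(chain.values())
    print(f"block {waterblock_r:.2f} C/W -> CPU {delta_t:.1f} C over ambient")
```

A 0.02 C/W gap between blocks is about 2 C at 100 W: real, but easy to lose in mounting variance and a diode that reads in whole degrees.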
Any comments on this?