Cjwinnit said:
Hmm...
Maybe it's time we made a few calculations.
Points per kilowatt-hour.
Could be interesting.
We'd need a few figures, of course, for the related computational power per kWh:
1) system usage in watts = A
2) MHz (or PR rating, if AMD) = B
3) waste heat in watts = C
4) a way to estimate how many watts the AC has to expend to cool the watts from (3). = D
Regarding A:
A = monitor wattage + CPU wattage + peripheral wattage. (roughly)
It might be difficult to translate the peripheral DC wattage to the wattage drawn by the power supply, but given a power supply efficiency 0 < E < 1, it could be approximated by
peripheral wattage (AC)_0 = peripheral wattage (DC) / E,
which would then need to be translated to an RMS wattage, AFAIK. I'll call the final result peripheral wattage (AC).
For the CPU wattage, AFAIK, take its rated wattage and scale it up by the % overclock, which I'll term 0 < OC < 1:
CPU wattage = rated CPU wattage * (1+OC).
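Putting the pieces of (A) together in a quick sketch (every wattage and efficiency figure below is a hypothetical example, not a measurement):

```python
# Rough estimate of total system draw A, in watts.
def system_watts(monitor_w, cpu_rated_w, peripheral_dc_w, psu_efficiency, oc):
    """monitor_w: monitor draw; cpu_rated_w: CPU's rated wattage;
    peripheral_dc_w: DC-side peripheral draw; 0 < psu_efficiency < 1;
    oc: fractional overclock (0.10 = a 10% overclock)."""
    peripheral_ac_w = peripheral_dc_w / psu_efficiency  # DC draw converted back to AC-side draw
    cpu_w = cpu_rated_w * (1 + oc)                      # rated wattage scaled by the overclock
    return monitor_w + cpu_w + peripheral_ac_w

# Example with made-up numbers: 80 W monitor, 60 W rated CPU at a 10%
# overclock, 45 W of DC peripherals through a 75%-efficient supply.
print(system_watts(monitor_w=80, cpu_rated_w=60, peripheral_dc_w=45,
                   psu_efficiency=0.75, oc=0.10))  # -> 206.0
```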
Regarding C:
Generally, you'll have a rating for the CPU waste heat,
rated CPU heat,
which you'll need to multiply by your OC factor from above. If you've raised your voltage by a percentage VOC, your waste heat will be
rated CPU heat * (1+VOC) * (1+OC).
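That scaling is a one-liner; the 50 W rating and the 10%/5% overclock figures here are hypothetical:

```python
# Waste-heat estimate for (C): rated heat scaled by voltage and clock increases.
def cpu_waste_heat(rated_heat_w, oc, voc):
    """rated_heat_w: manufacturer's waste-heat figure in watts;
    oc: fractional clock overclock; voc: fractional voltage increase."""
    return rated_heat_w * (1 + voc) * (1 + oc)

print(cpu_waste_heat(rated_heat_w=50, oc=0.10, voc=0.05))  # -> 57.75
```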
I'm not sure about monitor heat, but it should be given by the manufacturer. If not, here's a crude methodology for a monitor:
i) Take your monitor's rated power consumption and find the rated power consumption of a similarly-sized LCD monitor. We'll take the LCD value as the baseline to create an image of that size.
ii) Assume that some percentage of the difference in power consumption is the wasted energy, which will be dissipated as heat. This will provide an upper bound on the heat generated by the monitor, but it shouldn't be too far off, since they do get pretty warm, and you need to remember that it's spread over quite a surface area.
iii) If you have an LCD monitor, for our crude calculations, it will probably have negligible effect on the total waste heat.
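Steps (i)-(ii) amount to a subtraction; the CRT and LCD wattages below are made up, and waste_fraction is the assumed share of the power difference that ends up as heat:

```python
# Crude upper bound on CRT monitor waste heat, per steps (i)-(ii) above:
# take the difference between the CRT's rated draw and a similarly-sized
# LCD's draw, and treat some fraction of it as dissipated heat.
def monitor_heat_upper_bound(crt_w, lcd_baseline_w, waste_fraction=1.0):
    return waste_fraction * (crt_w - lcd_baseline_w)

# Hypothetical 110 W CRT vs. a 40 W LCD baseline.
print(monitor_heat_upper_bound(crt_w=110, lcd_baseline_w=40))  # -> 70.0
```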
For our purposes, don't bother with waste heat from the peripherals. We're talking leading-order effects here.
Regarding D:
No air conditioner is 100% efficient. So, to maintain a steady temperature, the AC will have to expend at least as many watts as you calculated in (3).
Now, on to complications: the wattage required for cooling will increase as the outside temp rises relative to the inside room temp, and decrease as it falls. So, in some way, efficiency varies inversely with (T_outside - T_inside). You could even have an effective efficiency above 100% if the outside temp is cool enough, which would be a simplified way of modeling the heat transfer directly from the inside to the outside through the walls, windows, etc.
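One very rough way to put (D) and the headline figure into numbers. The linear form of the effectiveness, the constant k, and all the wattage/points figures are guesses for illustration, nothing more:

```python
# Toy model for (D): AC watts needed to remove a given waste heat, with an
# effectiveness that falls as the outside gets hotter than the inside.
def ac_watts(waste_heat_w, t_outside, t_inside, k=0.05):
    # k sets how strongly the temperature gap matters; 0.05 per degree is a guess.
    effectiveness = 1.0 - k * (t_outside - t_inside)  # exceeds 1 when outside is cooler
    effectiveness = max(effectiveness, 0.1)           # keep the toy model bounded
    return waste_heat_w / effectiveness

# The figure we're after: points earned per kWh consumed (system + cooling).
def points_per_kwh(points_per_hour, system_w, ac_w):
    return points_per_hour / ((system_w + ac_w) / 1000.0)

print(ac_watts(60, t_outside=30, t_inside=22))  # hotter outside -> 100.0 W
print(ac_watts(60, t_outside=18, t_inside=22))  # cooler outside -> 50.0 W
print(points_per_kwh(points_per_hour=12, system_w=206, ac_w=100))
```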
Just a few starting thoughts to get the modeling going.
-- Paul