My question is about the heat output of my monster rig... Currently, my office rises to a toasty 98F from a starting ambient of 68F after about two hours of gaming. Yes, seriously, it's an oven in here.
So, a friend and I came up with an explanation. I know my deltaT is way out of whack, around 21C (see below). I know this because I have dedicated ambient sensors telling me room temp after 2 hours of "heating up" (29C at floor level, 35C at standing height) and water temp sensors on the output of the radiator (~50C to 52C at times). That works out to roughly a 21C water-to-ambient deltaT, way too high, I know. Office temps start at 20 to 21C when I start gaming and climb to 35C in an hour or so - hot.
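To keep the units straight, here's the arithmetic behind those numbers as a quick Python sketch (the 31C "ambient" is just a rough midpoint of my 29-35C floor/standing readings, not a measured value):

```python
def f_to_c(f: float) -> float:
    """Convert Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

water_c = 52.0    # radiator outlet water temp reading
ambient_c = 31.0  # rough midpoint of the 29-35 C room readings

print(water_c - ambient_c)  # 21.0 - the water-to-ambient delta-T
print(f_to_c(125))          # ~51.7 C, matching the ~52 C water reading
```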
Our theory is that since the water temp is 125F (52C) and I'm running fans over the radiator, the loop is basically acting as a serious space heater in my office - exhausting air heated by that water.
What if I lowered the deltaT down to, say, 10C? Now the heater output of the system is far lower, with room ambient around 21C. That's a water temp around 31C, or 87F, as opposed to the 125F from before. That could be manageable.
To achieve a lower deltaT, say I double my radiators. Yes, that gives a lower deltaT, and a lower "heat" output from the radiators.
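To sanity-check the doubling idea, here's a back-of-the-envelope steady-state sketch. All the numbers are made up (the 800 W loop power is an assumption, and the radiator's effective conductance UA is just back-solved from my observed 21C deltaT):

```python
# At steady state the radiators must shed exactly the heat the loop
# absorbs, via  Q = UA * deltaT,  so  deltaT = Q / UA.
# Doubling radiator area roughly doubles UA, which halves deltaT -
# but Q itself is pinned to the hardware's power draw.

def water_delta_t(power_w: float, ua_w_per_c: float) -> float:
    """Water-to-ambient delta-T for a given heat load and radiator UA."""
    return power_w / ua_w_per_c

POWER = 800.0         # assumed total loop heat load, watts
UA_NOW = POWER / 21   # UA implied by the observed 21 C delta-T

print(water_delta_t(POWER, UA_NOW))      # 21.0  - current rads
print(water_delta_t(POWER, 2 * UA_NOW))  # 10.5  - doubled rads
```

Under this model the water runs much cooler with double the rads, but the watts dumped into the room are the same POWER in both cases.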
So the $64,000 question: we can't figure out whether the extra radiators, now removing additional heat at a lower water temp, expel the same total amount of heat as the 125F setup.
Basically, does lowering the deltaT lower the overall room temps?
(more information, if need be)
So, I designed my system with a whopping 17C deltaT on paper. I was dead on, but I didn't account for the 33% overclock of the GPUs to 955 MHz. That basically works out to a 21C deltaT now after a few hours of folding or gaming.
It sucks, I know, and I am in the process of adding more rads and breaking out the CPU into a dedicated Dell H2C w/Peltiers (different story for another time).
I'm hoping that by lowering the deltaT, the overall heat output will be far lower.