
Cooling, Temperature and the room (split from another thread)


bob4933
It's true that the 290 is good bang/buck, I just stuck to the 970 since the OP got hung up on power and heat :)

That's fair, I suppose.

And to be very fair, in 1440p gaming I've yet to see over 55°C on my card, though...

EDIT: To amend my initial comment, I was speaking to the claim that the 290 is a space heater, with wild exaggerations about it drastically heating up your room.

At 100% loading (Fire Strike), my card will hit 88°C.

During moderate gaming of Counter-Strike, Diablo 3, WoW, LoL, Dota 2, APB Reloaded, L4D2, indie games, and FFXIV, my gaming loads have never produced a temperature in excess of 55°C. This indicates the 290 is not being fully loaded to 100%. The CS benchmark ran at 34% load on the 290, Diablo 3 runs at 40%, Unigine Heaven runs at 64% with V-Sync enabled, and Fire Strike loads it to 100%. (I run Diablo 3 and CS at 1440p as well.)

I am not implying that a 290 has less thermal output than a 970. The 970 (as I've agreed many, many times) is much more efficient and has a much lower power draw and thermal output than a 290 at the same usage levels. The temperature is a corollary of how much power I am actually drawing in gaming situations. These are MY observations on my games, and given the OP's desire to play World of Warcraft, they seem entirely applicable to this situation. If you are playing Crysis 3, running multiple monitors, or running CryEngine games or FC4, your usage will undoubtedly approach 100% loading on both cards, in which case, again, I understand the 970 will put out less heat and consume less power.
 
That's fair, I suppose.

And to be very fair, in 1440p gaming I've yet to see over 55°C on my card, though...

You also aren't on stock cooling...

And heat output isn't governed by temperature.
 
Absolutely.

My favorite analogy... which has the higher temperature? A yellow flame on a lighter, or yellow flames in a bonfire?

... They are both the same TEMPERATURE; however, the heat from the bonfire is clearly a lot more. ;)
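
To put very rough numbers on that analogy, here is a Python sketch. The burn rates and heating values are assumed, order-of-magnitude figures purely for illustration, not measurements:

Code:
# Same flame temperature, wildly different heat output.
# Burn rates and heating values below are rough assumptions.

def combustion_power_w(burn_rate_kg_per_s, heating_value_j_per_kg):
    """Heat released per second = mass burned per second * energy per kg."""
    return burn_rate_kg_per_s * heating_value_j_per_kg

# Butane lighter: tiny burn rate, high heating value (~45 MJ/kg assumed)
lighter_w = combustion_power_w(1e-6, 45e6)       # roughly 45 W

# Bonfire: ~2 kg of wood per minute at ~16 MJ/kg assumed
bonfire_w = combustion_power_w(2.0 / 60, 16e6)   # roughly 530,000 W

print(f"Lighter: ~{lighter_w:.0f} W, Bonfire: ~{bonfire_w:.0f} W")
# Both flames sit around the same temperature, but the bonfire dumps
# on the order of ten thousand times more heat into the room per second.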

55°C, nice! It clearly is because of the aftermarket cooler though, as they typically run in the 70s and higher.
 
You also aren't on stock cooling...

And heat output isn't governed by temperature.

Correct. Now let's discuss enthalpy; the room will heat up less with a source transferring heat at 55°C than it will at 70°C. Losses to ambient and forced cooling from, say, an air conditioner are less difficult to manage.

I'm going to assume OP has air conditioning...
 
Correct. Now let's discuss enthalpy; the room will heat up less with a source transferring heat at 55°C than it will at 70°C. Losses to ambient and forced cooling from, say, an air conditioner are less difficult to manage.

I'm going to assume OP has air conditioning...

No, it actually doesn't.

If you have a 55°C source dumping 250W and a 70°C source dumping 250W, then you're heating up the room an equal amount because your rate of energy transfer is still 250W.
So either way you end up with the same amount of energy in the room given the same amount of time.

I'd take the GPU that draws 100W less every day; it heats up the room a LOT less.
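
A quick Python sketch of that point: only power and time set how much energy is added to the room, and the source temperature never enters into it.

Code:
# Energy dumped into the room = power x time, independent of the
# temperature at which the heat leaves the card.
power_w = 250.0      # heat output in watts (J/s)
seconds = 3600       # one hour

energy_kj = power_w * seconds / 1000
print(f"{energy_kj:.0f} kJ added per hour, whether the source sits at 55 C or 70 C")
# -> 900 kJ either way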
 
No, it actually doesn't.

If you have a 55°C source dumping 250W and a 70°C source dumping 250W, then you're heating up the room an equal amount because your rate of energy transfer is still 250W.
So either way you end up with the same amount of energy in the room given the same amount of time.

I'd take the GPU that draws 100W less every day; it heats up the room a LOT less.


No... the rate of heat transfer is also based on the temperature differential. You've taken thermodynamics as I have... The amount of energy transferred is, yes, the same, but the room will not heat up as fast at a lower temperature differential.

edit: Perhaps we could take this to PM then...
 
Before this goes to PM... what would heat up a room faster: 250W at 55°C or 150W at 70°C? The room cooling will dissipate the 150W of warmer air faster.

Remember, I work in a data center. Part of my job is making sure things stay cool in the room. When we design for such things, it is done by wattage, not by the expected temps of the hardware.

Perhaps I will split these posts to another thread when I am not mobile...
 
Before this goes to PM... what would heat up a room faster: 250W at 55°C or 150W at 70°C? The room cooling will dissipate the 150W of warmer air faster.

Remember, I work in a data center. Part of my job is making sure things stay cool in the room. When we design for such things, it is done by wattage, not by the expected temps of the hardware.

Perhaps I will split these posts to another thread when I am not mobile...

Allow me to break this down before we go completely off topic.

Your air conditioner has a rated cooling capacity. Assuming a room held at a constant 20°C, the rate of heat transfer from a 70°C part WILL BE FASTER than from a 55°C part. That's not negotiable. This means the 70°C part will cool off faster. Since it has less thermal power applied, the room requires less cooling capacity than it does with the higher-wattage 55°C card. And since we have this magical invention called "air conditioning," we can hold the room at a relatively constant temperature. A higher thermal power simply means the ultimate heatsink (in this case, the air conditioner) has to work a bit harder.

At the point where the heatsink cannot accept any more, that's when it is "heat soaked." The temperature will rise until it reaches an equilibrium. If need be, I will actually do the math, find the exact amounts, and apply them to an average electric bill.

Now, in the case of a room NOT having air conditioning, this is of course completely irrelevant. The 970 is definitely less of a thermal load, but physics dictates that the rate of heat transfer is directly proportional to the actual temperature difference. This is why your temps can drop substantially with an open window in winter: your thermal capacity rises dramatically, and the rate of heat transfer is much faster due to the larger temperature differential.

Physics; not the ramblings of an errant bob's mind xD

edit: Also, if we assume they both have the exact same cooler design, the lower-wattage card will never see the higher temperature of the higher-wattage card. If it reaches 70°C at 150W, it will be much hotter with a 250W design power.
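
For what it's worth, the "rate proportional to temperature difference" idea is Newton's law of cooling. A minimal Python sketch, using a made-up combined h·A value purely for illustration:

Code:
# Newton's law of cooling: heat flow from a part to the room air is
# proportional to the temperature difference.
#   Q = h * A * (T_part - T_room)
HA = 5.0          # W per degree C, assumed effective (coefficient x area) value
T_ROOM = 20.0     # degrees C, room held roughly constant by the AC

def heat_flow_w(t_part_c):
    return HA * (t_part_c - T_ROOM)

print(heat_flow_w(70.0))   # 250.0 W shed by a 70 C part
print(heat_flow_w(55.0))   # 175.0 W shed by a 55 C part
# In steady state, though, each card settles at whatever temperature makes this
# outflow equal its own power draw, so the room still receives the full wattage.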
 
Yes, temp-wise it will be lower... Heat load is a different story. The bonfire will heat up a 'room' faster than that lighter even though it's the same temp. Wouldn't the inverse be the same?

Anyway, I will split these posts to a new thread... no mas for now plzzzzz!
 
No... the rate of heat transfer is also based on the temperature differential. You've taken thermodynamics as I have... The amount of energy transferred is, yes, the same, but the room will not heat up as fast at a lower temperature differential.

edit: Perhaps we could take this to PM then...

The same amount of heat has to be transferred either way. If you're putting a set amount of heat into a room, it doesn't matter what temperature it comes in at.

Say you have a 250W source. To get the temperature change due to the source, you need the volume of air in the room, the density of the air, and the specific heat.

That goes to:
Code:
ΔT = W/(V•ρ•C)

Where:
ΔT = Change in temperature in °F per second
W = 250W (also J/s)
V = 16ft x 16ft x 8ft = 2048 ft³
ρ = 0.0765 lb/ft³
C = 253.2 J/(lb•°F)

Working the equation, you get that the room changes 0.0063°F/s.
That means without any intervention of air conditioning you change the room 22.69°F/hr due to the GPU ALONE!

Now, to figure this out, you never need the room temperature or the heat source temperature... because it's irrelevant!
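
For anyone who wants to plug in their own room, here is the same calculation as a small Python script (the 16x16x8 ft room, sea-level air density, and 250W source are the example's assumptions):

Code:
# Room temperature rise per second from a constant heat source:
#   dT/dt = W / (V * rho * C)
watts = 250.0                 # heat source, J/s
volume_ft3 = 16 * 16 * 8      # 2048 cubic feet of air
rho_lb_ft3 = 0.0765           # air density, lb/ft^3
c_air = 253.2                 # specific heat of air, J/(lb*degF)

air_mass_lb = volume_ft3 * rho_lb_ft3        # ~156.7 lb of air
dT_per_s = watts / (air_mass_lb * c_air)     # ~0.0063 degF/s
dT_per_hr = dT_per_s * 3600                  # ~22.7 degF/hr

print(f"{dT_per_s:.4f} degF/s, or {dT_per_hr:.2f} degF/hr with no air conditioning")
# Note that neither the room temperature nor the source temperature appears anywhere.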

Before this goes to PM... what would heat up a room faster: 250W at 55°C or 150W at 70°C? The room cooling will dissipate the 150W of warmer air faster.

Remember, I work in a data center. Part of my job is making sure things stay cool in the room. When we design for such things, it is done by wattage, not by the expected temps of the hardware.

Perhaps I will split these posts to another thread when I am not mobile...

This +1. All of it.
You NEVER pick a cooling system by temperature. It's always by a measure of heat load: watts or BTU/hr.
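
As a practical aside, a minimal Python sketch of sizing cooling from wattage; the only conversion used is 1 W = 3.412 BTU/hr, and the 250W GPU plus 150W "rest of system" figures are just example numbers:

Code:
# Cooling gets sized from the heat load in watts, then often quoted in BTU/hr.
W_TO_BTU_PER_HR = 3.412

def required_cooling_btu_hr(total_watts):
    return total_watts * W_TO_BTU_PER_HR

# Example: a 250W GPU plus roughly 150W for the rest of the system.
print(required_cooling_btu_hr(250 + 150))   # ~1365 BTU/hr of cooling needed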

Yes, temp-wise it will be lower... Heat load is a different story. The bonfire will heat up a 'room' faster than that lighter even though it's the same temp. Wouldn't the inverse be the same?

Anyway, I will split these posts to a new thread... no mas for now plzzzzz!

Thissssss. Same temperature, vastly different amount of ΔT.
 
My thought here: the temperature of the item being cooled is also different from the air coming off the cooling device (heatsink/radiator). Wouldn't a lower temp on the item being cooled mean more heat is being removed, assuming everything else is the same?
 
My thought here: the temperature of the item being cooled is also different from the air coming off the cooling device (heatsink/radiator). Wouldn't a lower temp on the item being cooled mean more heat is being removed, assuming everything else is the same?

It doesn't mean more heat is being removed, because the amount of heat is dependent upon the GPU load.
It does mean that the GPU would be cooled more efficiently (requiring less fan speed).
 
Correct. Now let's discuss enthalpy; the room will heat up less with a source transferring heat at 55°C than it will at 70°C. Losses to ambient and forced cooling from, say, an air conditioner are less difficult to manage.

I'm going to assume OP has air conditioning...

Also, on the topic of enthalpy: enthalpy has nothing to do with this discussion, actually. It's the change in energy of a fluid (in this case air) at constant pressure.
The room isn't sealed; therefore, pressure will stay equal regardless of the amount of energy stored in the air.
 
You're approaching this discussion from a closed-system perspective. A closed room with no air conditioning, if you will.

Using FurMark benchmarks:

A GTX 970 produces 284W.
An R9 290 produces 355W.

That's a 71W differential @ 100% loading.

Applying your equation, that changes the differential to 0.00179°F/s. In this case, a 6.44°F-per-hour differential.

Assuming an air conditioner set to 72°F and average central air conditioner specs (3500 watts), you will need an extra 324 "watts" of cooling (theoretically speaking). The average household runs its air conditioning for 4 months a year.

Adding further,

A 71W power draw differential, running 8 hours a day, every day, at an average of 12 cents per kWh, will cost you a whopping extra $24.88 a year.

Adding in the cost of electricity for air conditioning for 6 months, that's $37.84 a year in extra cooling costs, totaling a higher net operating cost of $62.72 a year.


Right now, an R9 290 on sale is around $240. A GTX 970 on sale is around $320. That is a solid $80 difference. It will take 1 year and 4 months to make up the difference in electricity costs.

Over a 3-year life span of the card, the 290 will cost you an extra $5.22 a month to operate.




NOW, that is with the card running at 100% ALL THE TIME. At idle, the difference drops to a 9W differential. Keeping the math the same, that drops the operating cost differential to $7.52 per year! At idle, the 970 would have to be in operation for TEN YEARS to pay back the price difference.

Assuming average household gaming use of 50%, that would change the numbers to $35.13 a year. This makes the cards almost neck and neck for value over a 3-year life span.


There's the math. If you game less, or play games that don't pull 100% GPU usage, then the math tips in the 290's favor. If you live in a hot climate and run the air con more than 4 months a year, the math tips in the 970's favor. OBVIOUSLY, we're using averages and constants that don't necessarily apply here, but the general idea is there.
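
The arithmetic above as a small Python script you can rerun with your own numbers. The 71W FurMark differential, 8 hours/day, 12 cents/kWh, and the quoted sale prices are the assumptions from this post; the $37.84/yr air-conditioning surcharge is taken as given rather than re-derived:

Code:
# Extra yearly cost of the higher-draw card, using the figures above.
diff_watts = 71           # FurMark power difference, R9 290 vs GTX 970
hours_per_day = 8
price_per_kwh = 0.12      # USD

kwh_per_year = diff_watts * hours_per_day * 365 / 1000      # ~207.3 kWh
electricity_cost = kwh_per_year * price_per_kwh             # ~$24.88/yr

ac_surcharge = 37.84      # extra cooling cost per year, figure quoted above
total_extra = electricity_cost + ac_surcharge               # ~$62.72/yr

price_gap = 320 - 240     # sale prices quoted for the 970 vs. the 290
payback_years = price_gap / total_extra                     # ~1.3 years

print(f"${electricity_cost:.2f}/yr electricity, ${total_extra:.2f}/yr total, "
      f"payback in about {payback_years:.1f} years")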

edit: I apologize for my "backwards" thinking earlier; I was simply musing on temperature differentials, not thermal power. It's late and I've had a few cocktails. If I'm incorrect anywhere, please correct me!
 
FurMark is a terrible test to use. It's a power virus, and you cannot get accurate power use from it due to the throttling built into both cards. I'd stick to TDP (Thermal Design Power) values.
 
Bob, nowhere was I talking about cost of ownership. At all. Only the amount of heat dumped into the room.

Also, as EarthDog said, FurMark is not an accurate test.

So, I have no idea where you were going with that post as it has exactly nothing to do with the rest of the discussion...

Another point: I never did a differential between the two cards. I simply did the change in room temp per unit time given a 250W source of heat.
 