
Does ATI Tool read GPU temp properly?


GTFouts
Member
Joined: Dec 12, 2004
Location: The Sticks of the Boonies, GA.
I have been running my WC setup for a few days now, and according to ATI Tool my GPU temp never strays from 50c no matter what happens. It can be sitting idle, playing a game, or whatever, and it stays right at 50c. This doesn't seem right to me, and I was wondering if anybody else has had this happen to them. I am using a MAZE4 from Danger Den. The TDX on the CPU works great, idling at 35c and maxing out at 42c, but the X800XT just stays at a flat 50c. I tend to believe that is a bit high for WC, and I am starting to think the program is not reading the temp properly.

Are there any other programs out there that will display the GPU temp so I can see if it really is 50c? I've been looking but can't find any with a temp output for ATI cards.
 
RivaTuner also has temperature monitoring; you might want to try that program and see if it gets the same reading (http://www.guru3d.com/rivatuner/). However, I have also heard that the thermistor (or maybe a diode? I can never remember which) that generates the voltage used to determine the temperature can be very inaccurate.
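For what it's worth, those on-card sensors usually work by reading a temperature-dependent resistance through a voltage divider and converting it back to degrees with constants from the part's datasheet. Here's a rough Python sketch of the common Beta-parameter conversion; every value in it (VCC, R_FIXED, R0, BETA) is a made-up example, not something pulled from an actual card:

```python
# Rough sketch of how a thermistor-style sensor turns a voltage into a
# temperature (Beta-parameter model). All part values are hypothetical.
import math

VCC = 3.3         # divider supply voltage (assumed)
R_FIXED = 10_000  # fixed divider resistor, ohms (assumed)
R0 = 10_000       # thermistor resistance at 25c, ohms (assumed)
T0 = 298.15       # 25c in kelvin
BETA = 3950       # Beta constant from the thermistor datasheet (assumed)

def temp_from_voltage(v_out: float) -> float:
    """Convert the measured divider voltage to degrees C."""
    r_therm = R_FIXED * v_out / (VCC - v_out)          # solve the divider for R
    inv_t = 1.0 / T0 + math.log(r_therm / R0) / BETA   # Beta equation
    return 1.0 / inv_t - 273.15

# A few percent of tolerance in R0 or BETA shifts the answer by several
# degrees, which is one reason two "identical" cards can report different temps.
print(round(temp_from_voltage(1.65), 1))  # ~25.0 with these example values
```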
 
It is a very good policy to never trust *ANY* on-die temperature sensors, or motherboard temperature/voltage sensors. The software is simply too unreliable, and the readouts are too easily skewed by it. Try running MBM5 and any other hardware monitor together and compare the readouts - you'll see what I mean.
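For example, if you log a few minutes of readings from each program and dump them to text, even a tiny script will show how far apart they drift. This is just a sketch; the CSV filenames and log format below are hypothetical, so adapt them to whatever your monitoring tools can actually export:

```python
# Rough sketch: compare the same stretch of temperature readings logged by
# two different monitoring programs. Filenames and values are hypothetical.
import csv

def load_temps(path):
    """Read one temperature value per row from a simple CSV log."""
    with open(path) as f:
        return [float(row[0]) for row in csv.reader(f) if row]

a = load_temps("mbm5_log.csv")        # e.g. a log exported from MBM5
b = load_temps("other_tool_log.csv")  # e.g. a log from another monitor

diffs = [x - y for x, y in zip(a, b)]
print(f"average disagreement: {sum(diffs) / len(diffs):+.1f} c")
print(f"worst disagreement:   {max(diffs, key=abs):+.1f} c")
```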

My video card doesn't have an on-die temperature sensor, but if it did, I wouldn't trust it either.

Personally, I only use MBM5, or any other "sensor" program, for a very rough idea of the temperatures and voltages my system is probably seeing. I cannot stress enough that you need to take these readouts with a huge grain of salt.


If you want a semi-accurate idea of your GPU temperature, you really do need to drop $20-$100 on an external temperature sensor. Although these often have a good ~3 degree Celsius margin of error, they are far more reliable and accurate than hardware monitoring, at least in my personal experience.

It's also a lot easier to check an external temperature sensor for accuracy than it is to check your on-die or hardware sensors for accuracy. In fact, it's almost impossible to test on-die and hardware sensors for accuracy.

The Enermax temperature sensor that I use is good down to -20 degrees Celsius and accurate to within ~2 degrees Celsius; I have tested it thoroughly to confirm that it is giving me accurate readouts. Because I have tested it, I can trust it and rely on it to protect my hardware.
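If anyone wants to repeat that kind of check on their own probe, the usual trick is to compare it against known reference points, like an ice-water slush (roughly 0c) and boiling water (roughly 100c at sea level). A quick sketch, with made-up readings standing in for whatever your probe actually shows:

```python
# Sanity-check a temperature probe against known reference points.
# The probe readings below are made-up examples, not real measurements.
references = {"ice bath": 0.0, "boiling water": 100.0}   # true temps, degrees C
readings   = {"ice bath": 1.8, "boiling water": 101.5}   # hypothetical probe readouts

for point, true_temp in references.items():
    error = readings[point] - true_temp
    print(f"{point}: read {readings[point]:.1f}c, off by {error:+.1f}c")

# If the probe is off by roughly the same amount at both points, it's a simple
# fixed offset you can subtract; there's no equivalent test for an on-die sensor.
```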

Because I cannot test them, I cannot rely on or trust my hardware sensors.
 