Exactly!
I read that somewhere too, and I wish you had bookmarked it because I forgot to! And now I don't remember whether the source of that information was reputable or biased.
Frankly, I find it very frustrating that HWiNFO64 (my favorite, by far), HWMonitor, Speccy, CPU-Z and _________________ (fill in the blank) do not all report the exact same thing. It makes no sense to me that they don't. The temperature of something is not a subjective value. If I put 10 thermometers of different technologies (digital, analog, mercury, resistance, thermocouple, infrared, bimetal, liquid crystal, you name it) in my oven, and it is 350° in there, they should all read ~350°. And they likely will! If I put 10 thermometers in my hallway and set my furnace thermostat (located in the same hallway) to 70°F, all 10 thermometers should read ~70°F. And they likely will.
All these hardware monitoring programs are getting their information from the exact same sensor. So why would the temperatures be different? It's the same problem with voltages. Why would two programs report different voltages? 60°C is 60°C. Period. +12.1VDC is +12.1VDC. Period. It does not (or should not) matter how that 60°C or +12.1VDC is measured.
Those sensors produce a specific numeric value that represents a specific voltage, temperature, or fan speed. So why don't those programs use the same formula to display the true value? It makes no sense to me. I understand sample rates will be different, and sample times will be different, so a couple of degrees, a few RPM, or a tenth of a volt of variance should be expected. But way off? Doesn't make sense to me.
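To illustrate the point above, here is a minimal sketch (all register values, scale factors, and offsets are made up for illustration) of how two tools can read the exact same raw sensor register yet display different temperatures, simply because one of them applies an offset the other doesn't:

```python
# Hypothetical sketch: two monitoring tools decode the same raw sensor
# register, but one applies a vendor "offset" and the other does not.
RAW_REGISTER = 0x25C  # same reading from one sensor: 604 counts

def tool_a_celsius(raw):
    # Tool A: straight conversion at an assumed 0.125 °C per count
    return raw * 0.125

def tool_b_celsius(raw):
    # Tool B: identical conversion, minus an assumed 10 °C offset
    return raw * 0.125 - 10.0

print(tool_a_celsius(RAW_REGISTER))  # 75.5
print(tool_b_celsius(RAW_REGISTER))  # 65.5
```

Same sensor, same instant, same raw number, a 10 °C disagreement on screen. That is the kind of gap a differing formula (rather than sample timing) produces.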
Another problem is a total lack of industry standards here. Even within the same brand! Different labels are used. Different sensor locations are used. So are we talking about the same thing, or not? If this CPU temp sensor is located deep inside the core at a junction, and that CPU sensor is located on the case (or is it the IHS?), which is real? Which is better? How does one compare?
If pros find it confusing, it is no wonder the less experienced do.
And what are offsets and why are they used? A temp is a temp. It seems to me offsets are there to make the temp "look" cooler than it really is. Why? That makes no sense - unless the purpose is to deceive consumers.
Real Temps and Offsets explained. Oh, I totally understand now!
[rant off]
The whole point I was trying to make is I will not hesitate to use AMD, ASUS, Seasonic, etc. if they offer the best choice at the time I am spending my money. And that, IMO, is what fosters competition.
Right. That, and Johan45's explanation, make it clear, or at least clearer. But I was really speaking hypothetically. My understanding is that, in at least some cases, the offsets serve to even out BIOS fan-speed control across different processors.