Does Furmark really kill video cards?
05-26-11, 10:49 AM
Bobnova, Senior Member
Join Date: May 2009
Power draw kills 570s and 590s.
Furmark kills cards; it's a proven fact.
Unlike a CPU, a GPU is not designed to run at 100% utilization. The thing to keep in mind is that a game showing 100% utilization is not the same as a math problem that actually fits in the couple of kilobytes of L2 cache in each shader; in games, most of that 100% utilization is spent waiting for data to show up.
That's what makes Furmark (and OCCT's GPU test) so nasty: the cores aren't waiting for data, they're all doing work every cycle.
It's much like how Linpack/IBT produces higher temperatures than Prime95 does: Linpack is designed to minimize wait states.
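A toy way to see the distinction is arithmetic intensity: math operations per byte fetched from memory. The helper and the numbers below are made up purely for illustration (nothing here comes from Furmark or any profiler), but they capture why a "busy" game shader and a power-virus shader stress the hardware so differently:

```python
def arithmetic_intensity(flops, bytes_moved):
    """FLOPs per byte moved from memory.

    Low values mean the shader cores mostly wait on data (memory-bound);
    high values mean back-to-back math with the data already in cache
    (compute-bound), which is where power draw peaks.
    """
    return flops / bytes_moved

# Hypothetical game-like workload: many texture fetches per math op.
game_like = arithmetic_intensity(flops=100, bytes_moved=400)   # 0.25 FLOPs/byte

# Hypothetical Furmark-like workload: data fits in cache, math dominates.
furmark_like = arithmetic_intensity(flops=100, bytes_moved=4)  # 25.0 FLOPs/byte
```

Both workloads would report "100% GPU usage" in a monitoring tool, but the second one keeps every execution unit switching every cycle, and that is what drives current draw through the roof.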
In any case, it is very much not FUD, nor is it entirely heat related, except in that a saturated MOSFET or trace generates more heat than that same MOSFET loaded 0.5% less. We're talking a difference along the lines of the MOSFET dissipating 5 W versus 25 W. Very large.
That's what happens to GTX 570 and GTX 590 MOSFETs: the core(s) demand more current than the MOSFETs can cough up, and the extra demand turns into massive, massive heat in the MOSFET, much more than a standard heatsink can remove. That, in turn, equals BOOM!
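A rough back-of-the-envelope shows why a small overload is so brutal: MOSFET conduction loss scales with the square of the current (P = I² × Rds(on)), and Rds(on) itself climbs as the die heats up, so demand past the rating compounds. The figures below are hypothetical, ballpark values for a single VRM phase, not measurements from any actual GTX 570/590 board:

```python
def conduction_loss(i_amps, rds_on_ohms):
    """MOSFET conduction loss in watts: P = I^2 * Rds(on)."""
    return i_amps ** 2 * rds_on_ohms

# Hypothetical phase within its rating: ~30 A through a 5 mOhm MOSFET.
normal = conduction_loss(30, 0.005)      # ~4.5 W: a heatsink copes fine

# Push the current demand up and let the die heat soar; Rds(on) roughly
# doubles with a large temperature rise, so losses explode.
overloaded = conduction_loss(60, 0.010)  # ~36 W: thermal runaway territory
```

Because of the I² term, doubling the current alone quadruples the loss; add the resistance increase from self-heating and you get the 5 W vs 25 W kind of gap described above.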
"Two things are infinite: the universe and human stupidity; and I'm not sure about the the universe." -- Einstein (maybe)
How to check your PSU with a multimeter.
Find More Posts by Bobnova