
GTX580 throttling question


I just read that the GTX 580 throttles down the GPU if it gets too hot.

If this is true, how much of a performance drop is there when it happens?

I watched a video on YouTube where they ran Furmark, and the temps never went past 69°C, which I find hard to believe. The 10-15% performance increase may go away if it throttles the GPU. At what temperature does it do this?

The GTX 480 hits 92°C, which makes more sense... and is crazy hot, but you're getting full GPU performance out of the card...
 

There is a special chip on the 580 that will throttle it down if it detects Furmark or OCCT is being run. From early testing and what I have read... it in no way affects normal benchmarks or overclocking.

Personally, I'm still not sure how I feel about this. Apparently Nvidia's standpoint is that Furmark and the like place unrealistic loads on the GPU. There have been instances of Furmark and the like killing cards... that's why they did it.
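Conceptually, the limiter amounts to something like this (just my rough sketch of the idea in Python, NOT Nvidia's actual firmware/driver logic; the threshold and clock numbers are made up):

```python
# Rough sketch of a power-based clock limiter -- illustrative only,
# not Nvidia's actual logic. All numbers are hypothetical.

POWER_LIMIT_W = 300       # hypothetical board power ceiling
THROTTLE_FRACTION = 0.5   # hypothetical reduced clock fraction when tripped

def clock_fraction(app_is_blacklisted: bool, power_draw_w: float) -> float:
    """Pick the clock multiplier for this sampling interval."""
    # The limiter reportedly only arms itself for known stress apps
    # (Furmark, OCCT), so games run at full clocks regardless of draw.
    if app_is_blacklisted and power_draw_w > POWER_LIMIT_W:
        return THROTTLE_FRACTION  # clamp clocks until draw falls back
    return 1.0                    # full speed

# Example: Furmark pushing past the limit gets clamped...
print(clock_fraction(app_is_blacklisted=True, power_draw_w=350))   # 0.5
# ...while a heavy game at the same draw does not.
print(clock_fraction(app_is_blacklisted=False, power_draw_w=350))  # 1.0
```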
 

Hmmm... I'm not liking it at all, since now Nvidia can control how the card uses its overall hardware capability. It's hardware and not software control? Hmm... But you say it "should" not affect games... no way to know, right? Is there proof that the card hits higher than 69°C under load?
 

I have mixed feelings about it too, for the same reasons as you... what's next?

"Nvidia has determined that the Frostbite Engine places unrealistic loads on a GPU, therefore we have hard-wired our products to downclock when playing BFBC2."

Now, that ^^^ is complete rubbish I just pulled out of my butt... however, that is my fear if this becomes the norm for GPU makers.

It is hardware. It can be disabled, but only with a hard mod. I assure you, the card will get hotter than 69°C, though I haven't personally tested it.
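If anyone wants proof, just log it yourself while a game or Furmark runs. Here's a rough Python sketch; it assumes a recent nvidia-smi that supports --query-gpu, so on drivers of this era you'd use GPU-Z's logging instead:

```python
# Log GPU temperature and core clock once a second to gpu_log.csv.
# A sudden clock drop while temps hold steady points to throttling.
# Assumes nvidia-smi (with --query-gpu support) is on the PATH;
# stop the logger with Ctrl+C.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=temperature.gpu,clocks.gr",
    "--format=csv,noheader,nounits",
]

with open("gpu_log.csv", "w") as log:
    log.write("seconds,temp_c,core_mhz\n")
    start = time.time()
    while True:
        temp, clock = subprocess.check_output(QUERY).decode().strip().split(", ")
        log.write(f"{time.time() - start:.0f},{temp},{clock}\n")
        log.flush()
        time.sleep(1)
```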
 
Personally? I wouldn't be concerned with them taking it to that level.

I think they just don't want people frying their cards.
 

Good to see you're on the same page as me... I like to see the entire picture...

The new cooler technology is just a little misleading, don't you think? It has nothing to do with it; it's slowing down the hardware to keep the temps in line... I thought that's what the fan is for...? Haha, unless it's just too small and can't handle the temps. I'm not 100% convinced yet... It just seems to be a GTX 480 with more CUDA cores, a modified HSF, and GPU throttling. A modified GTX 480, to say the least...
 

This is probably more accurate than anything. I'm not trying to start a conspiracy theory.
 

The 580 is what the 480 would have been if Nvidia wasn't playing catch-up all year. They had to get something out, and the 480 was rushed. I'm not a fanboy; I beat up red cards and green cards, but it is what it is.
 

But 10-15% may not be the right numbers... without the throttling it could be more like 15-20%. Having a governed video card is not good for the high-end gamer... I want all of its horsepower, not part of it...
 

From what I have seen on HWbot and what I have heard from OCF benchers and team PURE, it has no effect on benchmarking or overclocking. It only kicks in when it detects Furmark or OCCT.

I've even heard rumors of AMD doing the same thing. In both companies' defense, Furmark can kill a modern high-wattage card... they are probably just trying to stop unneeded RMAs. :shrug:
 
ATI has been throttling in Furmark for a while. Seems it's caught on with Nvidia now too...
 

lol... However, I don't think you're going to see a new single-GPU card from AMD lose in a few benchmarks to its predecessor... Looks like the 580 needs to be redesigned once again... Maybe loosen how strict it is on the thermals...

I'm running the latest driver (262.99) on my GTX 480 with a modded .inf file. I can run the 580 demos from Nvidia, and I see all the eye candy... And it runs very smooth. Just like the 580...
 
The throttle seems to be less strict in games than in Furmark, but yeah, I was like "oops..." when I saw those tests... it does need to be loosened for some games.

It's also using less power with way less noise/heat.
 
There shouldn't be any throttling in any game. If you find that there is, please post proof... the bench team needs to know.

Guys, just because the 480 Lightning keeps up with the 580 in some benchmarks doesn't mean it's throttling. The 580 is NOT a new card, really... it's a tweaked 480, nothing more, nothing less. Think revision 1 vs. revision 2, not new card versus old card. Furthermore... Lightnings are very tweaked cards... one needs to compare a standard 480 to a standard 580 to see an apples-to-apples difference.
 
That chip is just a money saver for them; there is no way to tell if a chip fried because someone ran Furmark... they put that chip in so you don't fry your card with stupid temp benchmarks. Make 'em fold for 10 minutes; that temp is most likely what you will get in game...
 

The GTX 480 uses the GF100; the GTX 580 uses the GF110. It's new silicon, so it may seem to be the same but really is different... I don't own the 580 yet, and it will require more investigating before that happens... I'm sure as the card gets out into the wild, more problems may come out...
 

Yeah, but ATI's is software-based. It's embedded into the driver like a profile for a game. If you change the exe file's name, you can get her nice and warm.

The thing with Furmark is that it puts a huge load on the card. Sure, that's to stress test it and all, but it goes well beyond spec, even the TDP of the cards, which is why they started doing this.
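Which is exactly why the rename trick works: if the driver's check is just a lookup on the process name, it's conceptually no more than this (illustrative sketch with hypothetical names, not ATI's real profile list):

```python
# Toy version of exe-name-based app detection -- hypothetical names,
# not the actual driver blacklist.
BLACKLIST = {"furmark.exe", "occt.exe"}

def should_throttle(process_name: str) -> bool:
    return process_name.lower() in BLACKLIST

print(should_throttle("FurMark.exe"))         # True  -> throttled
print(should_throttle("totally_a_game.exe"))  # False -> full clocks, full heat
```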
 
I've read that there are a few people trying to find a way to flash their GTX 480 to a 580...
 

That's not a good idea... while they are very similar, they are different enough to make that most likely impossible. Good way to brick a card.
 