
GTX 1060 3GB undervolting/OC random questions


Max0r · Member · Joined Oct 18, 2005
In my ongoing quest to be fully prepared for summer bake sessions in my room by minimizing unnecessary heat wattage from my new system, I've turned my attention to my video card, a lowly (by your standards) PNY GTX 1060 3GB. With the jump from a 7700K to a 12700K, I've been playing Hellgate: London lately: sustained FPS went from 65-93 to about 250-430, and low spikes in massive battles went from the 29-43 range (WITH annoying blows to game responsiveness in the worst cases) to around 130-190 with absolutely flawless responsiveness. It should be noted this game is single-threaded, and the 12700K has a ~44% single-thread advantage over the 7700K. Under the 7700K regime, this card was fully saturating a CPU core while only using part of its GPU capacity. Under the new regime, the GTX 1060 and one of the CPU's P-cores take turns bottlenecking each other, each generally hovering around full saturation: the ideal scenario if there ever was one.

Embarrassingly enough, I've never played around with the video card settings, and now I think I should have. By now I could have saved a lot on power bills and heat misery from past summers :rofl: As I write this, I'm testing it in-game running close to its peak stock boost frequency under load, 1847 MHz, but instead of the normal 1047 mV it applies at that clock, it's at 850 mV. And I still haven't found the undervolt limit! To me this is insane: it's literally gone from using around 110 watts to 69 watts, WITHOUT UNDERCLOCKING. What's even more insane is that if I keep the stock settings but set the power limit slider to 50%, it uses only a few watts less and runs at a much lower GPU frequency. To be fair, given the relatively balanced GPU/CPU load in this game, even at that lowered frequency the FPS reduction isn't more than 10-15%, which is negligible for this scenario.
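A back-of-envelope check on why that works, for the curious: dynamic power in silicon scales roughly with voltage squared at a fixed clock, so this isn't magic. A minimal sketch using my own measured numbers (the V²·f rule is a textbook CMOS approximation, not how the card actually meters power, so treat it as a rough check):

[code]
# Rough check: dynamic power ~ C * V^2 * f. At the same 1847 MHz clock,
# the constant and the frequency cancel, leaving only the voltage ratio.
# Ignores leakage and VRM losses (which also drop at lower voltage).
stock_v, undervolt_v = 1.047, 0.850  # volts
stock_power_w = 110.0                # measured in-game at stock

predicted_w = stock_power_w * (undervolt_v / stock_v) ** 2
print(f"Predicted at 850 mV: {predicted_w:.0f} W")  # ~72 W
print("Measured  at 850 mV: ~69 W")                 # pretty close
[/code]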

As long as my games run "too fast," my ultimate goal is not just undervolting but underclocking, to minimize power draw. However, I still plan to experiment with undervolting at normal and even overclocked speeds if possible, since I'm sure at some point I will need to actually take advantage of the card's full performance.

There are some things I'm wondering about.

[attached screenshot: 1648180563023.png]

1. Memory clock. I'm confused because it seems to be listed everywhere in terms of Gbps, as 8 or 9. I've also seen some people say their memory OCed to, for example, the 9600 MHz range, and yet others say their memory OC was to, typical example, the 4700 MHz range. I'm almost wondering if the memory clock shown here is actually underclocked to begin with. Either way, I plan on testing how the memory clock affects performance/power usage.

2. What would a more average vs. best-luck undervolt be for this GPU at stock, underclocked, and overclocked speeds? I kind of feel like I'm having my cake and eating it too here. It just seems too good to be true, but maybe these results are typical, which in itself calls into question why potentially millions of cards are wasting so much power. Granted, the burden of QA can rise dramatically even when there's this much headroom for tighter settings.

3. What is your preferred method to test the stability of undervolts/OCs? For starters I've just been using Hellgate: London and the Heaven benchmark. They both appear to tax the GPU the same in terms of utilization and wattage, even though the game runs between 200-450 FPS and Heaven, with the settings I'm using, runs between 40 and 95 FPS or so. Would FurMark be a better test? I imagine these things can change between GPU generations.

4. Have you found that sometimes, when implementing custom undervolt/OC voltage/frequency curves, stability loss occurs in low-load situations, such as when no 3D apps/games are running and the GPU is closer to idle? I would like to push the power usage at idle and low load as far down as possible, especially given #5.

5. I have 4 monitors connected to this card, and it seems to always use around 25-27 watts even at relative idle. When I only had one monitor connected, it used around 6-12 watts at idle. In both situations GPU usage hovers close to 0 at all times, so I tend to wonder if the power usage numbers are always inflated because of this. It also means there is less headroom before reaching the power limit if I set it low. It also seems that before I used multiple displays, the card downclocked the GPU/memory a lot at idle, but afterward it didn't, which I see no good reason for.

6. If you have any other general insights that might be useful to me about this endeavor, let me know.
 
@ 1847 MHz, Hellgate's video bugged out at 825 mV after only a couple minutes (video froze in place but the game kept running); moving back to 850 mV for extended testing.
 
1. Memory doesn't typically use a lot of power...
1a. I'd imagine the difference in quoted memory speeds is some people using DDR (effective) rates and others the actual clock. Multiply 4700 by 2 and you're back in the 9x00 range, right? (See the quick arithmetic sketch after this list.)

2. Well, your test failed so....... :p

As far as the average goes, I don't know personally, and every card will be different.

3. FurMark... no. Power virus... move on. I prefer to loop 3DMark Fire Strike or Fire Strike Extreme. I'll also test in the game I play most frequently.

4. Yes.

5. The card needs more power to put an image on multiple monitors. It is what it is.

6. Adjust, test... rinse and repeat.
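To put actual numbers on 1a (a quick sketch; the stock figures are NVIDIA's published specs for the 1060, and the multipliers are just how GDDR5 signaling gets quoted):

[code]
# One GDDR5 memory speed gets quoted three different ways.
command_clock_mhz = 2002                       # the "real" clock
ddr_rate_mhz = command_clock_mhz * 2           # what monitoring tools often show (~4004)
effective_gbps = command_clock_mhz * 4 / 1000  # the marketing "8 Gbps" (quad-pumped)

bus_width_bits = 192                           # GTX 1060's memory bus
bandwidth_gb_s = effective_gbps * bus_width_bits / 8

print(f"DDR rate: {ddr_rate_mhz} MHz, effective: {effective_gbps:.0f} Gbps")
print(f"Peak bandwidth: {bandwidth_gb_s:.0f} GB/s")  # ~192 GB/s
[/code]

So a "4700 MHz" overclock and a "9400-9600 MHz" overclock can be the same thing quoted at different multipliers.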
 
My Gigabyte 1060 6GB did 2135 MHz / 9800 MHz (+170/+900), but I got lucky with the Samsung memory. I remember being pissed because I knew it could do more on the core, but it was hitting all the power limits. Never really tried underclocking/undervolting it, though; never thought it was worth it since it ran very cool even at full throttle.

Fire Strike and Superposition for testing. And don't forget, another reason for wildly fluctuating FPS could be that you only have 3GB of VRAM, depending on the game, ofc.
 
Many say the game Control is the ultimate stability test, being the most sensitive to overclock problems that everything else passes without issue. I don't know whether that's from a GPU clock, memory clock, or both perspective.

Until then, tentative results from casually going about it without a huge amount of testing. These are points of tentative stability (lots of Hellgate gameplay plus a couple passes each of Heaven, Superposition, and Fire Strike):

850 mV / 1860 MHz (any higher clock fails)
800 mV / 1708 MHz (haven't tested higher clocks yet)
774 mV / 1607 MHz (plan to test lower voltages here; 1708 MHz failed)

So far, Hellgate is where all the problems on failed settings show up, but that might just be because I've elected to spend more time playing it than running the other tests.

Tried FurMark just because. The power usage is insane: at 774 mV / 1607 MHz it still managed to use almost 95 watts, while other stuff uses more like 65. It doesn't seem to get that hot, though, but that's probably only because of the ridiculously low voltage. FurMark looks like the go-to for heat/power testing. Eventually I'm going to compare which test is the most sensitive to instability, and I'll retest everything with Control if possible.

I'm just loving the insane power draw reduction from undervolting so far, not to mention heat reduction. Not gonna test overclocking for the time being since it's irrelevant for me now. But in the future if I want more performance I will.
 
Delete FurMark... lol. It literally was called a power virus, and it throttles clocks. Run it at stock and watch what clocks it runs at to stay within the power limit. Last I checked on NVIDIA cards, it ran a couple hundred MHz lower than gaming clocks. In other words, it's not even testing the clocks you're playing at.

I don't know how it behaves when undervolting, but it's best not to use something that NV/AMD have called a power virus and recommend against using (in review materials).
 
Borderlands 3 picks up memory errors like no other, IMO, and it usually shows within a few minutes. My 3070 passes +1200 in Fire Strike and Port Royal without issues, but I start seeing stretched textures in-game, so I had to settle for +1100. It's the only game I've played so far that does this.
 
Delete FurMark... lol.
Why would I delete such an amusing tool? :bday:

Run it at stock and watch what clocks it runs at to stay within the power limit. Last I checked on NVIDIA cards, it ran a couple hundred MHz lower than gaming clocks.
Yes, it does exactly that at stock settings. Normally, using non-insane software, it gets limited by the voltage limit and tends to peak around 115 watts. With FurMark it runs headfirst into the power limit: with the limit maxed out, it hovers at almost 140 watts ^_^ At the stock power limit, to stay within the 125-watt constraint, it hovers around ~1650 MHz / 880 mV, while with the power limit maxed at 116% it hovers around ~1740 MHz / 943 mV.
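Quick sanity check on the slider math (assuming the percentage is applied to the 125 W stock limit, which is my reading of it, not something spelled out anywhere):

[code]
# Does the 116% power limit slider line up with the wattage I see?
stock_limit_w = 125
slider_pct = 116

ceiling_w = stock_limit_w * slider_pct / 100
print(f"{ceiling_w:.0f} W ceiling vs ~140 W observed")  # 145 W ceiling
[/code]

Close enough; hovering a few watts under the ceiling seems normal.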

In other words, it's not even testing the clocks you're playing at.
Oh, that's OK. When I'm doing my tests, the curve locks only one clock/voltage point into place under any kind of decent load; the only exception would be if the settings are high enough that FurMark can max out the TDP limit. XD

In my case, the best "stable" undervolt I've reached without really underclocking below the stock boost speed actually manages to stay under 120 watts in FurMark. ONLY possible with severe undervolting!

[attached screenshot: 2022-03-29_131031.png]

All that being said, is it really a good stability test? Thermally/power-wise, maybe. But so far I'm seeing no evidence that it makes any settings fail where other stuff didn't. Then again, I haven't done much with it yet.

Speaking of which...

Borderlands 3 picks up memory errors like no other, IMO, and it usually shows within a few minutes. My 3070 passes +1200 in Fire Strike and Port Royal without issues, but I start seeing stretched textures in-game, so I had to settle for +1100. It's the only game I've played so far that does this.

Downloading Borderlands 3 as well. I'll be messing with that + Control to see if I can make some "stable" settings fail.


——————————————————————————————————————————————————————————————————

On to other news

Unfortunately, MSI Afterburner doesn't make it easy to go below 700 mV or 700 MHz on the GPU. If you try to drag the curve below 700, it almost always snaps right back up to 700. However... if you finagle things right, you can make it stick lower. Much lower... so low, in fact, that you literally can't see a single point of the curve, and yet it applies the settings :rofl:

And it is because of this "discovery" that I have managed to find a couple of other interesting settings.

I have one profile locked in at 50 MHz / 680 mV. Sadly, I have no idea how to push the voltage lower; I'm surprised it even went below 700. Nonetheless, even with 4 monitors attached, this shaved 16 watts off idle desktop power usage to achieve a cool 12 watts at idle. However, playing a full-screen video can use 1/3 to 1/2 of the GPU and adds nearly a whole watt! It remains to be seen whether this setting can stay unsaturated when I'm messing with lots of applications on all the monitors, but it doesn't feel slow at all. Everything is silky smooth and responsive, just like at normal speeds.

Interestingly, it is reported that the voltage, power, AND thermal limits are PERMANENTLY in effect at this setting. Hey, if that's what it takes to undervolt/underclock this much, so be it. If I don't need to burn an extra 16 watts outside of games, why should I?
[attached screenshot: 2022-03-29_134345.png]

Thought I'd include a FurMark test just for Earthdog. With FurMark running, the desktop DOES lag :cry: Using a GARGANTUAN 15 watts of power. UNACCEPTABLE.

[attached screenshot: 2022-03-29_135045.png]

An additional setting of interest may be my tentative minimum rock-solid setting for flawless-feeling performance in Hellgate: London: 700 mV / 987 MHz (3499 MHz memory). What I've noticed is that even though my monitor has a 60 Hz refresh rate and around 24 ms gray-to-gray (which would potentially make it a 41 Hz monitor in some cases), when sustained FPS hovers in the 70s to 90s, the game feels obviously less responsive AT ALL TIMES (and of course it can dip much lower, even if only for a moment). It seems the minimum FPS (by which I mean the lowest it ever spikes momentarily) has to stay above 80 to feel flawless, which puts averages more in the 100s or 200s.

This reminds me of when I had a CRT running at 75 or 85 Hz long ago and noticed that if Quake 3 Arena went significantly below 120 FPS, it felt less responsive; for some reason 120 FPS was the magic number in that scenario. The feeling of responsiveness to input in games is clearly not hard-bottlenecked by visual refresh rate. It might be soft-bottlenecked, though: maybe FPS above the refresh rate yields noticeable responsiveness improvements up to a point, beyond which only a higher refresh rate/g2g would help. Of course, this effect itself would be bottlenecked by the human nervous system once the numbers get high enough.
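The frame-time arithmetic behind that paragraph, for anyone checking my numbers (the 41 Hz figure is just 1000 ms divided by the 24 ms g2g; whether g2g really caps perceived responsiveness like that is my speculation):

[code]
# Convert the refresh/response figures above into per-frame times.
def ms_per_frame(hz: float) -> float:
    return 1000.0 / hz

print(f"60 Hz refresh: {ms_per_frame(60):.1f} ms/frame")   # 16.7 ms
print(f"24 ms g2g:     {1000 / 24:.1f} Hz equivalent")     # ~41.7 "Hz"
print(f"80 FPS floor:  {ms_per_frame(80):.1f} ms/frame")   # 12.5 ms
print(f"120 FPS:       {ms_per_frame(120):.1f} ms/frame")  # 8.3 ms
[/code]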

At this optimized setting, FurMark uses a pathetic 60 watts. PATHETIC!!!

[attached screenshot: 2022-03-29_141038.png]

So at this setting the GPU is only using around 44 watts while the game still feels flawless. That's a savings of 55 watts from stock!!
[attached screenshot: 2022-03-29_141431.png]
stock:
[attached screenshot: 2022-03-29_141611.png]

Those extra FPS ain't doing anything for me. However, there remains a possibility I'll run into a situation that hammers the minimum FPS/responsiveness down hard enough that I'll want to raise the settings a bit above 987 MHz. But I've already played at these settings for quite a while, and my character's gear has tons of properties that make it explode into various kinds of elemental novas when it gets hit, and even in some big fights with tons of effects flying around, I can't say I felt any difference compared to higher settings.

Tentative Conclusions

Video card underclocking/undervolting has been an amazing success for my goal of reducing heat output. In fact, these results may be enough to produce sizeable savings on electric bills over the years, given that even idle computing has had 16 watts shaved off it as well. But the best thing by far is the reduction in heat output for my hotbox of a room. It also means more of my case/radiator fans can stay completely off, EVEN in some games, which means even more power/heat savings. And of course, all of this reduces stress on all parts involved, lengthening their lifespan.
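Putting a rough number on the bill claim (the idle hours and electricity rate below are placeholder assumptions, plug in your own):

[code]
# Ballpark yearly savings from the 16 W idle reduction alone.
watts_saved = 16
idle_hours_per_day = 12   # assumed desktop/idle time
price_per_kwh = 0.15      # USD, placeholder rate

kwh_per_year = watts_saved * idle_hours_per_day * 365 / 1000
print(f"{kwh_per_year:.0f} kWh/yr -> ${kwh_per_year * price_per_kwh:.0f}/yr")
# ~70 kWh/yr -> ~$11/yr, before counting the in-game savings
[/code]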

There's only one problem...

How the hell do I actually set it up to automatically change these settings under different loads between game/desktop? The only way I've been able to achieve those insane underclocks is by giving the curve a steep angle, dragging it way below the bottom of the window, and hoping it sticks at whatever random value it lands on, which it usually doesn't (it just gets bumped right back up) :rofl: This is what we call ADVANCED underclocking. :rofl:

Surely there is software out there that makes it easier to do this? :bang head:
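The closest thing I've found to an answer, heavily hedged since I haven't tested it end to end: Afterburner is supposed to accept -Profile1 through -Profile5 command-line switches to apply saved profiles, and NVIDIA's NVML (via the pynvml Python bindings) can report GPU utilization, so in principle a small watcher script could swap profiles based on load. The install path, threshold, and profile numbers below are all placeholders; this is a sketch, not a tested tool:

[code]
# Sketch only: poll GPU load via NVML and swap MSI Afterburner profiles.
# ASSUMPTIONS: Afterburner's "-ProfileN" switch applies saved profile N
# (profile 1 = my low-power curve, profile 2 = my gaming curve), and the
# default install path below. None of this is verified on my machine yet.
import subprocess
import time

import pynvml  # pip install nvidia-ml-py

AFTERBURNER = r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"
GAMING_THRESHOLD = 50  # % GPU utilization that counts as "in a game"
POLL_SECONDS = 5

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

active = None
while True:
    util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
    wanted = 2 if util >= GAMING_THRESHOLD else 1
    if wanted != active:  # only poke Afterburner on a state change
        subprocess.run([AFTERBURNER, f"-Profile{wanted}"])
        active = wanted
    time.sleep(POLL_SECONDS)
[/code]

Afterburner would still be doing the actual curve work; the script would just pick which saved profile is active. Whether the below-700 trick even survives a profile save/reload is its own question.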
 
