
Overclocking EVGA GTX 1080 FTW

Sounds more like a PITA than anything to me. :)

I wonder if they had issues with the switch method? But if so... I would imagine mobo manufacturers and other GPU makers would be experiencing the same problem...

First I have heard of that being a requirement though...thanks!
 
It seems that I am maxed out on my overclock, as Heaven Benchmark crashes if I go beyond +125MHz.

Also, I got a better overall score with the first BIOS, but a better minimum frame rate with the second BIOS.

1) Should I use the first or second GPU BIOS?
 
I prefer a better min frame rate myself (aka stabilized GPU frequency)...up to you.
 
It's up to you... I highly doubt you will ever notice a difference between them without an FPS counter going, honestly. Try it out and see. ;)
 
I decided to stay with the second one and get the better FPS.

Looks like I might be maxed out at +125. Should I up the volts to go higher, and if so, by how much?
 
The voltage can only be increased by 0.03 V, so it likely won't make much of a difference, but you can try it.
 
Upping the voltage may help you sustain the higher core clocks a tad bit longer, but... it's also going to generate heat faster. It's a double-edged sword. You have to find the balance between sustainable core clocks and temps, or at least the highest sustainable core clocks for the benchmark you're running. With Firestrike you'll be able to sustain the higher core clocks throughout the benchmark, because it's relatively short. Time Spy, however, takes longer, and your clocks will reduce to account for the buildup of heat by the end of the bench. Heaven and Valley are even longer, so expect quite a bit of drop-off by the time they're done running.

Very simply: while cool, the GPU core runs more efficiently and needs less voltage to run a specific clock. As the core heats up it becomes less efficient and needs more voltage to sustain that clock... which in turn causes more heat, and at some point it'll downclock itself another step. So, by raising the voltage to its maximum, it'll be able to sustain the clocks a little longer even as the GPU heats up... but, again, it will eventually get to the point that that clock needs more voltage than you can give it, and it'll downclock again.

Watch the voltage and clocks the next time you run Heaven....you'll see what I mean.
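The staircase behavior described above can be sketched as a toy model. To be clear, this is NOT NVIDIA's actual GPU Boost algorithm (that's not public); the throttle temperature, the ~13 MHz bin size, and the degrees-per-bin are all made-up numbers purely to illustrate how clocks step down as heat builds during a long benchmark:

```python
# Toy model of GPU Boost-style thermal downclocking (illustrative only;
# the real GPU Boost 3.0 behavior is NVIDIA's and not public -- the
# thresholds and step sizes below are assumptions for demonstration).

def boost_clock(temp_c, max_clock=2088, step=13, throttle_start=60):
    """Drop the core clock one ~13 MHz bin for every few degrees past
    the throttle point, mimicking the staircase you see in GPU-Z."""
    if temp_c <= throttle_start:
        return max_clock
    bins_dropped = (temp_c - throttle_start) // 5  # one bin per 5 C (assumed)
    return max_clock - bins_dropped * step

# A short bench (Firestrike-length) barely heats the card:
print(boost_clock(55))   # full boost, 2088
# A long bench (Heaven/Valley) lets heat build and clocks droop:
print(boost_clock(72))
print(boost_clock(84))
```

Run it with a few temperatures and you'll see the same stepped droop the post describes: full boost while cool, then one bin lost at a time as the core heats up.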

I also have a couple of the 1080 FTWs. I've never had to disconnect the power from the GPU to get the switch to work. It makes sense, though. I have had it take a couple of power cycles to take effect.
 
Hmmm.... I think I will leave the voltage alone and settle with
GPU Clock = +125
Memory Clock = +50

So how do I know what exact speed my GPU is at? It seems to fluctuate... for example:
At idle
GPU Clock = 1379MHz
Memory Clock = 5054MHz

During testing
GPU Clock = 2088MHz/Heaven Benchmark was showing 2138MHz
Memory Clock = 5054MHz/Heaven Benchmark was showing 5055MHz

1) So is the idle clock the base speed, and the in-game clock the boost speed?
2) Precision X OC is telling me one clock speed and Heaven Benchmark is telling me another. Which one is giving me the true clock speed?
 

Open up GPU-Z to the Sensors tab, and monitor clocks / voltages / temps with it. The benchmarks don't read clocks right; Precision X OC will be right, but it's better to use GPU-Z. It gives you much better information with which to fine-tune your overclocks.

Like this:

[Screenshot: GPU-Z Sensors tab]
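If you'd rather log clocks over time than eyeball the Sensors tab, `nvidia-smi` can dump them as CSV (e.g. `nvidia-smi --query-gpu=clocks.gr,clocks.mem,temperature.gpu --format=csv,noheader`). A small sketch of parsing one sample line of that output, so you could graph the droop later; the sample values are just the numbers from this thread, not real captured data:

```python
# Parse one CSV line from:
#   nvidia-smi --query-gpu=clocks.gr,clocks.mem,temperature.gpu \
#              --format=csv,noheader
# Field names are the real query options, but available fields vary by
# driver version, so treat this as a sketch rather than a guarantee.

def parse_gpu_sample(csv_line):
    """Split 'core [MHz], mem [MHz], temp' into a dict of ints."""
    core, mem, temp = [v.strip() for v in csv_line.split(",")]
    return {
        "core_mhz": int(core.split()[0]),   # "2088 MHz" -> 2088
        "mem_mhz": int(mem.split()[0]),     # "5054 MHz" -> 5054
        "temp_c": int(temp),                # temperature has no unit in CSV
    }

sample = "2088 MHz, 5054 MHz, 71"  # illustrative values from this thread
print(parse_gpu_sample(sample))
```

Loop that every second during a Heaven run and you'll get the same picture GPU-Z gives you, in a form you can plot.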
 
I have been using GPU-Z and looking at the Sensors tab, so I guess you answered my question about which one is giving me the accurate reading.
 
Yeah... the Unigine benchmarks don't read clocks accurately. The only time I noticed them reading clocks right was with a modded BIOS on the Maxwell GPUs... haven't figured out why, though. Anyway, g'luck with your overclock. I hope what I shared helps you out.
 