Ok... I have been waiting for the ASUS ROG STRIX-GTX1080-O8G-GAMING OC EDITION to come back in stock at NewEgg for two months now, and it is still not available. So I went to Fry's Electronics today, and they had the ASUS ROG STRIX-GTX1080-A8G-GAMING in stock. The difference is that the O8G is beefier and factory overclocked. So I decided to overclock my ASUS ROG STRIX-GTX1080-A8G-GAMING myself, using the valuable information I picked up while overclocking the FTW I had before. This card overclocked really well... I maxed out at

GPU Boost Clock = 1972MHz
Memory Clock = 11010MHz

But GPU-Z is saying that the clocks are

GPU Boost Clock = 2050MHz
Memory Clock = 1377MHz

Why is the GPU-Z giving me such a low Memory Clock when I have it set to 11010MHz?
Do the math... 1377 * 4 * 2 = 11016 for GDDR5X. That is how it works. GPU-Z just shows the actual MHz, not the 'apparent' GDDR5X speed.

Also, what is your ACTUAL boost clock (check in the sensor tab after running a game)?
Ok... I'm still not grasping the whole deal. So the actual memory clock is 1377MHz?

Also, what is that equation? 1377 is the MHz, then *4? *2?
Correct, that is the actual MHz as shown in the first post.

* = multiply

GDDR5 was always multiplied by four (quad pumped), but GDDR5X adds a quad-data-rate mode with a deeper prefetch (16n vs. GDDR5's 8n, I think), which doubles the data rate, so the 'apparent' clock = actual MHz x 4 x 2.
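The arithmetic above can be sketched in a few lines. This is just a minimal illustration of the multipliers described in this thread (the function names are mine, not from any tool):

```python
# Sketch: convert GPU-Z's "actual" memory clock to the advertised
# effective (apparent) rate, and back. Multipliers follow the post
# above: GDDR5 is quoted at 4x the actual clock; GDDR5X's deeper
# prefetch doubles that to 8x.

def effective_mhz(actual: float, gddr5x: bool = True) -> float:
    """Apparent data rate from the actual memory clock."""
    return actual * (8 if gddr5x else 4)

def actual_clock_mhz(effective: float, gddr5x: bool = True) -> float:
    """Actual memory clock back from an advertised effective rate."""
    return effective / (8 if gddr5x else 4)

# Numbers from this thread (GTX 1080, GDDR5X):
print(effective_mhz(1377))        # 11016 -> the ~11010MHz set in the OC tool
print(actual_clock_mhz(11010))    # 1376.25 -> the actual MHz GPU-Z reports
```

So the ~1377MHz GPU-Z shows and the 11010MHz set in the overclocking tool are the same setting, just quoted on different scales.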
Well, after I run the Heaven benchmark it says:

GPU Core Clock = 2050MHz
Memory Clock = 1319MHz (I lowered my memory clock)
I mean that GPU-Z shows the card was running at 2050MHz while I was playing the game... actually, it showed a max of 2088MHz.