
6600K Preset overclock to 4.4 Works Flawless. Can't go any higher

Again, unless the RAM is bad, it should NOT give you a ~100 FPS loss regardless of being single channel or not. Apparently there was some sort of patch roughly a month ago that tanked FPS; might that be your issue?

- "I was experiencing the frame drops during the infamous frame destroying patch"
 
Honestly... if I were you, I would return/sell everything you have and buy 2x8GB DDR4-3000 CL14/15. 4x4GB leaves no room for expansion and is again a curious purchase. Ask us next time first! :)

My father owns a repair shop, so I'll just give him the old RAM. I already ordered Kingston XMP DDR4-3000MHz sticks. I don't think I'll need more than 16GB of RAM anytime soon; by the time I might, we'll probably have DDR5 anyway :D

When I ordered the single 16GB stick, I was really rushed; I blame it on that :D

- - - Updated - - -

Yeah, I know Prime95 and FurMark aren't considered realistic tools, but you can't deny that if your CPU can withstand 12 hours of the Small FFT test in Prime95, there isn't much chance of any other application causing problems.
 
BTW, 4 sticks usually give lower overclocking results than 2 (the IMC being stressed more).

Sure you did not rush on this one either? ;)
 
I didn't say P95 was overkill... I said Furmark shouldn't be used. ;)

Prime 95 is fine.

BTW, 4 sticks usually give lower overclocking results than 2 (the IMC being stressed more).

Sure you did not rush on this one either? ;)
it will still hit xmp... he may need more SA voltage though...

... feels rushed again, yeah.
 
Again, unless the RAM is bad, it should NOT give you a ~100 FPS loss regardless of being single channel or not. Apparently there was some sort of patch roughly a month ago that tanked FPS; might that be your issue?

- "I was experiencing the frame drops during the infamous frame destroying patch"

I think a single stick of RAM can bottleneck the CPU pretty seriously. We'll just have to wait and see :)

My issue is not limited to Overwatch; the first game I tried on this new system was Far Cry 4. In the town at the beginning my FPS would dip down to 30-35, and in general the game wasn't running anywhere near 60 FPS on ultra. At the time I thought it was just the trademark Ubisoft ****ty console port issue. Now I look at other benchmarks like this one

http://www.notebookcheck.net/Far-Cry-4-Benchmarked.131069.0.html

and it's clear that something is seriously wrong with my system.

The other clue that my problem is related to the CPU or RAM is the FurMark benchmark score. When testing with the 1080p preset, I get 118 FPS average, and the score is pretty much on par with a GTX 1080 Founders Edition. So why am I getting such bad performance in real games? Because FurMark barely uses any CPU or RAM; it's nearly a pure GPU test.

- - - Updated - - -

BTW, 4 sticks usually give lower overclocking results than 2 (the IMC being stressed more).

Sure you did not rush on this one either? ;)

Yeah, I saw those benchmarks, but the performance loss is <1% in most cases. I can live with that to see all my DRAM slots filled :D OCD strikes again...
 
Because FurMark isn't a "real game".

As far as single vs. dual: http://www.gamersnexus.net/guides/1349-ram-how-dual-channel-works-vs-single-channel?showall=1

It doesn't matter much, but you want dual. :)
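For context, the theoretical gap between single and dual channel is easy to work out: dual channel doubles the memory bus from 64 to 128 bits. A minimal sketch of that arithmetic (the function name is mine, and these are peak theoretical numbers, not real-world gains):

```python
# Back-of-envelope theoretical memory bandwidth: single vs. dual channel.
# DDR4 transfers 64 bits (8 bytes) per channel per transfer; the "3000" in
# DDR4-3000 is mega-transfers per second (MT/s).

def bandwidth_gbs(mt_per_s: int, channels: int, bus_bytes: int = 8) -> float:
    """Peak theoretical bandwidth in GB/s (decimal gigabytes)."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

single = bandwidth_gbs(3000, channels=1)  # one stick -> single channel
dual = bandwidth_gbs(3000, channels=2)    # two (or four) sticks -> dual channel

print(f"DDR4-3000 single channel: {single:.1f} GB/s")  # 24.0 GB/s
print(f"DDR4-3000 dual channel:   {dual:.1f} GB/s")    # 48.0 GB/s
```

The 2x peak almost never shows up in games, which is why the article above finds only small FPS differences.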

I read that article before. I don't think it applies to my situation. He tests whether there is a performance increase to be had with dual channel; my question is whether single channel RAM is bottlenecking my system.

Think about it this way: I can theoretically run my 6600K @ 4.4GHz using the stock Intel cooler. But as soon as the temps reach 90C, thermal protection would kick in and floor the clock speed. Technically speaking, I'm running my CPU @ 4.4GHz with the stock cooler, but in reality thermal throttling makes sure I won't get the same benefits as using an aftermarket cooler.

What I believe to be happening is that either my current RAM's ****ty specs or the low bandwidth afforded by a single stick is preventing the CPU from working efficiently.

Here is something else I just remembered: while messing around with overclock settings, I once tried to play Overwatch, and I was getting insane stutters with my CPU load locked at 100%. I went into the BIOS and disabled the RAM overclock (which was set to 3000MHz with quite high timings); after that, Overwatch FPS increased to 140. I don't know if this incident is related to what we are discussing right now, but I just wanted to throw it out there.
 
I think a single stick of RAM can bottleneck the CPU pretty seriously. We'll just have to wait and see :)

My issue is not limited to Overwatch; the first game I tried on this new system was Far Cry 4. In the town at the beginning my FPS would dip down to 30-35, and in general the game wasn't running anywhere near 60 FPS on ultra. At the time I thought it was just the trademark Ubisoft ****ty console port issue. Now I look at other benchmarks like this one

http://www.notebookcheck.net/Far-Cry-4-Benchmarked.131069.0.html

and it's clear that something is seriously wrong with my system.

Have you re-checked your BIOS settings? It might be something as simple as the RAM not getting enough voltage, the BIOS not being updated, or XMP not working correctly. Or you might have a bad stick; have you run any tests on it?
 
I read that article before. I don't think it applies to my situation. He tests whether there is a performance increase to be had with dual channel; my question is whether single channel RAM is bottlenecking my system.
Umm.... it's a SINGLE vs DUAL channel article, bub. :)

It has nothing to do with the specs of the sticks.
 
All I did was change the CPU Upgrade mode to i7 6700K 4.5GHz, then set vcore to "normal" and increase the offset by +0.075. I did nothing else; it's been running this way 24/7 since June.

When running Prime95, you should see a vcore in CPU-Z of 1.272v to 1.332v.
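For anyone curious where that range comes from: offset ("normal"/adaptive) mode just adds a fixed amount on top of the CPU's own requested voltage (VID), which varies with load. A quick sketch of the arithmetic; the stock VID range below is inferred from the quoted numbers, not measured:

```python
# Offset overclocking: final vcore = stock VID (load-dependent) + fixed offset.
OFFSET_V = 0.075  # the +0.075 V offset from the post above

# Inferred stock VID range that would produce the quoted 1.272-1.332 V:
stock_vid_low, stock_vid_high = 1.197, 1.257

vcore_low = round(stock_vid_low + OFFSET_V, 3)
vcore_high = round(stock_vid_high + OFFSET_V, 3)
print(f"expected vcore range: {vcore_low} V to {vcore_high} V")  # 1.272 V to 1.332 V
```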

Woah, this worked. Been running Prime95 for 15 minutes; so far so good. Core #1 reaches 85C, the others are around 76. But I just turned on the AC and temps started to drop as I type. And there is no other program that stresses the CPU like Prime95 anyway, so I'm not concerned about temps.

Although I'm still curious why I can't manage to get the same results overclocking manually. Wish there was a way to see what the preset values are.

Thank you very much for your help man. Much appreciated.

Now, I'll just wait for the new rams to arrive.
 
Have you re-checked your BIOS settings? It might be something as simple as the RAM not getting enough voltage, the BIOS not being updated, or XMP not working correctly. Or you might have a bad stick; have you run any tests on it?

I just loaded optimized settings and made the modifications wingman99 suggested. I'm at 4.5GHz right now and the system seems stable. I don't know what I mess up when I try to do a manual OC, but I'm happy with where the preset OC brought me. I think I'll just stay here.

The only thing is that Core #1 is around 10C higher than the other cores under load. Not a huge concern, just curious.

As for RAM, I tried XMP and AUTO. They both seem to be the same thing.

Here comes the fun part... I just played a 20-minute Overwatch match and realized I'm a ******... The reason I was getting 140 FPS while others were getting 230 was that GeForce Experience had set the Resolution Scale to 150% :rolleyes: Once I changed it to 100%, I got 230-260 FPS...
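That 150% render scale costs more than it sounds: it applies to both axes, so the GPU renders 2.25x the pixels. A quick check of the arithmetic (the helper function here is just for illustration):

```python
# Render scale applies to width AND height, so pixel cost grows with the square.
def render_resolution(width: int, height: int, scale_pct: int) -> tuple[int, int]:
    """Internal render resolution for a given resolution scale percentage."""
    s = scale_pct / 100
    return round(width * s), round(height * s)

native = 1920 * 1080
w, h = render_resolution(1920, 1080, 150)
print(f"150% of 1080p renders at {w}x{h}")          # 2880x1620
print(f"pixel cost: {w * h / native:.2f}x native")  # 2.25x native
```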

Although I'm pretty sure there is still something shifty going on. The CPU is constantly at 90%+ load (which is normal for Overwatch), but the temps don't go above 56C. Before you say it's a GPU bottleneck: GPU load is at ~80% and TDP is at 60%.

I'm still thinking the RAM is causing some kind of bottleneck. I'll let you guys know when the new sticks arrive.

Btw, can anyone suggest a good overall benchmark? The ones I have are mostly geared toward a specific part (FurMark, Prime95, etc.). I also have AIDA64, but that program seemed unreliable to me.

- - - Updated - - -

Woah, this worked. Been running Prime95 for 15 minutes; so far so good. Core #1 reaches 85C, the others are around 76. But I just turned on the AC and temps started to drop as I type. And there is no other program that stresses the CPU like Prime95 anyway, so I'm not concerned about temps.

Although I'm still curious why I can't manage to get the same results overclocking manually. Wish there was a way to see what the preset values are.

Thank you very much for your help man. Much appreciated.

Now, I'll just wait for the new rams to arrive.

Yeah, previously I had an Asus board and an i5 3570K. Things were far simpler with that mobo. Gigabyte seems dedicated to making things as confusing as possible. It took me a week to realize there was a 'Normal' setting for vcore, and another few days to realize it corresponds to Adaptive on other mobos.
 
Umm.... it's a SINGLE vs DUAL channel article, bub. :)

It has nothing to do with the specs of the sticks.

We shall see in a few days. Even if I don't get any performance upgrade, at least 4 modules will look more pleasing than a single one, yeah? :clap:
 
Here comes the fun part... I just played a 20-minute Overwatch match and realized I'm a ******... The reason I was getting 140 FPS while others were getting 230 was that GeForce Experience had set the Resolution Scale to 150% :rolleyes: Once I changed it to 100%, I got 230-260 FPS...

Although I'm pretty sure there is still something shifty going on. The CPU is constantly at 90%+ load (which is normal for Overwatch), but the temps don't go above 56C. Before you say it's a GPU bottleneck: GPU load is at ~80% and TDP is at 60%.

I'm still thinking the RAM is causing some kind of bottleneck.

Not uncommon to have CPU spikes; World of Warcraft has been running 90%+ on the CPU since the Legion patch and yet it doesn't really warm up, so it's only certain areas. Also, I'm not seeing the CPU or GPU bottlenecking anything at 1080p. If I OC my system to the max (my 980Ti is usually at default OC because of the fan noise), I would have no issue hitting 250-300 FPS in Overwatch, so even though you have the 6600K you should be roughly in the ballpark.
 
Not uncommon to have CPU spikes; World of Warcraft has been running 90%+ on the CPU since the Legion patch and yet it doesn't really warm up, so it's only certain areas. Also, I'm not seeing the CPU or GPU bottlenecking anything at 1080p. If I OC my system to the max (my 980Ti is usually at default OC because of the fan noise), I would have no issue hitting 250-300 FPS in Overwatch, so even though you have the 6600K you should be roughly in the ballpark.

https://linustechtips.com/main/topi...k-will-bottleneck-a-gtx-1070-or-above/?page=1

This is interesting. Apparently in some gaming scenarios the 6600K, even with an OC, can bottleneck a GTX 1070. And from what you are saying, even an i7 6700K might bottleneck... Man, what am I to do? Use a GTX 1080 as a CPU? :screwy:

http://wccftech.com/fallout-4-performance-heavily-influenced-by-ram-speed-according-to-report/

This link also suggests that "faster RAM won't affect your gaming that much" is wrong, as 2400MHz RAM gets 24% more FPS compared to 1600MHz.

This is what they have to say:

Well, click on the shot above to see that – yes – faster RAM can make a difference in general gameplay, even with Fallout 4’s v-sync cap, and without an outlandish lack of balance in system components. Essentially, when the CPU is the bottleneck, faster RAM can provide an often dramatic increase in performance.
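It's worth comparing that 24% against the raw bandwidth difference between the two speeds. A rough back-of-envelope calculation, assuming peak bandwidth scales linearly with transfer rate:

```python
# 1600 MT/s vs 2400 MT/s: peak bandwidth scales with transfer rate, FPS does not.
bandwidth_gain = 2400 / 1600 - 1  # 0.50 -> 50% more peak bandwidth
fps_gain = 0.24                   # the 24% Fallout 4 gain reported above

print(f"bandwidth: +{bandwidth_gain:.0%}, FPS: +{fps_gain:.0%}")
print(f"roughly {fps_gain / bandwidth_gain:.0%} of the bandwidth gain shows up as FPS")
```

So even in this best-case game, only about half of the extra bandwidth translates into frames; in most titles the fraction is far smaller.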
 
https://linustechtips.com/main/topi...k-will-bottleneck-a-gtx-1070-or-above/?page=1

This is interesting. Apparently in some gaming scenarios the 6600K, even with an OC, can bottleneck a GTX 1070. And from what you are saying, even an i7 6700K might bottleneck... Man, what am I to do? Use a GTX 1080 as a CPU? :screwy:

What I see is someone trying to game at max settings and stream at the same time on a 4-core; that is going to cause texture load issues if the CPU can't cope, which doesn't happen on an i7 or X99. There are plenty of other videos with your setup, or close to it, that don't show those hiccups.


http://wccftech.com/fallout-4-performance-heavily-influenced-by-ram-speed-according-to-report/

This link also suggests that "faster RAM won't affect your gaming that much" is wrong, as 2400MHz RAM gets 24% more FPS compared to 1600MHz.

Fallout 4 is the only game I know of that actually gets much higher FPS with faster RAM (thread below); most other games gain a few FPS with, for example, 1866 vs 2400 vs 3200 vs 3866. Not much, but a few.

http://www.overclockers.com/forums/...-making-significant-differences-in-benchmarks
 
What I see is someone trying to game at max settings and stream at the same time on a 4-core; that is going to cause texture load issues if the CPU can't cope, which doesn't happen on an i7 or X99. There are plenty of other videos with your setup, or close to it, that don't show those hiccups.




Fallout 4 is the only game I know of that actually gets much higher FPS with faster RAM (thread below); most other games gain a few FPS with, for example, 1866 vs 2400 vs 3200 vs 3866. Not much, but a few.

http://www.overclockers.com/forums/...-making-significant-differences-in-benchmarks

Matias_Chambers's last video with The Witcher 3 seems to say otherwise. I don't think that guy is streaming during that test.

The fact that faster RAM CAN cause that big of an FPS increase is what really interested me. Otherwise I agree that in most cases there wouldn't be more than 1-2 FPS difference between average RAM and top-of-the-line RAM.
 
Listen to what he says after 2m10s:

A 4.5GHz 6600K can be WORSE than a 4.2GHz 6700K in quite a few games, simply because of the extra threads, and not only in terms of FPS. GTA V is one of them because it's optimized for 8 threads.
 
Newer games, especially large open-world titles, are going to benefit from hyper-threading. On-screen displays (GUIs), terrain caching, and other features are being coded on separate threads to improve performance. RAM may have some minimal impact on that when passing data from the hard drive to the CPU/GPU.
 
Listen to what he says after 2m10s:

A 4.5GHz 6600K can be WORSE than a 4.2GHz 6700K in quite a few games, simply because of the extra threads, and not only in terms of FPS. GTA V is one of them because it's optimized for 8 threads.

Digital Foundry only showed 10 FPS less for a stock i5 6600K at 3.6GHz compared to an i7 6700K at 4.0GHz in GTA V, and he said the i5 6600K can match the i7 6700K with a little overclocking.

Hyper-threading only adds a small percentage; it's not real cores, it just schedules a second thread onto a core when it can. The real difference between the i5 6600K and the i7 6700K is clock speed.
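The clock-speed point is easy to sanity-check. Assuming, very naively, that FPS scales linearly with core clock in a CPU-bound game (a best-case model; the 3.6 GHz stock clock is from the Digital Foundry comparison above, and the 90 FPS baseline is a made-up illustrative number):

```python
# Naive linear-scaling model: CPU-bound FPS ~ core clock.
def scaled_fps(base_fps: float, base_clock_ghz: float, new_clock_ghz: float) -> float:
    """FPS predicted by pure clock scaling (upper bound, not a real benchmark)."""
    return base_fps * new_clock_ghz / base_clock_ghz

i5_stock = 90.0  # hypothetical i5 6600K @ 3.6 GHz baseline, for illustration only
i5_oc = scaled_fps(i5_stock, 3.6, 4.5)  # same chip overclocked to 4.5 GHz
print(f"i5 @ 4.5 GHz (naive model): {i5_oc:.1f} FPS")  # 112.5 FPS

# Clock alone gives +25% (4.5/3.6); real gains are smaller once the GPU,
# RAM, or thread count becomes the limit.
```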
 