
Is the i7-6700K worth $50 more than the i7-6700?


goto35

Registered
Joined
Feb 16, 2016
The non-K is $50 cheaper. Just debating whether the $50 is worth it for overclocking?

Will be using primarily for gaming and video/photo editing.

Will be connected to my 70-inch 4K TV as my main monitor.

Will have CrossFire 290Xs along with a Samsung 950 drive.

What do you think!?

- - - Updated - - -

Forget it, just saw the speed difference. Will be getting the K.
 
Well, the non-K has a locked multiplier, so you are limited in how you can OC.

If you want to maximize your OC, then yes, it is. If you're just looking for a slight bump, probably not. It's kind of a personal-preference thing.
 
K has been ordered! My budget has doubled since I had the smart idea of building this gaming rig.
 
The only thing I would change is the CrossFire 290Xs; I would go for a single R9 Fury, unless most of the games you are going to play support CrossFire.
 
The vast majority of games support multi-GPU. Scaling varies, but the support is there with most titles.
 
Not as big an advocate of XFire, but I have been running SLI since the 6K series and, on and off, XFire since the 3K series, with zero issues other than a few drivers not ready for prime time.

The K will have better resale value when the time comes, good choice.
 
I agree; I am very fond of SLI. I second changing out the AMD graphics cards for Nvidia instead.

I have noticed that some games do not make use of SLI out of the box, but SLI can be forced. For example, X-Com 2 runs at 30 FPS with V-Sync on with one GTX 970 on my machine at max settings at 1080p. If I set the game's profile to Force Alternate Frame Rendering 2, I hit the frame-rate ceiling with V-Sync on. What's more, the frame rate is silky smooth, just as I would expect. SLI hasn't let me down yet.
 

Have you tried Adaptive V-Sync? You could get higher FPS.
 
How? Adaptive and regular V-Sync both limit you to your monitor's refresh rate. Adaptive is just gentler BELOW that FPS...

... or are you saying that since his rig can't hit 60 he should use adaptive?
 

Mine can hit 60 FPS all day long when SLI is running, but without SLI my graphics are meh.

I didn't try Adaptive V-Sync because thankfully I was able to force SLI and get my full graphics speed. Once I hit 60 FPS I stopped worrying about improving the performance, because it doesn't dip below 60 FPS even when doing a shadow recording in the background.

Although I am considering upgrading the graphics. The two GTX 970 cards are great in SLI mode, but if a game doesn't run SLI I get stuck in the 30 FPS realm at max settings. This is probably getting out of the scope of this topic, though. I'm going to go browse what is coming down the pipe from Nvidia.
 
Part of the reason I asked Wingy why... but you do have that solution if you're at one card. :)
 

Good to know :D, because V-Sync is brutal on the FPS but tearing makes my eyes bleed. LOL

Well, regular V-Sync works in steps; when you drop below the refresh rate it snaps down (60, 30, 20 FPS, and so on).

Interesting, that makes sense. With SLI off and V-Sync off, one GTX 970 in my machine can run X-Com 2 at max settings at 1080p at around 40 to 45 FPS.

So with SLI on it can hit 60 FPS no problem, because effectively the two cards are working in tandem like a tag team. That explains the frame rate of ~82-88 FPS with SLI on and V-Sync disabled. The overhead of forcing SLI is not very much in this case. However, forcing mode 1 instead of mode 2 gets a frame rate of 12 FPS, so it is not so good. LOL
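To put rough numbers on the "steps" behavior, here is a quick back-of-envelope sketch in Python. It is illustrative only: the FPS figures are the ones quoted in this thread, not new measurements, and it models classic double-buffered V-Sync on a 60 Hz display, where a frame that misses the refresh deadline waits for the next one, so the rate quantizes down to 60/n.

```python
REFRESH_HZ = 60

def vsync_fps(raw_fps):
    """Regular double-buffered V-Sync: below the refresh rate, the frame
    rate snaps down to the nearest divisor of it (60, 30, 20, 15 ...)."""
    if raw_fps >= REFRESH_HZ:
        return REFRESH_HZ
    n = 1
    # find the smallest n such that 60/n fits within the raw frame rate
    while REFRESH_HZ / n > raw_fps:
        n += 1
    return REFRESH_HZ / n

def adaptive_vsync_fps(raw_fps):
    """Adaptive V-Sync: cap at the refresh rate, otherwise run uncapped
    (with possible tearing below 60)."""
    return min(raw_fps, REFRESH_HZ)

# Single GTX 970 as described above: ~40-45 FPS raw
print(vsync_fps(43))           # 30.0 -- snaps down a whole step
print(adaptive_vsync_fps(43))  # 43   -- no step, but tearing possible

# SLI pair, ~82-88 FPS raw: both simply cap at 60
print(vsync_fps(85))           # 60
print(adaptive_vsync_fps(85))  # 60
```

This is why a card averaging 43 FPS "feels" like a 30 FPS card with regular V-Sync on, while Adaptive V-Sync lets it run at its real 43 FPS.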
 
Well, if you have a 60 Hz monitor I would use Adaptive V-Sync all the time; that way it will just keep the FPS between 40 and 60 for you, unless you have stutter and need it to be either 60 or 30 FPS.
 
He just uses SLI to reach 60 FPS with V-Sync/Adaptive. I would use Adaptive anyway, as there is less input lag versus regular V-Sync.
 
Input lag comes from game stuttering; V-Sync helps, but it is older tech. Adaptive V-Sync is newer tech that avoids inducing stutter by not dropping from 60 to 30 FPS; there is no lag from Adaptive V-Sync, it just limits the frame rate to the monitor's refresh rate. G-Sync is the newest tech, which locks the monitor's refresh rate to the GPU.
 
We went over this already, I thought... there is still input lag with Adaptive V-Sync; it is just less than with regular V-Sync. At least, I sure notice some difference between all three: off, adaptive, and regular V-Sync. I wouldn't bet my life on it, LOL!
 