
FX-8350: Should I overclock?


motherboard1

Member
Joined
Jan 7, 2006
I'm contemplating a mild overclock (4.4 GHz, maybe) with my build. I've been on a stock cooler for the last few years, so obviously I'd be looking into buying something simple like a 212 EVO, but even that's around $30 US, plus maybe more case modifications/fan installations. So there is really more than one question being asked here.

Beyond whether 4.4 is realistic on my current board, there is also the question of whether a 10% overclock is worth it for the purpose of trying to break a CPU bottleneck in newer games.

I recently started Rise of the Tomb Raider, and for the first time I've found a game that runs all 8 CPU cores at 60%-90% and bottlenecks my GTX 980 Ti at 60% usage in some areas. At 1080p, or at 3440x1440 with a 1.20 DSR factor, I get the same 30-35 fps. Before this I had not experienced a game that would bottleneck the GPU below 60 fps with all the eye candy on. This is more than a mild bottleneck.

So I'm stuck wondering: should I expect more than a 10% gain in FPS from a 10% CPU overclock? If the answer is no, then clearly it's not worth buying gear for a mild overclock. I know nobody wants to guess what kind of performance gains this will yield, so I'll simply ask this: what would you do?
 
It's hard to say. I would think a lot of it depends on getting the most possible overclock out of your FX-8350. But expecting much of an overclock while only cooling it with a Hyper 212 EVO is probably a pipe dream. Unless you are willing to pony up for a more expensive cooling solution, throwing down the $30 for the Cooler Master cooler you mention is probably going to be a waste of money. Something on the order of a Noctua NH-D14 or NH-D15 would be the minimum starting point if you will be sticking with air; with that you could more than likely get a small to moderate overclock. But really, that CPU calls for a high-end AIO water loop or, better yet, a custom loop. And when I say a high-end AIO I mean one with a three-fan radiator. Otherwise you will get only a marginal cooling improvement over top-end air.
 

Even with such a project I'd be gambling on this motherboard, though. And at the end of the day, even if I were to hit 5 GHz, if going from 30 to 37.5 fps is what I'm reasonably looking at for all my efforts, it isn't worth it for my limited purpose.

[Attachment: CPU bottleneck.jpg]

On a related note, can somebody explain to me what's going on here? I have trouble wrapping my head around this graph from a popular YouTube channel. It seems to imply that putting more workload on the GPU will actually increase frame rates in the event of a CPU bottleneck, but I haven't experienced this in testing, nor could I understand mechanically how it would ever work out that way.
 
With what you want to do I would go with an Intel 4790K rig or, better yet, Skylake.
You will not get the throughput with the 8350, even at 5.2 GHz, that you would with a 4790K rig at 4.8.
I game GTA V on both an 8350 and a 4790K, my 8350 at 5.2 and my 4790K at 4.8, and I'll take the 4790K rig any day.
The board you would need is an ASUS Crosshair V Formula-Z, and then add 300-400 bucks for cooling the 8350; the cooling system for it will be massive, loud, and outside the case, and the 4790K will still perform better.
The board you have will FRY trying to run a top-shelf game.
 

Agreed! Save your pennies and save your dimes:
 
And at the end of the day, even if I were to hit 5 GHz, if going from 30 to 37.5 fps is what I'm reasonably looking at for all my efforts, it isn't worth it for my limited purpose.
read my link... :)

10% of 30 is 3, so that's 33 fps. Getting to 37.5 would take a 25% increase.
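
To spell out that arithmetic (a throwaway sketch; the 30 fps floor is taken from the screenshot, and fps scaling 1:1 with clock speed is a best-case assumption):

```python
base_fps = 30.0

# A 10% CPU overclock, even if fps scaled 1:1 with clock speed:
print(round(base_fps * 1.10, 1))   # 33.0 fps

# Reaching the 37.5 fps mentioned above would need a 25% uplift:
print(37.5 / base_fps - 1.0)       # 0.25
```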
On a related note, can somebody explain to me what's going on here? I have trouble wrapping my head around this graph from a popular YouTube channel. It seems to imply that putting more workload on the GPU will actually increase frame rates in the event of a CPU bottleneck, but I haven't experienced this in testing, nor could I understand mechanically how it would ever work out that way.
As the resolution went up, putting more load on the GPU, fps went down. The gap between the two CPUs is smaller the higher the resolution, as more of the work is being done by the GPU.
 
read my link... :)

10% of 30 is 3, so that's 33 fps. Getting to 37.5 would take a 25% increase.

I was assuming a 25% increase, from 4.0 GHz stock to a theoretical 5.0 GHz.


As the resolution went up, putting more load on the GPU, fps went down. The gap between the two CPUs is smaller the higher the resolution, as more of the work is being done by the GPU.

Nope, look at the graph again: it's showing the 2600K with a min FPS of 35 @ 1080p and a min FPS of 42 @ 1440p, making it appear as though a higher GPU workload will actually alleviate some of the bottleneck in min FPS. I've not been able to achieve this kind of thing myself in testing, and I'm somewhat in disbelief that it would ever work that way, but it's not the first time I've seen somebody imply that raising the resolution will help alleviate a CPU bottleneck.

So I ask myself each time I hear something like this: why would min FPS ever increase with a resolution increase? I would think that, best-case scenario, min FPS would stay the same.
 
You said 4.5 earlier...but that isn't what I was talking about anyway... the fps increase is what I was on about. :)

What doesn't make sense? The higher the resolution, the more of the load is handled by the GPU. Down low, on a heavily CPU-based game, the CPU is too busy, would be my stab at it.
 
So I ask myself each time I hear something like this: why would min FPS ever increase with a resolution increase? I would think that, best-case scenario, min FPS would stay the same.
I was puzzled by the same thing a while back, and I will explain how I understand it. For starters, an FPS increase will also depend on the GPU, meaning that if you have a budget GPU you're not going to see an increase in FPS going from 1080p to 1440p. So let's take your GTX 980 Ti (these numbers are just for example). Say it can put out 200 fps in game X at 1080p. In order to get those fps it has to be set up, so to speak, by the CPU, which gives it the info it needs. If the CPU is at its limit, due to the lower IPC (instructions per clock) of the 8350, the frame rate will not be 200, because the GPU cannot get the instructions fast enough to produce its output, which is the fps. Now say at 1440p the 980 Ti can get 150 fps; if the CPU can give that amount of info to the GPU without being maxed out, the GPU will produce 150 fps.

Now the question is why fps can increase. My only guess is that when the CPU is maxed out it gets bogged down by all the instructions waiting, and therefore actually produces fewer fps than if it were running at, say, 90% load.

I will note that I am no expert on this; it's just what I've gathered from being interested in the subject.
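
A toy way to picture the "slowest stage wins" part of that (the numbers below are invented purely to illustrate; real engines pipeline frames, so this is only a rough model):

```python
# Rough model: each frame is delivered no faster than the slower of the two stages,
# because the GPU can't render a frame the CPU hasn't prepared yet. Numbers invented.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 33.0                       # ~33 ms of CPU work per frame in a heavy scene
print(fps(cpu_ms, gpu_ms=5.0))      # 1080p: GPU loafing, still ~30 fps -> CPU-bound
print(fps(cpu_ms, gpu_ms=7.0))      # 1440p: GPU works harder, still ~30 fps -> CPU-bound
print(fps(cpu_ms, gpu_ms=40.0))     # much higher res: GPU becomes the limit, ~25 fps
```

In a simple model like this, min fps never goes up with resolution; the best you get is that it stays flat until the GPU takes over as the limit. Any real increase, like the one in the graph, would have to come from the CPU side behaving differently under different loads (queuing, scheduling), which is the guess in the post above.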
 
I think it is also true that different parts of a game will draw more or less on the GPU vs. the CPU, but the average frame rate for the whole game may still be improved by upgrading one or the other.
 
No matter the resolution (1080p, 1440p, 4K), your CPU sends the exact same info; the only difference is the number of pixels, and that is all handled by the GPU. So if you encounter a bottleneck at 1080p from your CPU, the graphics card can't give any more help. At a higher res, however, if the CPU still bottlenecks, your GPU has more legroom and can deliver at least the same frame rates.
 
Thanks everyone, I think I've pretty much got what I came for here. I'll be thinking about moving to something top-end, like maybe an i7 6700.
 
No matter the resolution (1080p, 1440p, 4K), your CPU sends the exact same info; the only difference is the number of pixels, and that is all handled by the GPU. So if you encounter a bottleneck at 1080p from your CPU, the graphics card can't give any more help. At a higher res, however, if the CPU still bottlenecks, your GPU has more legroom and can deliver at least the same frame rates.

Yes, I understand that, but my point is that it's not necessarily the case that the CPU is being maxed out throughout the whole game.
 
This I understand, and at those times that are more GPU-intense than CPU-intense, the system will not be bottlenecked as long as the GPU can handle it.
 
Is it safe to assume I would be using less power under full load if I move to an i7 6700K? Unless I'm overlooking other factors, we have a 91 W TDP vs. a 125 W TDP.
 
Is it safe to assume I would be using less power under full load if I move to an i7 6700K? Unless I'm overlooking other factors, we have a 91 W TDP vs. a 125 W TDP.

Yes, that would be a reasonable assumption. And the difference becomes more pronounced as you overclock; that's why you have to get such heavy-duty and expensive motherboards when you are overclocking the 8-core FX chips. They can draw 250 W or so when significantly overclocked.
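
A back-of-the-envelope way to see why that happens, using the usual dynamic-power rule of thumb (power scales roughly with frequency times voltage squared; the clocks and voltages below are typical guesses, not measurements):

```python
# Rough dynamic-power scaling: P ~ f * V^2. All numbers are assumptions.
stock_w, stock_ghz, stock_v = 125.0, 4.0, 1.30   # FX-8350 at stock (assumed voltage)
oc_ghz, oc_v = 4.8, 1.48                         # a typical heavy overclock (assumed)

oc_w = stock_w * (oc_ghz / stock_ghz) * (oc_v / stock_v) ** 2
print(round(oc_w))   # ~194 W for the CPU alone, before VRM losses
```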
 
Is it safe to assume I would be using less power under full load if I move to an i7 6700K? Unless I'm overlooking other factors, we have a 91 W TDP vs. a 125 W TDP.
Yes, I've done some testing on my setups with a Kill A Watt meter. My FX-8350 at 4.7 with my GTX 780 on board draws about 390 W when running Prime blend. My 6700K @ 4.6 with the same GPU on board running Prime draws around 100 watts less. Note that I was using two different power supplies for this test: my FX-8350 has my Seasonic X-650 Gold unit, and my 6700K was on my benching setup, which uses a 1000 W EVGA Platinum unit. So the efficiency of the PSU has to be taken into account as well. That said, the 6700K will definitely draw less power than the 8350.
 

This Digital Foundry video seems to offer some examples of faster RAM giving more than trivial increases in min FPS performance in some games. I wonder.
 
Problem is, AMD CPUs are not able to handle high frequency RAM like the high end Intel CPUs will. Weak IMC.
 