
Advice for upgrading to FX 8350

So I did some reading on the games you play, Witcher included, and it seems most of the ones you mentioned aren't optimized for more than 4 cores. Crysis 3 is the only one where I saw a significant gain at stock clocks, with the 8350 outperforming the 965 by about 17 FPS on average. You may want to try overclocking your RAM to 1600 with 9-9-9-24 timings and raising the NB frequency to see if it helps at all on the 965.
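As a rough sanity check on those numbers: absolute CAS latency in nanoseconds is the CAS count divided by half the effective data rate, so 1600 at 9-9-9-24 isn't actually "slower" than a typical 1333 CL9 kit, just wider. A minimal sketch (the DDR3-1333 CL9 baseline is an assumption about the current kit, not something confirmed in the thread):

```python
# Absolute CAS latency: latency_ns = CL / (effective MT/s / 2) * 1000
def cas_latency_ns(cl, ddr_rate_mts):
    # DDR transfers twice per I/O clock, so the real clock is half the effective rate.
    return cl / (ddr_rate_mts / 2) * 1000

print(cas_latency_ns(9, 1333))  # assumed current kit, DDR3-1333 CL9 -> ~13.5 ns
print(cas_latency_ns(9, 1600))  # suggested target, DDR3-1600 CL9  -> ~11.25 ns
```

So if the modules can hold 9-9-9-24 at 1600, you get more bandwidth with no absolute latency penalty.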
 
That's kind of where I was heading, along with raising the clock up a little to see the effect.
 
RAM is overclocked to around 1440 MHz; CPU-Z says 705.7 MHz. I do not think I can overclock my bare (no heat spreader) RAM as high as 1600 without damaging it.
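For what it's worth, CPU-Z reports the real memory clock and the effective DDR rate is double that, so the two numbers above line up; a quick check:

```python
# CPU-Z shows the actual memory clock; DDR ("double data rate") transfers twice per clock.
cpuz_memory_clock_mhz = 705.7
print(cpuz_memory_clock_mhz * 2)  # 1411.4 -> roughly the "1440 MHz" figure quoted above
```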
 
[attached screenshot: g7yp.jpg]
 
Let's leave that Kingston RAM where it is, but change the command rate to 2T.

Let's raise the clock 100 MHz.

Let's raise the HT up to match the CPU/NB and give it a try.
 
The only way to raise HT is by raising the FSB, and when I tried raising the FSB higher than it is now the overclock became unstable. If we disregard HT, then you want me to raise the CPU clock to 4.1 GHz and change the RAM from 1T to 2T. If so, what should I do after that?
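For context, on a Phenom II every clock domain is a multiple of the reference clock (the "FSB" above), which is why raising it drags the CPU, CPU/NB, and HT link up together unless the multipliers are dropped to compensate. A rough sketch with illustrative values (the multipliers below are assumptions, not read from this rig):

```python
# Phenom II clock domains are all derived from the reference clock ("FSB").
ref_clock_mhz = 205       # assumed reference clock after the existing overclock
cpu_multiplier = 20       # illustrative; the 965 BE CPU multiplier is unlocked
cpu_nb_multiplier = 10    # illustrative
ht_multiplier = 10        # illustrative

print("CPU:   ", ref_clock_mhz * cpu_multiplier, "MHz")     # 4100 MHz
print("CPU/NB:", ref_clock_mhz * cpu_nb_multiplier, "MHz")  # 2050 MHz
print("HT:    ", ref_clock_mhz * ht_multiplier, "MHz")      # 2050 MHz
```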
 
Do that, play the game, and report what happened. If it's better, we may need to go with the 6300 and invest in a little cooling.
 
What matters more than FPS is whether the game plays smooth and nice. Turn off Fraps or whatever it is and see if you like how the rig plays; it's not about FPS, it's about our gaming pleasure.
 
I did what you told me and tested it, but I didn't notice any real difference in the same area. Maybe I got a 2-3 FPS increase, but I cannot be 100% sure because the FPS fluctuates in that area.

It's an issue for me. I notice the difference while playing when the FPS drops from 60 to 30, which is not smooth gameplay for me, and it's even more irritating when I know my GPU is more than capable of maxing out the mentioned games at a constant 60 FPS.
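Those drops are easier to picture as frame times, since going from 60 to 30 FPS doubles how long each frame sits on screen:

```python
# Frame time in milliseconds is simply 1000 / FPS.
for fps in (60, 30):
    print(fps, "FPS ->", round(1000 / fps, 1), "ms per frame")
# 60 FPS -> 16.7 ms per frame
# 30 FPS -> 33.3 ms per frame
```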
 
To be honest, the oldest component in my current rig is the RAM. Everything else has been replaced and upgraded; only that RAM has been here since I first bought a new rig almost 5 years ago, so I think it also deserves to be retired in the near future.

Regarding the CPU, I will think a little bit about those two CPUs and see which is best for me and my current rig, but obviously they are both better than my current CPU.
 
It's not the RAM.
What's the CPU/NB speed?

Main timings mean little if the subtimings are maxed out.


I suspect that maybe your RSY PSU is not providing the power it needs.

Jonny guru rates it "below mediocre."

However, power supply issues are not always caused by the PSU being unable to push enough power (although that is usually the case). If it is not capable of pulling enough power from the wall, you can still have issues even without tripping a breaker or blowing a fuse.

I do not know where you live, but even in my area, if you live in an old house you can have power issues.

First, try moving your PC to a dedicated breaker. If this resolves the problem, figure out what else is on the circuit you usually use and remove the high-draw devices. I recommend this first because it is the free test.

The next test is getting a Kill A Watt or some other load meter (nothing fancy, just something you can watch the load on) and checking your peak wattage when the frame rate drops. I bet you are pushing over 450 W, which for a good PSU is nothing. But that one... it is only rated for 480 W on the 12 V rail, and that is under ideal manufacturer conditions.
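Keep in mind a Kill A Watt reads AC power at the wall, while the 480 W rating is DC output on the 12 V rail, so you have to discount the meter reading by the PSU's efficiency before comparing. A rough sketch (the ~80% efficiency figure is an assumption for a budget unit, and treating the whole DC load as 12 V is a simplification):

```python
# Convert a wall-socket reading to an approximate DC load and compare to the 12 V rating.
efficiency = 0.80        # assumed efficiency for a budget PSU, not a measured value
rail_12v_rating = 480    # rated 12 V capacity quoted above

for wall_watts in (450, 550):            # example Kill A Watt readings during a dip
    dc_load = wall_watts * efficiency    # rough DC output, most of it on the 12 V rail
    print(wall_watts, "W at the wall -> about", round(dc_load), "W DC,",
          round(rail_12v_rating - dc_load), "W of rated 12 V headroom left")
```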

You obviously have some type of monitoring software running. That adds overhead, but add CPU monitoring to it as well. Does CPU usage hit 100% when the frame rate drops? That, in conjunction with the drop in frame rate, is indicative of CPU bottlenecking. (I assume it is always the same spot in the game?)
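If the overlay you already run cannot show per-core load, a minimal logging sketch using Python's psutil (assuming it is installed) can sit in the background while you play; watch whether any single core pins at 100% at the moments the frame rate drops:

```python
# Log per-core CPU usage once a second; a single core stuck near 100% while frames
# drop is the classic sign of a CPU bottleneck.
import time
import psutil

try:
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        print(time.strftime("%H:%M:%S"), " ".join(f"{p:5.1f}%" for p in per_core))
except KeyboardInterrupt:
    pass  # stop logging with Ctrl+C
```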

I agree that 1600 MHz RAM is optimal for the 965 BE (higher if you go Bulldozer), but the difference is still minimal, nothing you would actually notice in gameplay, only in monitored frame rates.

Lastly, the Bulldozer cores are poorer performers per core than the Phenom IIs and Thubans. If anything, I would say get a used Thuban (Phenom II X6 CPU) or a Zosma, but I personally would not use a Hyper 212 on anything more than an Athlon II dual core. Oh wait, I am :)

The problem is not their cooling ability but their low maximum heat load; once you hit it, the delta increases way too fast.


However, all this is moot if you are decrypting BD-ROMs or running HD video on a second monitor while you are gaming. In that case, yes: MOAR CORES!!!

I have been playing through MW3 on my AIW HD (3650) and an Athlon II 270 (stock, because of a cheap motherboard that can't OC past 205 HTT), and I am HDD limited. How do I know? Well, my VRAM is 90% full and I only get stuttering at loading screens :)

EDIT: Fixed spelling, but one more thing. I mentioned the second monitor and watching videos because that is what I do while playing CPU-demanding games like Civ5 or Total War. I currently cannot, due to my "below mediocre" setup lol
 
+1 Neuromancer. FX means a better 750 watt power supply and a big overclock.
All you will ever gain from memory is 1-5%.
 
I went from an FX-4100 to an FX-8350 with an HD 7950.

In all the big games I had to drop graphics settings; my four cores were always at 80-100% usage in BF3, Crysis 3, and Far Cry 3,

while my graphics card was around 60-80% usage.

After going to the FX-8350, the GPU is now at 99% and the CPU around 70%. I can play all games on max settings at about 60-70+ FPS (BF4 struggles a bit, tbh), but the majority of games did see a 25-50% increase.

The problem is that I was literally forced into watercooling it; luckily I had a decent PSU/mobo/RAM.

To overclock this and get it any good at games, well, like the others said, you'd need to spend the cash.

Like the others said, an equivalent Intel build would be better; all the benchmarks and experienced users show this.


(I also CrossFired it, and took the second card out as it's absolutely pointless; I didn't see any massive improvements to outweigh the costs.)
 
Freakin' spot-on post. No BS. Straight speak.

That heat thing you mention is exactly why I suggested the FX-6300, because you are not lying about HOT... using 8 cores.

RGone...ster.
 