
FRONTPAGE AMD Ryzen 9 3900X and Ryzen 7 3700X CPU Review

Stability has been fine on modern Intel, for me, up to 90C. It is after that that things get wonky (in my testing and for my uses). 80C leaves a bunch of meat on the bone... another 100 MHz or so, but if it's erroring out for your uses, it is. But you are deep down the rabbit hole with how you use your machine... it seemingly needs more stability for your number-crunching versus how many people here use a PC for general use, encoding, rendering, and gaming.

If running stock, I can run all the way up to thermal throttling. It is only when overclocked that temperature seems to play more of a role. I think it is the start of thermal runaway: hotter means less stable, less stable means more volts needed, more volts needed means hotter. Not a good feedback cycle.
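Purely to illustrate that feedback cycle (every coefficient below is made up for the example, not real CPU data): if each degree above your stability ceiling demands a little more vcore, and each extra volt adds a few degrees, the loop either settles or runs away depending on how strong the feedback is.

```python
# Toy illustration of the "hotter -> less stable -> more volts -> hotter" loop.
# All numbers here are invented for the example; they are not real CPU data.

def settle(volts_per_degree, temp_per_volt, base_volts=1.25, base_temp=35.0, steps=8):
    """Iterate the feedback loop and print where the required vcore heads."""
    volts = base_volts
    for step in range(steps):
        temp = base_temp + temp_per_volt * volts                        # more volts -> hotter
        volts = base_volts + volts_per_degree * max(0.0, temp - 70.0)   # hotter -> more volts needed
        print(f"  step {step}: {temp:6.1f} C, need {volts:.3f} V")

print("Mild feedback (loop gain < 1) settles:")
settle(volts_per_degree=0.002, temp_per_volt=40.0)

print("Strong feedback (loop gain > 1) runs away:")
settle(volts_per_degree=0.030, temp_per_volt=40.0)
```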

Is it too much to ask for Prime95 small FFT stability at the highest AVX level? :D My 7800X went above 100C without throttling running AVX-512 code at stock, with a big Noctua on it. That's some serious heating...

The 8700K & 9900K allowed you to turn off 1, 2, 3 or more cores & HT to get higher clocks/lower temps.
The i9-9820X (HEDT) with an EVGA Dark MB allows you to turn off individual cores (turn off the hottest ones) to get the highest OC.
Does AMD currently give you this option with their desktop CPUs, or is it limited to their HEDT line?

In Ryzen Master that seems to be a thing. It also indicates which cores AMD thinks are the fastest and second fastest of each CCX. I haven't tried playing with that function yet.
 
Posted this in the PBO thread, but likely better to discuss here. :)

Silicon Lottery results...

3900X max = 4.2 GHz (@ 1.25v)
3800X max = 4.3 GHz (@ 1.3v)
3700X max = 4.15 GHz (@ 1.262v)

I think they went a bit low on the voltage, but... pretty telling...

I wonder how SL is binning/testing these chips? I'm assuming they are running fixed vcore/multi, LLC enabled, and with PBO disabled, using various stress tests?

Silicon Lottery 3900X: Bottom bin... [email protected] < 40.5x @1.212v < 41x @1.225v < 41.5x @1.237v < Top bin... 42x @1.250v.



Has anyone seen or heard from AMD what their maximum recommended voltage and safe max temps are for these CPUs?...

TLDR: What's a safe target 24/7 Core voltage and temps?

I've been wondering that myself... When I first booted up my 3900X... I saw BIOS vcore at ~1.5v, which I thought might be a BIOS issue, or that maybe I had a high-VID "dud" chip.

Auto PBO shows as high as ~1.495v via my board's voltage read points while idling in Windows, and it is constantly bouncing around, going as low as ~1.1(+)v.

With PBO and AIDA stress, the 12 cores bounce around between ~4125 MHz and ~4200 MHz, averaging ~4175 MHz with vcore at ~1.375v.

I can crunch 24 threads of Rosetta@home at 4.2 GHz (all core) with a fixed ~1.25v vcore and might possibly do it with even less voltage.


***EDIT***

I did some quick testing with Silicon Lottery bin settings...

Okay stability on the top bin (3900X) using some other stress loads... But Prime95 v29.8b5 small FFT with AVX enabled would crash out/black screen.

Dropped down to the 41.5x @1.237v bin; Prime95 v29.8b5 non-AVX seemed okay... partial AVX enabled maybe?... But small FFT with full AVX enabled crashed the system.

Dropped down again to 41x @1.225v, and Prime95 v29.8b5 small FFT with full AVX enabled seems to run okay. Full-load vcore was reading ~1.195v via the voltage read points. I might be able to tweak LLC settings and get the next higher bin stable?

That small FFT test with full AVX enabled is a hard stress tester!
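If anyone wanted to script that kind of bin walk-down, here is a rough sketch of the procedure. The apply_bin and run_small_fft_avx helpers are hypothetical placeholders; in practice you set the multiplier/vcore by hand in BIOS or Ryzen Master and run Prime95 small FFT yourself.

```python
# Sketch of the step-down procedure described above: start at the top
# Silicon Lottery bin and walk down the ladder until the stress test passes.

bins = [            # (multiplier, vcore) ladder from the SL listing, top bin first
    (42.0, 1.250),
    (41.5, 1.237),
    (41.0, 1.225),
    (40.5, 1.212),
]

def apply_bin(multiplier, vcore):
    # Placeholder: in reality this is a manual BIOS/Ryzen Master change + reboot.
    print(f"Set {multiplier}x @ {vcore}V, reboot, load Windows...")

def run_small_fft_avx(minutes=30):
    # Placeholder: would launch Prime95 small FFT (AVX enabled) and watch for
    # worker errors, WHEA events, or a crash. Returns True if it survives.
    return False  # pretend the upper bins fail so the loop keeps stepping down

for multiplier, vcore in bins:
    apply_bin(multiplier, vcore)
    if run_small_fft_avx():
        print(f"Stable bin found: {multiplier}x @ {vcore}V")
        break
else:
    print("Nothing on the ladder passed; back to stock/PBO.")
```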

***EDIT #2***

Went back and tested 41.5x @1.2375v in BIOS with the middle LLC setting enabled (3/6) and was able to run Prime95 small FFT with full AVX:

[Attached screenshot: Ryzen 9 3900X @ 4.15 GHz, 3200C16, 1.2375v BIOS, middle LLC setting (3/6), Prime95 small FFT full AVX enabled]
 
The 8700K & 9900K allowed you to turn off 1, 2, 3 or more cores & HT to get higher clocks/lower temps.
The i9-9820X (HEDT) with an EVGA Dark MB allows you to turn off individual cores (turn off the hottest ones) to get the highest OC.
Does AMD currently give you this option with their desktop CPUs, or is it limited to their HEDT line?

Thank You

I think Ryzen Master allows this. I haven't messed around with it, but I think you can disable a core or an entire CCX.
 
Yes. I see settings for cores to enable/disable and SMT. I assume it was like this in previous AMD generations as well?







Someone was mentioning they would like to see testing about SMT and performance, etc:

I would be one that would like to see performance and OC results with SMT off.
Since you and I had the discussion that 24 threads are not needed, and I do agree with that conclusion, I take a lot of interest in running with SMT off for a couple of different reasons.
One reason was seeing a 2700X, for example, able to run max boost clocks of 4350 MHz with SMT off, while with it enabled the CPU was difficult to stabilize at anything over 4.1-4.2 GHz.

I generally run with SMT off, ultimately getting better temps and a higher OC.

 
The 8700K & 9900K allowed you to turn off 1, 2, 3 or more cores & HT to get higher clocks/lower temps.
The i9-9820X (HEDT) with an EVGA Dark MB allows you to turn off individual cores (turn off the hottest ones) to get the highest OC.
Does AMD currently give you this option with their desktop CPUs, or is it limited to their HEDT line?

Thank You

I have used Ryzen Master to disable individual cores and an entire CCX, going from 8c/16t to 4c/8t.
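If you want to double-check what the OS actually ends up seeing after toggling cores or SMT, a quick sketch using the psutil package (assuming it's installed: pip install psutil) works on Windows or Linux:

```python
# Quick sanity check of what the OS sees after toggling cores/SMT.
# Requires the third-party psutil package: pip install psutil

import psutil

physical = psutil.cpu_count(logical=False)  # real cores
logical = psutil.cpu_count(logical=True)    # hardware threads

print(f"Physical cores: {physical}")
print(f"Logical processors: {logical}")
print("SMT appears to be", "ON" if physical and logical and logical > physical else "OFF")
```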
 
Sweet! Finally a new BIOS for my board (ASUS ROG Strix B450i Gaming). Hopefully I'll finally be able to apply an OC to my CPU and RAM.

So just an update in case anyone has an ASUS ROG Strix B450i Gaming, B450-F Gaming, or X470i Gaming: the 2501 BIOS is a good one. It corrects the USB driver failure during boot that plagued the 2604 version and the lack of overclocking/memory control in the earlier 2301 BIOS.

I'm now trying to find my max stable OC, though I'm not expecting much over PBO speeds.
 
I was down for a day. Video output on the Aorus 1080 Ti went out. Gonna see if it's just a port and if anything else works, or if I can fix it at work. If not, I'm already back up and running with an MSI Gaming X Trio 2080. Just gotta figure out what BIOS to use on that. Ran a 3DMark with the 3900X and that CPU score is very impressive. AMD did a nice job with Ryzen 3XXX. Seen more i9 price drops now. This is great for us, and things can only get better when they drop their higher-end Navi cards. I like the direction they are going and can't wait to see the Ryzen refresh or next-gen Ryzen.
 
Any thoughts on whether the 3600X will bottleneck a 2080 Ti at 4K? This is an area I haven't researched much, and I'm not familiar with how to determine these kinds of constraints.
 
Any thoughts on whether the 3600X will bottleneck a 2080 Ti at 4K? This is an area I haven't researched much, and I'm not familiar with how to determine these kinds of constraints.

Nah, should be good to go with that 3600X. The 3600X is not that far off from an 8700K.
 
Common thought on this is that at 4K the CPU makes less of a difference right? Don't the benchmarks bear this out?
 
If you have a 4K60 display, then I feel it's safe to say there won't be a problem. If you chase high fps, it may vary depending on the title/settings, but chances are the GPU will still be the limiting factor.
 
Hmm, yeah, I've started to pay more attention to CPU performance at higher resolutions. I can see how the CPU doesn't bottleneck the GPU, but it does look like it will become an issue with RT-enabled games. I'm thinking of just starting off with a 3600X and working upwards if need be.
 
does look like it will become an issue with RT enabled games.
What makes you say this?

In the current state, only Nvidia GPUs are capable of RT in games; if you have an AMD card it isn't an option, and it doesn't fall back to the CPU. I don't recall hearing anything about raised CPU use when enabling it either. IIRC, the new consoles will have hardware RT in their new GPUs (?). AMD's next gen is rumored to be going that way as well.
 
ED, preliminary data shows that BF5 has increased per-core utilization when RT is enabled. Comprehensive tests with RT are scarce to find, so it may require personal testing to validate these answers.

My next build is targeted at Cyberpunk 2077, which is (hopefully) coming out in April. There won't be any better cards out for quite some time, or at least rumor has it. So I'm looking to pair a well-rounded CPU with a 2080 Ti and try to anticipate the workload utilization between the CPU and GPU. Difficult, stupid, and above all moronic, but what's a computer enthusiast to do for 6+ months?
 
Seeing as everyone else seems to be suggesting a 3600X is enough, I figured I would play devil's advocate and try to make the case for why you should go for the 3700X. My case consists of four points:

  1. Being CPU limited is not really a property of a game overall, but rather of individual frames. You can have some frames being CPU limited while others are GPU limited. You can see this in some 4K benchmarks between CPUs where you would think the game is completely GPU limited, but you still see some small performance scaling with better CPUs. This is because some percentage of the frames in the benchmark are still CPU limited, and that percentage then gets turned into a small fps difference when you average. The question is whether it is a case of the occasional single dropped frame here and there, in which case it might not be noticeable, or a cluster of dropped frames in some specific section of the game/benchmark, in which case it will be very noticeable. The typical ways of summarizing benchmarks with avg and 1% min don't really tell you which case it is (see the sketch after this list).
  2. Benchmarks never really look at multiplayer gaming because it's not repeatable, so you can't get comparable data. The problem with this is that multiplayer means adding more CPU load from having to deal with network code. This means multiplayer is always more CPU limited than benchmarks. How much this matters depends, of course, on what sorts of games you play. The more players and the more actions that need to be communicated, the more CPU load there will be. I myself ran into this same issue five years ago, which forced me to upgrade from my trusty 2500K because it just couldn't keep up. I was playing Planetside 2 at the time, an MMOFPS, where you could occasionally get these massive battles with hundreds of players fighting over a single base. In smaller battles I would be GPU limited at > 60 fps, but when these large battles happened the 2500K would choke and I would drop down to ~25 fps. An online shooter at 25 fps means having a bad time. The CPU upgrade meant I stayed solidly above 60 fps in even the biggest battles.
  3. Games usually offer a plethora of graphics options you can tweak if your GPU isn't hitting the kind of fps you are looking for. For the CPU it's much worse. Many games don't offer anything that lets you improve performance if you are CPU limited, and the only thing you can do is a hardware upgrade.
  4. The previous points were all general points about not underestimating the importance of CPU perf for gaming. This point is about why I would recommend the 3700X over the 3600X specifically. The next gen consoles from both Sony and Microsoft (with release dates in 2020) are both built around 8 core Zen 2 SoCs. This means most upcoming games from next year forward will start optimizing around an 8 core / 16 thread Zen 2 architecture. So an 8 core Zen 2 CPU will probably be a real sweet spot for gaming for many years to come.
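A minimal sketch of what point 1 is getting at, with made-up frametimes: both runs below have exactly the same average fps and the same 1% low, yet one only drops the odd frame here and there while the other hitches for a solid chunk of the run.

```python
# Two runs with identical avg fps and identical 1% low, but a very different feel:
# one spreads its slow frames out, the other clusters them into a visible stutter.
# All frametime numbers are invented for the example.

def one_percent_low(frametimes_ms):
    """fps implied by the slowest 1% of frames (a common benchmark metric)."""
    worst = sorted(frametimes_ms, reverse=True)
    cutoff = max(1, len(worst) // 100)
    return 1000.0 / (sum(worst[:cutoff]) / cutoff)

def avg_fps(frametimes_ms):
    return 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

n, fast, slow = 1000, 10.0, 40.0   # 10 ms GPU-limited frames, 40 ms CPU-limited frames
slow_count = 10

scattered = [slow if i % (n // slow_count) == 0 else fast for i in range(n)]
clustered = [slow] * slow_count + [fast] * (n - slow_count)

for name, run in [("scattered", scattered), ("clustered", clustered)]:
    print(f"{name}: avg {avg_fps(run):.1f} fps, 1% low {one_percent_low(run):.1f} fps")
```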
 
Does any game utilize 6 or more CPU cores? If it doesn’t utilize all 8 or 12 cores, why step up?
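Easy enough to check on your own machine: leave something like this (assuming the psutil package is installed: pip install psutil) logging in the background while you play and see how many cores actually get loaded.

```python
# Rough way to see how many cores a game actually loads: run this in the
# background while playing and watch the per-core numbers.
# Requires the third-party psutil package: pip install psutil

import psutil

for _ in range(30):                                    # sample for ~30 seconds
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    busy = sum(1 for load in per_core if load > 50)    # arbitrary "busy" threshold
    print(f"{busy:2d} cores over 50% | " + " ".join(f"{load:4.0f}" for load in per_core))
```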

For Cyberpunk 2077 interested folks:
The specs of the computer they played the E3 2019 demo on:
CPU: Intel i7-8700K @ 3.70 GHz
Motherboard: ASUS ROG STRIX Z370-I GAMING
RAM: G.Skill Ripjaws V, 2x16GB, 3000MHz, CL15
GPU: Titan RTX
SSD: Samsung 960 Pro 512 GB M.2 PCIe
PSU: Corsair SF600 600W
 