
Bottleneck interpretation


67Elco
Member · Joined Jan 31, 2024 · Gulf Breeze, FL
I just swapped out an 8700G CPU in a rig that may be sold. I replaced the CPU with a 7700X to up the performance a tad. The original GPU was an RX 5700 XT, but it would struggle in some games. I replaced it with an RX 6800 XT and it performs light years better. My question is about interpreting the data shown in the COD 6 benchmark. It was run on the Extreme preset...can anyone tell from the screenshot if there are any evident problems? I'm probably selling this rig and don't want any problems for the customer. Thanks for any help in advance.

COD 6 Extreme discussion.jpg
 
Hard to compare when we don't see the 8700G.

But it looks like it's saying that with a faster processor you could gain something at that res. How it measures that, no clue. But I'd agree... I'm sure there's a bit left on the table considering there are faster CPUs out there. I'd guess that 93% value goes up when you raise the res. ;)

Short answer...it is what it is....don't overcomplicate things. Sell it. 😀
 
No, what I am trying to say is I don't know what those "bottleneck" or other figures mean. Or are they meaningless without something to compare to? The rig now benchmarks and games like a beast with the new components compared to the 8700G/5700 XT that were previously in place. I have never spent much time messing with GPUs, overclocking and such, and likely never will. I have simply always bought a card for gaming that I thought was reasonably assured to be powerful enough for the current games. In other words, I am clueless about the vernacular lol.

I'm guessing all is well regardless. I just repeated the benchmark on my dedicated gamer (7800X3D/7900 XT) at both 1080p and 1440p resolutions, and in general it acts the same. My conclusion is the 7700X/6800 XT combo is no slug for gaming, to the best of my knowledge.

COD 6 7800X3D-7900XT Extreme 1920x1080.jpg COD 6 1080.jpg
 
The benchmark appears to be showing you how much of the time either the CPU or the GPU is the limiting factor. Other benchmarks call it "CPU limited" or "GPU limited". If you increase the performance of whichever is the bigger limit, you should see gains. As you show, the balance is affected by settings as well as hardware. At lower resolutions, where you can attain higher fps, the CPU plays a bigger part. At higher resolutions/settings the GPU does more of the work, so the CPU matters less.
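If it helps to picture where a figure like that could come from (just my guess at the general idea, not how the COD 6 benchmark actually calculates it), here's a rough sketch with made-up per-frame timings:

[CODE=python]
# Rough sketch of the idea behind a "% bottlenecked" figure.
# Assumes the benchmark logs how long the CPU and GPU each spent on every
# frame; whichever took longer is what limited that frame. Numbers invented.

cpu_ms = [6.1, 5.8, 7.2, 6.5, 9.0, 6.3]   # per-frame CPU time, milliseconds
gpu_ms = [5.0, 6.4, 5.1, 7.8, 5.2, 6.9]   # per-frame GPU time, milliseconds

cpu_limited = sum(c > g for c, g in zip(cpu_ms, gpu_ms))
gpu_limited = len(cpu_ms) - cpu_limited

print(f"CPU-limited frames: {100 * cpu_limited / len(cpu_ms):.0f}%")  # 50%
print(f"GPU-limited frames: {100 * gpu_limited / len(gpu_ms):.0f}%")  # 50%
[/CODE]

Whatever the real tool actually measures, the takeaway is the same: the % just tells you which side was holding up frames more often at those settings.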
 
All APUs are significantly slower due to their smaller cache and other limitations. If they are roughly 10-20% slower in CPU benchmarks than the regular Ryzen 7000 chips, then expect that to show up in games too, especially at lower display resolutions.
The 8000 APUs are a mistake, and no matter how I try, I find them pointless for anything but memory overclocking (pure clocks, since the performance is low). I have the 8700G, and it's worse in games and 3D benchmarks than the 7600 (not even the X).

In the screenshots, a lower % means that component causes less of the bottleneck. So at 1080p the FPS is bottlenecked more by the CPU, and at 1440p more by the GPU. This is expected, as mackerel already mentioned.
 
That's what I get by looking at the images through a phone, lol. I didn't see there was one for the CPU and GPU!

Yeah, it's simply showing where the bottleneck lies for those settings. 1080p is notoriously CPU-bound, so you'd typically see some sort of CPU limit there, especially with faster cards. As you increase resolution, the 'bottleneck' moves to the GPU (which is where you want it to be, ideally). If you dropped in an X3D chip from the 7/9 series or even a plain 9-series CPU, the CPU bottleneck would likely shrink at 1080p since the CPU could keep the GPU fed with more frames. ;)
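Just to put some toy numbers on that (purely invented for illustration): your FPS is capped by whichever side takes longer per frame, so the picture looks roughly like this:

[CODE=python]
# Toy frame-time arithmetic, purely illustrative numbers.
# FPS is capped by whichever component takes longer on a frame.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Effective FPS given per-frame CPU and GPU times in milliseconds."""
    return 1000 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=7.0, gpu_ms=5.0))  # 1080p-ish: ~143 fps, CPU is the limit
print(fps(cpu_ms=7.0, gpu_ms=9.0))  # 1440p-ish: ~111 fps, GPU is the limit
print(fps(cpu_ms=5.5, gpu_ms=5.0))  # faster CPU at 1080p: ~182 fps, CPU limit shrinks
[/CODE]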
 
No testing on my part...just seat of the pants and what I actually see during gaming. I've come to the conclusion I honestly can't tell any difference in gaming experience between an X3D CPU and a normal X version when using the same or near-equal GPU...I just can't. From my latest experiences I have to say I would be just as happy with a 7700X or 9900X vs the 7800X3D currently in my game rig. If I did not see any numbers flashing in the corner of the screen from RivaTuner, I would not know which CPU I was using at the moment. I'm thinking why not just adopt a more versatile CPU such as the 9900X for less money. If trying to stretch every dollar, and strictly for gaming, the 7700X would be difficult to beat.
 
I'm thinking why not just adopt a more versatile CPU such as the 9900X for less money.
That's what I'm doing. I couldn't bring myself to use a 9800X3D because of core/thread count (from a 13900K), though I'm sure with my uses it would be plenty fine.

I'm also jumping up to 4K/240 and the CPU matters even less. 9900X is on my list, too... Hopefully this all goes down in April. But there are those who are looking for every FPS possible, regardless of whether they can see it on screen.... or if its primary use is gaming, why not get the CPU that does the best for what they are doing?

EDIT: Some benchmarking for games at TPU... 9950X is ~9% behind at 1080p...4% at 4k
 
That's what I'm doing. I couldn't bring myself to use a 9800X3D because of core/thread count (from a 13900K), though I'm sure with my uses it would be plenty fine.

I'm also jumping up to 4K/240 and the CPU matters even less. 9900X is on my list, too... Hopefully this all goes down in April. But there are those who are looking for every FPS possible, regardless of whether they can see it on screen.... or if its primary use is gaming, why not get the CPU that does the best for what they are doing?

I might could agree if we were discussing competitive gamers, but for the remaining 99.9% of us it just does not matter. Like I said, my eyes just can't see the difference. Then again my eyes are 72 years old, so there is that lol. :LOL: Unlike GPUs, CPU prices tend to fall over time. Betting a 9900X will be an even bigger bargain in the future.
 
I might could agree if we were discussing competitive gamers, but for the remaining 99.9% of us it just does not matter. Like I said, my eyes just can't see the difference. Then again my eyes are 72 years old, so there is that lol. :LOL:
You're not wrong. But I'd say it matters to more people than you think or these 'gaming' CPUs probably wouldn't exist and be the breadwinner for AMD desktop CPUs..........regardless if they can 'see' the difference. But hell, 10% is 10%... and why not get the gaming CPU for a gaming computer if it's in the budget?
 
You're not wrong. But I'd say it matters to more people than you think or these 'gaming' CPUs probably wouldn't exist and be the breadwinner for AMD desktop CPUs..........regardless if they can 'see' the difference. But hell, 10% is 10%... and why not get the gaming CPU for a gaming computer if it's in the budget?

Yeah, I agree it's a marketing ploy to increase profits and I do agree that many just can't stand not having the ultimate at any given moment in time. I've been guilty of being swayed by these things as well...guess aging allows practical decision making to come to the forefront lol.
 
Well, I wouldn't go that far (marketing ploy), lol. That's not what I was getting at.... more so trying to temper the idea that only 1 in 1,000 people need these or would notice the difference. ;)

There are plenty of good reasons to get an X3D over a non-X3D, even for a non-competitive gamer. In the link I sent, 10% isn't insignificant and is roughly a tier of GPU. It could be the difference between Medium and Ultra settings, or between reaching your FPS goal or not. Look here at 1080p with a 9800X3D vs a 7700X... ~25 FPS difference...20 FPS difference over a 97/9800X... that's significant, especially...
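To put that kind of percentage in frame terms (baseline number made up, just to show the arithmetic):

[CODE=python]
# Converting a % uplift into raw FPS, with a made-up 200 fps baseline.
baseline_fps = 200
uplift = 0.10                  # roughly the CPU-side gain in the linked charts
print(baseline_fps * uplift)   # 20.0 extra fps from the same GPU
[/CODE]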
 
That is where we differ...I can't see on screen in game a 20 fps increase as it pertains to quality. ;) Numbers don't lie, but if the eye can't tell the difference then it becomes pointless. 200 fps vs 225 fps...phttt.
 
That is where we differ...I can't see on screen in game a 20 fps increase as it pertains to quality. ;) Numbers don't lie, but if the eye can't tell the difference then it becomes pointless. 200 fps vs 225 fps...phttt.
Maybe you can't. But there's more to it than that. If those things (some of which I said in the previous post) aren't worth it, then they aren't. But to many, MANY more than 1 in 1000, it's worth it and far from a marketing ploy. ;)
 
Question - Can you tell/see a difference of 20 fps in either direction on screen/in game without an fps counter showing? I'm talking about a game where one would normally see/expect fps in the 150-200 range.
 
So, I limit my FPS to 165 (I have a 165 Hz monitor). I can notice when it drops below that, yes. I'd say 20 is around the threshold in most FPS games for me (COD/Fortnite/PUBG/BF). But again, I'm not hanging on that metric like it's The Gospel. An additional 10% also lets you do the more tangible/noticeable things I've mentioned if you don't need the extra FPS but want better eye candy.
 
Just for giggles I installed the RX 6800 XT in the daily with the 8700G and ran the benchmark at both resolutions. In general the 7700X was a great improvement, but the 8700G proved to be more than capable. Just as before, on the Extreme preset. There's that damn white space again lol.

COD 6 1080.jpg COD 6 1440.jpg
 
The whitespace....it's........blinding!! :rofl:

More than 0.01% want more than 'capable' is my counterpoint. That's all. :)
 
Just ran it on the 780M integrated graphics. Not really playable lol. Finally figured out how to close up that white space. :LOL:

cpu graphics.jpg
 