
8800GTX + CPU speed bottleneck


InThrees

Member
Joined
Feb 14, 2003
Location
Southeast US
Forgive me, but didn't the tests indicate that the cpu IS a bottleneck for that card?

I mean, put another way, between a core2 duo and an 8800gtx, the c2d is the weakest link in the chain. If the cpu wasn't a bottleneck, you would expect that at some point, overclocking the cpu wouldn't change the 3dmark results - but the machine used in the test couldn't clock the c2d high enough to find that point. Every increase in cpu horsepower netted an increase in 3dmark scores.

And of course I realize that a c2d, even stock, plus an 8800gtx is an almost ludicrously powerful combo, but my interpretation of the testing is that the 8800gtx is racing way ahead of the cpu, with 'more to give'.

It will be interesting to see these same tests repeated with a DX10 3dmark package. I wonder if the results will be the same, with the same scale.
 
Thanks for your feedback. But here is a little more in-depth answer to your questions. This really proves that a C2D is not bottlenecking an 8800GTX at high-res gaming: http://www.ocforums.com/showthread.php?t=487151 Let me clarify that at low-resolution gaming, the CPU bottlenecks the GPU a very minor amount. But at high-resolution gaming, as you will read in my second review, the bottleneck is completely gone. I have tested my E6700 using FEAR at 2.66GHz, 3.3GHz, and 4.2GHz, all producing the exact same FPS.
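The "same FPS at three CPU clocks" result fits a simple slowest-stage model of a pipelined CPU/GPU pair. A minimal Python sketch, where all per-frame times are made-up illustrative numbers (not measurements from the review):

```python
# Toy "slowest stage wins" model of CPU/GPU bottlenecking.
# Per-frame times below are hypothetical, chosen only to illustrate the idea.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    # Each frame is prepared by the CPU and rendered by the GPU; with the
    # two stages pipelined, throughput is set by the slower stage.
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# High-resolution case: the GPU takes far longer per frame than the CPU.
gpu_ms = 20.0                            # hypothetical GPU render time at high res
for cpu_ghz in (2.66, 3.3, 4.2):
    cpu_ms = 10.0 * (2.66 / cpu_ghz)     # CPU work per frame shrinks with clock
    print(f"{cpu_ghz} GHz -> {fps(cpu_ms, gpu_ms):.0f} FPS")
# Prints 50 FPS for all three clocks: the GPU stage dominates,
# which matches the E6700/FEAR result described above.
```

The same model predicts that at low resolution (small GPU time per frame) the CPU stage can become the larger term, which is why a minor CPU bottleneck shows up there but not at high resolution.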

Regards,
Dom
 
The relatively low amount of bandwidth available for PCIe slots is the bottleneck, not the CPU. 3DMark adds the CPU score at the end, and that is why a faster CPU gets a better 3DMark score.
 
Charr said:
The relatively low amount of bandwidth available for PCIe slots is the bottleneck, not the CPU. 3DMark adds the CPU score at the end, and that is why a faster CPU gets a better 3DMark score.

3DMark does not add the CPU score at the end; I disabled it for these runs. Take a look at the 3DMark05 screenshots: the CPU score is not computed into the calculation. And you seem to be overlooking the scores that really matter in this test, the FPS and the SM2.0 and SM3.0 tests in 3DMark06. They show a relatively small increase in score from 2.66GHz all the way to 4.2GHz, meaning only a small portion of the framerate gain comes from the CPU frequency increase. But if you do add the CPU score into your 3DMark06 total, obviously the final score will increase substantially.

So, you're saying an Athlon 64 at 1.8GHz with an 8800GTX will perform the same as a Core 2 Duo at 2.66GHz because of a PCI-Express bandwidth issue? That is just not correct, and it has been disproven countless times already by comparing other OCForums members with 8800s and different CPU/rig setups. I encourage you to do a tad more research in the Nvidia section here and check out other setups similar to my own: http://www.ocforums.com/forumdisplay.php?f=86

If PCI-Express were saturated with bandwidth, you wouldn't see any frame increases even from overclocking the GPU; there would be absolutely no headroom for improvement. We are not saturating PCIe x16 with this card. As you can see in my FEAR benchmarks, overclocking the card yields rather substantial frame increases. That means there is no way the PCIe bus is saturated, because if it were, you would be completely limited when overclocking the GPU (like hitting a brick wall). Read this: http://www.ocforums.com/showthread.php?t=487151
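This argument can be made concrete with a toy "slowest stage wins" model that adds a PCIe transfer stage alongside CPU and GPU work. All per-frame times here are hypothetical numbers chosen only to illustrate the logic, not measurements:

```python
# Toy three-stage model: CPU prep, PCIe transfer, GPU render per frame.
# Times are hypothetical; the point is the shape of the behavior.

def fps(cpu_ms, bus_ms, gpu_ms):
    # Pipelined stages: throughput is governed by the slowest one.
    return 1000.0 / max(cpu_ms, bus_ms, gpu_ms)

# If the PCIe bus were the limiter, a GPU overclock would change nothing:
saturated = [fps(8.0, 25.0, g) for g in (20.0, 16.0)]    # bus_ms dominates
print(saturated)    # identical FPS despite the faster GPU (the "brick wall")

# But the FEAR runs gained frames from a GPU overclock, which fits this case:
unsaturated = [fps(8.0, 5.0, g) for g in (20.0, 16.0)]   # gpu_ms dominates
print(unsaturated)  # FPS rises with the GPU overclock -> bus not saturated
```

The observed gains from overclocking the card are only possible in the second regime, which is the headroom argument above in miniature.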
 
My mistake, sorry.

I read up on all of your benching Dom, and all I have to say is that these cards are before their time.
 
Charr said:
My mistake, sorry.

I read up on all of your benching Dom, and all I have to say is that these cards are before their time.

No problem man. I appreciate the feedback. And you are absolutely right! We will need DirectX 10 to really show what these puppies can or can't do. Maybe the newer GPUs from ATI will outshine them under DX10, because this 8800 line was released prematurely? There is a lot of speculation about the DX10 architecture.

Again, I appreciate you taking the time to respond in this thread.

Dom
 