
Analyze the relationship between shaders, raw clock frequency and the CPU bottleneck.


Super Nade

† SU(3) Moderator  †

Folks,

I happened to discuss 3DM scores with somebody over at DFI street and I came across this peculiar behaviour.


3DM05


[Image: compare8jv.jpg (3DM05 score comparison)]


No ambiguity here. I score a bit more in all the GPU tests. Neither of us ran the CPU tests, so there is no mystery: that is why my score is higher.
It could be higher despite my being 8 pipes short because a) my clocks are way higher than his, and b) my CPU does not bottleneck the SLI system as much as his does.



3DM06


[Image: compare8hz.jpg (3DM06 score comparison)]


He beat me in three tests out of four, yet my score is higher than his because of the CPU. I have an Opteron 165 @ 2700 MHz with 2 MB of on-die cache. Since 3DM06 can (?) take advantage of multi-threaded and multi-core CPUs, I score double what he does. Since there is no way to disable the CPU tests (the two GPU tests follow the CPU tests), I cannot say what would happen to the scores without them. In this case, I'd say those 8 extra pipes he has are proving themselves.

Now, don't you see the discrepancy between 05 and 06? An interesting question would be to see how the pipe/MHz ratio has changed between 3DM05 and 06. In other words, "how many MHz is equivalent to unlocking one pipe?" Clearly, since our calculations are not based on a fixed standard but on a dynamic quantity, i.e. 3DM scores, which depend heavily on the particular test algorithm used, there is no clear-cut answer.
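
As a crude first pass (my own rule of thumb, not anything Futuremark documents), you could treat theoretical pixel fillrate as pipes times core clock and trade pipes for MHz that way. The clock number in this sketch is a placeholder, not either card's actual setting:

# Back-of-the-envelope only: theoretical pixel fillrate ~ pipes * core clock.
def equivalent_clock(pipes_a: int, clock_a: float, pipes_b: int) -> float:
    """Clock a pipes_b-pipe card needs to match pipes_a @ clock_a in raw fillrate."""
    return clock_a * pipes_a / pipes_b

pipes_gs, pipes_gt = 12, 16   # 6800GS vs 6800GT, per card
clock_gs = 500.0              # placeholder GS core clock in MHz

print(f"GT clock for equal fillrate: {equivalent_clock(pipes_gs, clock_gs, pipes_gt):.0f} MHz")
# -> 375 MHz here, i.e. each pipe is "worth" roughly 30-40 MHz at these clocks,
#    but only in fillrate-bound tests; shader- or CPU-bound tests won't scale this way.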

Now, is it right to assume that 3DM05 was more CPU dependent than 3DM06, or that 3DM05 had an implicit CPU dependence whereas 06 does not? How do you go about separating GPU frequency effects from CPU frequency effects and the number of active pixel pipelines? Is there any experiment which can isolate these three?
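
One way to actually run such an experiment would be a small factorial sweep with a least-squares fit on top. This is just a sketch of the design: the clock points are arbitrary and the scores array is a placeholder to be filled in from real runs.

# Idealized sketch: sweep GPU clock, CPU clock and pipe count, then fit
#   score = b0 + b1*gpu + b2*cpu + b3*pipes
# Nothing below is measured data.
import itertools
import numpy as np

gpu_clocks  = [400, 450, 500]     # MHz (arbitrary test points)
cpu_clocks  = [2000, 2400, 2700]  # MHz (arbitrary test points)
pipe_counts = [12, 16]            # 6800GS vs 6800GT

settings = list(itertools.product(gpu_clocks, cpu_clocks, pipe_counts))
scores = np.zeros(len(settings))  # fill in the measured 3DMark scores here

X = np.column_stack([np.ones(len(settings)), np.array(settings, dtype=float)])
(b0, b_gpu, b_cpu, b_pipes), *_ = np.linalg.lstsq(X, scores, rcond=None)
print(f"marks/GPU MHz: {b_gpu:.3f}   marks/CPU MHz: {b_cpu:.3f}   marks/pipe: {b_pipes:.1f}")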

I still can't understand how he scored more than me in the 06 graphics tests.

Of course, the simplest reason would be that I had a bad run. But let's not look at the obvious. :D

Maybe 3DM06 favours rendering ability over raw speed? I'm not sure what to say.
 
I rarely visit here, but,

How many times did you run your test?

Did you both run 1 test?

Maybe run an average of 3 and see how you fare?

just out of curiosity.

:attn:
 
Sure. I'll try that when I get hold of the dude. Nevertheless, it seems that things have changed with 06. :)
 
What setup exactly are you comparing against here? I don't think re-runs are necessary; 03/05/06 are usually very, very consistent, as little as a tenth of a percent variation or even less, I'd say.
 
There is no doubt that the 05 game test calculations included in the score are affected by CPU MHz and dual-core drivers. Maybe in 06 CPU power is separated into its own test(s) and doesn't factor into the graphics tests at all. A simple way to test would be running 06 at different CPU MHz and seeing what happens to the graphics tests, as in the sketch below.
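
Something like this, with made-up numbers just to show the arithmetic (two runs at different CPU clocks, GPU untouched):

# Placeholder numbers only -- the point is the arithmetic, not the data.
runs = {
    # CPU MHz: SM2.0 score from a run at that clock (hypothetical)
    2000: 2400,
    2700: 2480,
}
(c0, s0), (c1, s1) = sorted(runs.items())
marks_per_100mhz = (s1 - s0) / (c1 - c0) * 100
print(f"SM2.0 gain per extra 100 CPU MHz: {marks_per_100mhz:.1f} marks")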

What card clocks were you running? It looks like he ran almost 100 MHz less on the GPU in '06.
 
Maxi,
I'll try that and send out the results. I'm running 6800GS SLI and he is running 6800GT SLI. But how do shaders and pipes tie in with GPU MHz? If we can think up a test which will isolate pipe effects from CPU MHz, that would be nice. I understand that in the case of GPUs, pixel pipelines are not the same as CPU instruction pipelines, so I'm a bit confused about the exact nature of the relationship and whether they are even separable.
 
Hehe, check this out. If 3DM06 runs on only one core, I lose about 1000 pts. :)
Not surprisingly, the CPU score is halved. More surprisingly, the SM 3.0 and 2.0 scores are lower too! Am I right in concluding that 3DM06 isn't truly CPU independent even in the GPU tests? Or is this a problem with the OS?

[Image: cpuisolationtests1rq.jpg (single-core vs dual-core 3DM06 results)]
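
If anyone wants to repeat the one-core run without digging through Task Manager, here is a rough sketch using psutil; the process name "3DMark06.exe" is just a guess on my part, so check what Task Manager actually shows:

# Rough sketch: restrict an already-running 3DMark06 process to one core.
# Assumes psutil is installed; "3DMark06.exe" is a guessed process name.
import psutil

def pin_to_core(process_name: str, core: int = 0) -> None:
    """Set the CPU affinity of every matching process to a single logical core."""
    for proc in psutil.process_iter(["name"]):
        name = proc.info.get("name") or ""
        if name.lower() == process_name.lower():
            proc.cpu_affinity([core])   # e.g. [0] = first core only
            print(f"Pinned PID {proc.pid} ({name}) to core {core}")

if __name__ == "__main__":
    pin_to_core("3DMark06.exe", core=0)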
 
The shader tests that '06 uses are definitely CPU biased. I haven't done any thorough analysis to determine the extent, but with my slow CPU it's kinda hard to miss :D

I've been running '06 over and over for the past few hours to determine where the stability edge on it is. In the process, I've been keeping track of the scores as various things are tweaked.


FSB (MHz) / GPU (MHz) / MEM (MHz) == 3DMark / PS2.0 / CPU
147.0 / 395 / 320 == 696 / 339 / 596
147.0 / 398 / 315 == 696 / 339 / 593
147.0 / 400 / 317 == 698 / 341 / 593
148.5 / 405 / 317 == 707 / 345 / 603
149.5 / 405 / 317 == 714 / 348 / 603


The last two are the best example since everything but FSB is held constant, but really you can see the result with every bump. As you can see from my various 147 MHz runs, bumping the GPU by 5 MHz basically gave me 2 PS2.0 marks. The jump to 148.5 MHz (along with another 5 MHz GPU increase) nets 4 marks: 2 likely from the GPU, leaving 2 from the FSB. Going to 149.5 gives me 3 more marks, and since the GPU was held constant there, that result is from the FSB increase alone.
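
If you want the deltas spelled out, they fall straight out of the table; the split of credit between FSB and GPU is still just my reading of it:

# Deltas between consecutive runs; the numbers are copied from the table above.
runs = [
    # (FSB MHz, GPU MHz, PS2.0 score)
    (147.0, 395, 339),
    (147.0, 398, 339),
    (147.0, 400, 341),
    (148.5, 405, 345),
    (149.5, 405, 348),
]

for (f0, g0, s0), (f1, g1, s1) in zip(runs, runs[1:]):
    print(f"dFSB={f1 - f0:+.1f} MHz   dGPU={g1 - g0:+d} MHz   dPS2.0={s1 - s0:+d} marks")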

I don't know if it's more or less CPU sensitive than 03 was, but a faster one will certainly increase your Pixel Shader marks somewhat.

JigPu
 
Hey Jig, thanks for the results. I think the CPU influence will not show up in the scores once you crank up the AA and AF. To do that, you have to buy 3DM06. :)

Can you explain my core affinity results similarly? I think this may be a bit different.
 