Well, I recently upgraded my PSU and my video card (from a Radeon 9800 Pro), and I ran 3DMark05 to compare against my old scores. I did jump from scores in the 2700/2800 range to the 5100 range, but I noticed something that got me concerned.
When I was looking at all my comparison scores, I noticed that everyone else's scores with "similar configurations" listed their memory speeds anywhere between 1200 and 1500, whereas my memory speed consistently showed "627" MHz. The box specifically states that the memory clock is set at 1250 MHz (vs. the standard 1200 MHz).
I initially shrugged it off as just showing the base memory speed, meaning my effective speed was actually 1254 MHz or so, but I wondered why it was shown differently for other people's comparison scores on the 3DMark05 site.
I got to reading, installed both RivaTuner and ATITool, and now I'm really concerned.
Both of those consistently show that my clocks for 3D performance are set (overclocked by BFG, it appears) at 400 MHz core and 625 MHz memory.
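Just to sanity-check the doubling myself, the arithmetic does seem to work out - this assumes DDR's "effective" rate is simply twice the actual clock, which is my understanding; the snippet and variable names below are just mine:

```python
# Sanity check of the DDR doubling math (my assumption: effective = 2 x actual,
# since DDR memory transfers data twice per clock cycle).
core_actual = 400   # MHz core, as reported by RivaTuner/ATITool
mem_actual = 625    # MHz memory, as reported by RivaTuner/ATITool

mem_effective = mem_actual * 2  # two transfers per clock cycle

print(f"Core:   {core_actual} MHz")
print(f"Memory: {mem_actual} MHz actual = {mem_effective} MHz effective")
# Memory: 625 MHz actual = 1250 MHz effective -- exactly the 1250 MHz on the box,
# and 3DMark05's 627 MHz reading doubles to the ~1254 MHz I guessed at above.
```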
So am I supposed to see the overclocking utilities and/or the 3DMark site list my memory clock at the 1200-ish mark, or will they list 625 MHz and I just have to mentally double it? If it's the latter, why is 3DMark05 listing the "effective" clock speed for everyone else while showing mine at the actual clock speed?
I am pretty sure I saw someone post a screenshot of ATITool that actually showed their memory clock at the "doubled" rate in the utility's adjustment sliders - am I imagining things? I haven't been able to find that post again.
Anyway - I'm kind of going crazy wondering whether I got a gimped-down card or whether I'm overreacting - any help would be appreciated.
Thanks!