
Benchmarks vs. Overclocking


ClarkKent

I've been doing some benchmarking while overclocking my video card and ran into something I don't quite get. The following scores are from 3DMark01 on my 5200 non-Ultra card.

Core/Memory (MHz) | 3DMark01 score
------------------|---------------
280/416           | 3515
275/416           | 4651
270/416           | 5549
265/416           | 5828
260/416           | 5795

So now I have some questions. Why do I get the best score with my card at 265 as opposed to 280? Does it just generate too many errors in 3DMark01 to get a good score? And do these benchmarks translate into the real world (i.e., will games generate errors and perform worse at the higher setting, just as in the benchmark)?

I feel this is a general video card question rather than an Nvidia-specific one, which is why I have posted it here. Any explanation for these results would be great. TIA.
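To make the pattern in the table easier to see, here is a minimal Python sketch. The clocks and scores are copied from the table above; everything else is just illustrative bookkeeping that picks the clock with the highest score and shows how far each setting falls from that peak.

# Sweet-spot check for the 3DMark01 results above (scores from the table).
results = {280: 3515, 275: 4651, 270: 5549, 265: 5828, 260: 5795}  # core MHz -> score

best_clock = max(results, key=results.get)
peak = results[best_clock]
print(f"Peak score {peak} at {best_clock} MHz core")

for clock, score in sorted(results.items(), reverse=True):
    drop = 100 * (peak - score) / peak
    print(f"{clock} MHz: {score} ({drop:.1f}% below peak)")

Running this confirms 265 MHz as the peak, with 280 MHz scoring roughly 40% lower.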
 
You hit it on the head. :) As you increase your core clock, the card generates more errors, which it has to correct. That slows down the rest of the rendering process, lowering your 3DMarks. I'm not sure how well it translates into real-world apps, but a few-hundred-point difference in 3DMark is only a few frames per second at most. Not much to worry about.
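A rough way to picture that: treat throughput as scaling with core clock until errors start, then subtract a penalty that grows the further you push past the stable limit. This is only a toy model; the 265 MHz threshold comes from the table above, and the penalty rate is an invented number, not a measurement.

# Toy model of why scores fall past the stable clock (penalty figure is invented).
def effective_throughput(clock_mhz, stable_limit=265, penalty_per_mhz=0.04):
    # Below the stable limit, throughput scales with clock.
    # Above it, assume each extra MHz adds error-correction overhead.
    over = max(0, clock_mhz - stable_limit)
    return clock_mhz * max(0.0, 1.0 - penalty_per_mhz * over)

for clock in (260, 265, 270, 275, 280):
    print(clock, round(effective_throughput(clock), 1))

The output peaks at 265 and falls off above it, which is the same shape as the benchmark numbers.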

JigPu
 
So would you suggest staying lower, where the score is good, or pushing the card further, where the scores drop off? And would additional cooling allow a higher overclock while staying stable?
 
The higher score could very well be the card's "sweet spot," which is where the card will do the best that it can. Has anyone ever done better with a clock that yielded worse benchmark scores? It's never worked that way for me.
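If you want to find that sweet spot systematically, one approach is to step the core clock in small increments, run the benchmark at each step, and keep the clock that gives the best score. A hypothetical sketch only; set_core_clock and run_benchmark are placeholders for whatever overclocking tool and benchmark launcher you actually use.

# Hypothetical clock sweep -- set_core_clock() and run_benchmark() are placeholders.
def find_sweet_spot(clocks, set_core_clock, run_benchmark):
    scores = {}
    for clock in clocks:
        set_core_clock(clock)            # apply the clock via your overclocking tool
        scores[clock] = run_benchmark()  # run 3DMark01 and return the score
    best = max(scores, key=scores.get)
    return best, scores

# Example: sweep 250-285 MHz in 5 MHz steps.
# best, scores = find_sweet_spot(range(250, 290, 5), set_core_clock, run_benchmark)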
 