I've been doing some benchmarking while overclocking my video card and ran into something I don't quite get. The following are 3DMark01 scores for my 5200 non-Ultra card.
Core/Memory (MHz) | 3DMark01 Score
------------------|---------------
280/416           | 3515
275/416           | 4651
270/416           | 5549
265/416           | 5828
260/416           | 5795
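For what it's worth, here's the same data as a quick Python sketch that picks out the peak (just a tabulation of the numbers above; nothing card-specific):

# (core MHz, 3DMark01 score) pairs from the runs above; memory held at 416 MHz throughout
runs = {280: 3515, 275: 4651, 270: 5549, 265: 5828, 260: 5795}

# Find the core clock that produced the highest score
best_core = max(runs, key=runs.get)
print(f"Peak score {runs[best_core]} at {best_core} MHz core")  # Peak score 5828 at 265 MHz core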
So now I have some questions. Why do I get the best score with the core at 265 as opposed to 280? Does the higher clock just generate too many errors in 3DMark01 to get a good score? And do these benchmark results translate to the real world (i.e., will games also generate errors and perform worse at the higher setting, just as in the benchmark)?
I feel this is a general video card question rather than an Nvidia-specific one, which is why I've posted it here. Any explanation for these results would be great. TIA.