
3DMark03 Question


MetalStorm
Member
Joined: Jan 23, 2003
Location: England
Okay, I have a Ti 4600, and at stock speeds I get 1700 3DMarks using the computer in my sig... What I can't work out, though, is WHY the Radeons, especially the higher-end ones, score so much more. I mean, at stock they can get ~4000!!!
Yeah, I know that because the GF4s can't run the DX9 test they don't get the points for it, so let's say it would get 2000 in total...
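
As I understand it, the total is roughly a weighted sum of the four game tests' average FPS, so a card that can't run a test loses that entire term. A quick Python sketch of the idea; the weights and FPS figures below are made up for illustration, not Futuremark's actual coefficients:

[code]
# Hypothetical weights -- NOT Futuremark's published coefficients.
WEIGHTS = {"GT1": 7.0, "GT2": 37.0, "GT3": 47.0, "GT4": 39.0}

def total_score(fps_by_test):
    # A test the card cannot run contributes 0 FPS, so its whole term drops out.
    return sum(w * fps_by_test.get(test, 0.0) for test, w in WEIGHTS.items())

dx8_card = {"GT1": 90.0, "GT2": 12.0, "GT3": 10.0}                # no GT4 (DX9) result
dx9_card = {"GT1": 120.0, "GT2": 30.0, "GT3": 25.0, "GT4": 20.0}

print(round(total_score(dx8_card)))  # 1544 -- the missing GT4 term caps the score
print(round(total_score(dx9_card)))  # 3905
[/code]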

From my perspective it looks like 3DMark03 is totally biased towards ATI cards!!! In 3DMark2001 the 9700 Pro only scores ~10% or so more than a Ti 4600 when they're both at stock speeds, but 03 is a completely different ball game: the 9700 Pro scores over 100% more, AND the "games" don't even look particularly good (I'm not referring to their low FPS, just looking at them in terms of polygons and general niceness). Just watching it go, for example in the opening scene of the "Troll's Lair" test I get 15-16 FPS... WTF! It's not very complex at all; I'd expect that sort of quality to run at hundreds of FPS, not 15.

I also noticed I only get 800 megatexels on the single-texturing fill rate test, whereas I used to (and certainly should) get 1000-ish. I can't remember what I was getting for multitexturing, but either way, it looks like they have just reduced the speed nVidia cards run at by 20+%.
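
For reference, the theoretical peak is just core clock x pixel pipelines (x TMUs per pipe for the multitexturing figure), and measured numbers always come in somewhat below it. A quick sketch, assuming the usual Ti 4600 specs (300 MHz core, 4 pipelines, 2 TMUs each):

[code]
# Theoretical fill rate: MHz x pipelines (x TMUs per pipe) = megatexels/s.
# Ti 4600 specs assumed: 300 MHz core, 4 pipelines, 2 TMUs per pipeline.
def fill_rate_mtexels(core_mhz, pipelines, tmus_per_pipe=1):
    return core_mhz * pipelines * tmus_per_pipe

print(fill_rate_mtexels(300, 4))     # 1200 -- single-texturing peak
print(fill_rate_mtexels(300, 4, 2))  # 2400 -- multitexturing peak
[/code]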

Does anyone have good reasons for the above? Or did ATI slip a few notes into Futuremark's pocket?
 
It's a new test with newer techniques; if there weren't anything new in it, we'd all still be using some old test.

The reason you get a lower score is that you have an older card compared to the ATI cards. Of course newer products will score better in a test that is also newer; I would be surprised if they didn't.
 
It's not biased

Although it may seem like it favors ATI, it doesn't; it's actually very fair in the way it's written. If you have the time, go to Tom's Hardware and check out the article they have on this exact subject. The truth makes a lot more sense.

It breaks down like this: your Ti 4600 is a fast card, but technically only a DirectX 8 card. The Radeon 8500, 9700, and 9800 cards are DirectX 9 cards, and the funny part is that one test makes all the difference in the score. Pixel shaders are the problem. The GF4 cards only do pixel shader 1.1, while the 8500 will do 1.4 and the 9700/9800 will do 2.0, and this makes a huge difference in score. Do any games use this technology yet? Not really, so it's more a test of things to come than a test of current games.

Also, 3DMark03 uses non-optimized code, and ATI is better at rendering non-optimized code than nVidia. This may seem like a fair comparison, but most game developers optimize their code for certain cards, so in real life it doesn't matter. If you want a true comparison, compare framerates under similar conditions in a newer game like UT2k3; you will see that your card isn't so bad. In fact, I've always thought that nVidia cards were better at OpenGL than DirectX (big Quake fan), so try running UT2k3 in OpenGL and it will look better and perform great on your Ti card.
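
The reason the shader version matters so much is pass count: a PS 1.4 shader can do in one pass what the PS 1.1 fallback needs several passes (and several redraws of the scene geometry) to do. A rough cost model; the pass counts and polygon figures here are illustrative, not Futuremark's exact numbers:

[code]
# Rough per-frame cost model: every extra pass re-submits and re-shades
# the scene geometry. Pass counts and polygon counts are illustrative.
def frame_cost(lights, passes_per_light, polys_per_pass):
    return lights * passes_per_light * polys_per_pass

scene_polys, lights = 200_000, 4
print(frame_cost(lights, 1, scene_polys))  # PS 1.4 path: one pass per light
print(frame_cost(lights, 3, scene_polys))  # PS 1.1 fallback: ~3x the geometry work
[/code]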
 
Re: It's not biased

supergenius74 said:
The Radeon 8500, 9700, and 9800 cards are DirectX 9 cards

The Radeon 8500 is a DirectX 9 card? I thought it was DirectX 8.1?
 
Re: It's not biased

supergenius74 said:
Although it may seem like it favors ATI, it doesn't; it's actually very fair in the way it's written. [...]

I think the problem with the pixel shaders is that the Ti series actually supports PS 1.3. However, the 3DMark03 tests check for PS 1.4 support first, and if that's a no-go, they revert to PS 1.1. PS 1.1 is far slower than 1.3, so you get worse performance on a generally faster card.

Why didn't they code for 1.3? Who knows.
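
Presumably the check looks something like this (purely a sketch of the fallback described above, not Futuremark's actual detection code):

[code]
# The benchmark asks for PS 1.4; anything less falls straight through to
# PS 1.1, so a card whose best version is PS 1.3 never gets to use it.
def pick_shader_path(max_ps_version):
    if max_ps_version >= 1.4:
        return "ps_1_4"   # single-pass path (Radeon 8500 and up)
    return "ps_1_1"       # multi-pass fallback; PS 1.3 is never requested

print(pick_shader_path(1.3))  # GeForce4 Ti -> ps_1_1
print(pick_shader_path(2.0))  # Radeon 9700 -> ps_1_4
[/code]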
 
The bottom line is that this is a test for the 9500-9800/FX generation of cards or newer, the same way 3DMark2001 didn't compare a GF2 Ultra and a GF3 fairly.
 