
Futuremark releases 3dmark 2003 patch. nVidia officially cheated.

I'm not going to believe anything about this card until it is on store shelves and a good, unbiased review has been done on it.

None of this pro-nVidia ID crap or pro-ATI 3DMark crap.


I want to see a bunch of game benchmarks: no AA & AF, and all combos from 2x to 16x respectively.

I want AF and AA quality comparisons. No more apples-to-oranges bulls$%t.


A nice, unbiased, thorough review of a finished product and finished drivers.

As a consumer, is that too much to ask, or is there not a site on the net that can handle such a simple task?
 
Thank god I didn't get sucked into the nVidia 5900 hype... I will now wait and see its true performance against the 9800.
 
OC Noob said:
I'm not going to believe anything about this card until it is on store shelves and a good, unbiased review has been done on it.

None of this pro-nVidia ID crap or pro-ATI 3DMark crap.


I want to see a bunch of game benchmarks: no AA & AF, and all combos from 2x to 16x respectively.

I want AF and AA quality comparisons. No more apples-to-oranges bulls$%t.


A nice, unbiased, thorough review of a finished product and finished drivers.

As a consumer, is that too much to ask, or is there not a site on the net that can handle such a simple task?

Sad to say... for some sites you are indeed asking too much, 'cause many are biased, lazy, or lack the knowledge to do proper apples-to-apples tests. Even big 'respectable' sites swallow and then regurgitate blatant PR lies and impossibilities from some companies, which clearly shows laziness, bias, lack of knowledge, or a combination of all three.
 
Well, nVidia has now made a comment about this issue, and they didn't say that they didn't cheat. Their comment, in fact, makes it look like they did cheat:

"Since Nvidia is not part of the Futuremark beta program (a program which costs of hundreds of thousands of dollars to participate in), we do not get a chance to work with Futuremark on writing the shaders like we would with a real applications developer," the representative said. "We don't know what they did, but it looks like they have intentionally tried to create a scenario that makes our products look bad."

Sounds like they are trying to say that Futuremark was cheating them, and they make no mention of whether they did or didn't implement those 'optimizations', which of course leads me to believe that they did.

Read the rest of the story here
 
Too bad the ORB doesn't show ATI running things in 24 bit that NVidia runs in 32.

Too bad the ORB doesn't show links to game boards where people repeatedly have problems running older ATI cards like the 7500 with ATI's latest drivers.

My goodness, you mean to tell me there is more to buying a video card than a 3DMark score?
 
That is in fact a very slanderous statement which can easily be shown to be a blatant lie. I actually doubt that someone at nVidia could make such a silly statement to the press. That is a massive lawsuit and a larger disaster waiting to happen.

I do note that I don't see anyone's name attached to the statement. This is serious for the website, which will be liable if they can't show a source.
 
WuChild said:
Too bad the ORB doesn't show ATI running things in 24 bit that NVidia runs in 32.

Too bad the ORB doesn't show links to game boards where people repeatedly have problems running older ATI cards like the 7500 with ATI's latest drivers.

My goodness, you mean to tell me there is more to buying a video card than a 3DMark score?

Actually, you will find that NV30 and NV35 default to 16 bit while ATI uses the DX9 standard of 24 bit. So if the ORB decided to show that data, it would only look worse for Nvidia.
 
WuChild said:
Too bad the ORB doesn't show ATI running things in 24 bit that NVidia runs in 32....

Please reread the comments made in regard to this statement the last time you said that in this thread, mainly mine.

Perhaps you missed the point, but ATi's use of 96-bit colour for floating point precision is not cheating, and if nVidia wanted to they had the choice of using 96-bit colour rather than 128-bit colour as well. nVidia, on the other hand, is outright cheating by using multiple detection routines to score higher in a benchmark that they say is irrelevant and useless.
 
Sounds like they are trying to say that Futuremark was cheating them, and they make no mention of whether they did or didn't implement those 'optimizations', which of course leads me to believe that they did.

So if ATI had been out of the beta and the pixel shader stuff had been written so that it wouldn't run in 24 bit but was forced to run in 32 bit, the ATI cards would not even have been able to run the test.

But since the pixel shaders are being forced to run at 32-bit precision instead of the optional 16 bit on the NVidia cards, they score lower than ATI's because their cards aren't flopping as many numbers... yes, totally fair.

Instead of just shouting out that NVidia is 'cheating', take a gander at the facts: those ATI scores are higher because they run stupid amounts of pixel shaders with less precision. Then, when NVidia said 'wait a minute, why not force their test to use our optional 16-bit precision, like a game that wanted to give its players the best possible gameplay and quality probably would?'... all people see are the headlines "NVidia cheated".

Just because NVidia doesn't want to spend 100 grand on beta participation in a benchmark test that is only one of many doesn't mean Futuremark needs to screw them with pixel shader rendering.
 
The DX9 standard is 96-bit colour (24 bits per channel); that has nothing to do with Futuremark or 3DMark 2k3. If a card wants to be DX9 compliant it has to use 96-bit or higher. The GFFX can do 128-, 64- or 32-bit colour precision. (Note that all of the cards still display images in a mere 32-bit colour; this is just floating point colour precision, which is downsampled to 32 bit (8 bits per channel).) If ATi was not part of the beta program the minimum would still be 96-bit, as that was determined by Microsoft.

If nVidia uses 64-bit colour precision (16 bits per channel) then it is not DX9 compliant and would not be able to run the DX9 test; therefore nVidia must run at 128-bit floating point colour precision to complete the DX9 test. Again, this was nVidia's choice, and whether 96-bit or 128-bit colour is used has virtually no effect on the final 32-bit image; most people say ATi has better image quality anyway.
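To put some rough numbers on that last point, here is a back-of-the-envelope sketch in Python (a toy model using the usual mantissa sizes for fp16/fp24/fp32, not real GPU code): it rounds a colour channel to the precision of each format and then quantises it to the 8 bits per channel that actually reach the screen.

```python
# Toy model: simulate limited shader precision, then quantise to the 8-bit
# value the monitor actually gets. Mantissa sizes are the usual figures for
# fp16 (10 bits), the DX9 fp24 minimum (16 bits) and fp32 (23 bits).
def to_8bit(value, mantissa_bits):
    step = 2.0 ** -mantissa_bits           # crude spacing of representable values near 1.0
    rounded = round(value / step) * step   # value as the shader would store it
    return int(round(rounded * 255))       # downsample to 8 bits per channel for display

channel = 0.73291
for name, bits in [("fp16", 10), ("fp24", 16), ("fp32", 23)]:
    print(name, to_8bit(channel, bits))    # all three print the same 8-bit value here
```

For a single operation like this, 24-bit and 32-bit precision land on exactly the same 8-bit output; the extra precision only starts to matter when rounding error accumulates over long shader programs, which is where 16-bit partial precision can begin to show artifacts.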

EDIT:

I found a review of the Radeon 9800 256MB vs. the FX 5900 Ultra with 3DMark 2k3 AND 3DMark 2k3 build 330; the results speak for themselves: Linky
 
To me, it seems that nVidia was so confident in their cards in 2002 that they thought they'd one-up ATI with 128-bit precision and go beyond the DX9 spec (like their DX9+ pixel shaders and the .13 core). Then, when they got a look at the 9700, they tried to make up the performance gap with the dust buster, and gave up. The 5900 is based on the same core, so it's stuck with the same precision. They have nothing that can decisively win back the 3D performance crown at this time, so the best they can do is discredit 3DMark and try to inflate their scores at the same time. It seems that their recent problems are a product of their own arrogance.

Of course most of us know that a 3DMark score isn't everything, but it still counts for something. I'm sure that most people (who aren't brainwashed by marketing/branding) looking at a high-end card are familiar with 3DMark and will look at the card's 3DMark scores and weigh them along with other performance comparisons.
 
WuChild... I am not sure where you are reading your info, but I am sorry to say that it is incorrect. 96-bit is the DirectX 9 standard, as decreed by Microsoft a long time ago; it is nVidia who decided that they wanted to go beyond it (mind you, ATI can do 128-bit in some situations). It is therefore ludicrous for nVidia or anyone else to complain that their 128-bit is slower and that therefore everyone is cheating or unfair :rolleyes:. It is their fault for ignoring the standards!!

The DX9 spec states that for a card or driver to be certified it must be capable of at least 24-bit precision. 16-bit, also known as partial precision, is only allowed in a few situations. But the nVidia geniuses decided to make 16 bit and 32 bit their settings. Again deviating from the norm, they cry foul when they have to run 32-bit and ATI beats them, and then try to bend the rules to allow 16-bit as their default.

The whole beta partner thing is ridiculous! nVidia was a member until just before the benchmark was released, then pulled out because they were unable to force the INDUSTRY (not just Futuremark) to use their backdoor routines and to weaken and dilute the DirectX 9 standards which everyone agreed would be the basis for the DirectX 9 benchmark.

nVidia's whole problem stems from narcissistic arrogance; they believed that they had the power to ignore the industry, standards, consumers, gamers and competitors. They felt that no one could ever challenge their lofty position on top of the vcard hill... the 9700 Pro changed that! They further felt that they would always be top dog and be able to bully the world into bending the rules for them and turning a blind eye to their errors and behaviour. Well, they are clearly wrong! They don't have the clout of Microsoft, Intel or IBM, and in fact even these companies can't push the hardware world around like that. nVidia had obvious delusions of grandeur; imagine that little nVidia is even trying to force Bill Gates' hand... roflol. So now that such tactics aren't working, they have resorted to cheating... oops, driver bugs; calling everyone else unfair; calling everyone else cheats; trying to discredit benchmark programs which they don't like; pressuring reviewers into 'the way it is meant to be benchmarked'; sticking "The way it is meant to be played" on titles that actually do better on ATI cards; telling lies galore; false advertising; the dustbuster; proprietary extensions and CineFX; begging and bribing developers to disregard DirectX and OpenGL standards; and of course overcharging customers for underperforming vcards with poor IQ and cheating (oops, buggy) drivers!

The good thing is that there is nothing there that I cannot back up with facts. nVidia is a spoilt and lying child that needs to simply shut up and improve their graphics division.

Edit: Movax... I agree with you.
 
WuChild... I am not sure where you are reading your info, but I am sorry to say that it is incorrect

Which part is incorrect?

So according to Futuremark, ATI is a better card because it runs things in 96 bit color instead of 128. But people don't see this in the score. Now my question is, do you think NVidia is stupid for dealing with 128 bit color that nobody can see the difference in anyway vs 96 bit? Or do you think ATI is smart for not doing more than they have to?
 
WuChild said:

So according to Futuremark, ATI is a better card because it runs things in 96 bit color instead of 128.

No. It runs things in accordance with the DX9 spec. It was nVidia's choice to go beyond the spec. :rolleyes:

WuChild said:

Now my question is, do you think NVidia is stupid for dealing with 128 bit color that nobody can see the difference in anyway vs 96 bit? Or do you think ATI is smart for not doing more than they have to?

I don't feel like typing right now, so I'm just going to say yes to both questions.
 
Yes, nVidia made a very bad decision going with 128-bit instead of having a 96-bit fallback. If they had a decent 24/96-bit fallback and then went ahead with a decent-performing proprietary 32/128-bit setting, I would be loudly praising them. But adding extra features while not supporting, or while diluting, the standards is not a very bright idea. Furthermore, if even with these cheats and decreased IQ settings they still lose a significant number of tests, then things look even worse.

And to be truly correct, Futuremark doesn't get into the "whose card is better" arguments. It is up to the card maker to follow the INDUSTRY STANDARDS and let the benchmark do the talking, rather than irate PR executives or fanboys.

EDIT: And by the way, I am still waiting for anyone to challenge the veracity of my statements above (/dons asbestos suit)
 
ATI responds to the accusation that they also cheated:

"The 1.9% performance gain comes from optimization of the two DX9 shaders (water and sky) in Game Test 4. We render the scene exactly as intended by Futuremark, in full-precision floating point. Our shaders are mathematically and functionally identical to Futuremark's and there are no visual artifacts; we simply shuffle instructions to take advantage of our architecture. These are exactly the sort of optimizations that work in games to improve frame rates without reducing image quality and as such, are a realistic approach to a benchmark intended to measure in-game performance. However, we recognize that these can be used by some people to call into question the legitimacy of benchmark results, and so we are removing them from our driver as soon as is physically possible. We expect them to be gone by the next release of CATALYST."

I got this from beyond3d.com's forums.

Now that is the sort of answer a consumer should expect from a hardware company. What ATI is saying there is more in line with an optimisation than a cheat, but they have decided to pull it from future drivers anyway.
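Just to illustrate what "shuffling instructions" means, here is a made-up toy example in Python (nothing to do with ATI's actual shader code): both functions do the exact same operations, one just reorders the independent ones so a hypothetical pipeline could overlap them better, and the results are bit-for-bit identical.

```python
# Made-up illustration of instruction reordering: same math, same result,
# different issue order. A real shader compiler does this to hide latency.
def shader_original(a, b, c, d):
    x = a * b        # op 1
    y = x + c        # op 2 depends on op 1, so it may stall waiting for x
    z = c * d        # op 3 is independent of ops 1 and 2
    return y + z

def shader_shuffled(a, b, c, d):
    x = a * b        # independent multiplies issued back to back...
    z = c * d
    y = x + c        # ...dependent add moved later, hiding its latency
    return y + z

# Identical operations in a different order give the identical answer.
assert shader_original(0.2, 0.5, 0.7, 0.9) == shader_shuffled(0.2, 0.5, 0.7, 0.9)
```

That is the line ATI is drawing: reordering work without changing the output is an optimisation, whereas detecting the benchmark and skipping work (or dropping precision) so that something different gets computed is a cheat.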
 
Interesting. It seems to me that ATi's optimizations were real (read: not a cheat), and it's kind of sad that they won't be optimizing for 3DMark in the future. If you understand what nVidia's drivers did and what ATi's drivers did, then you probably won't argue with that.

I just checked Tom's to see what they had to say about all of this, and they had that ATi quote along with the full nVidia quote; apparently the quote that I found earlier was missing one sentence (the last one below):

We don't know what they did but it looks like they have intentionally tried to create a scenario that makes our products look bad. This is obvious since our relative performance on games like Unreal Tournament 2003 and Doom3 shows that The GeForce FX 5900 is by far the fastest graphics on the market today.

Hmm, I don't know what review they were reading, but that's not exactly what I saw (I'm not denying that the 5900 is quite competitive with the 9800).

Tom's article was pretty good and worth a read; here it is. They had some interesting things to say, such as:

Depending on which sources you listen to, Nvidia, while still participating in the Futuremark beta program, had access to the last code drop before the final code was released for 3DMark03. Several sources have seemed to indicate that the differences between the last code that Nvidia had access to and the final build were very minimal.

which, of course, backs up what CowboyX said and also makes you think... if nVidia was part of the beta team, then why are they saying it's too expensive to be a member? If they were a member, they obviously already paid to be a member. So rather than remaining part of the beta team, they just let Futuremark keep nVidia's money and walked away... now that is what I would call a waste of money, although we all know 100 grand is pocket change to both nVidia and ATi.
 
I have come to the point where benchies mean little anymore. If this cheating is true, shame on them, but nobody knows for sure if ATI is cheating either until they get busted. ATI boys, insert flames here - my point is we should not hinge our buying on whether this one scores 5000 and that one scores 5010, but rather base it on the whole package. I like my 4800SE Turbo very much, but the next one just might be an ATI, and not because of a score.
 
So, looking back to PreservedSwine's post on the first page, what nVidia are doing is optimising the benchmark? As you said, they are clipping the sky and using a more efficient pixel shader... so basically what has happened is that Futuremark have made a crap benchmark, because it's not in the least bit optimised... What could happen is they could just have a large flat wall, and put loads of things behind it which render using loads of inefficient pixel shaders that you never see, and call it a benchmark?

As far as I'm concerned there is nothing wrong with optimising anything; it can only be a good thing. It just seems it's only nVidia that have taken the initiative.
 
MetalStorm said:
What could happen is they could just have a large flat wall, and put loads of things behind it which render using loads of inefficient pixel shaders that you never see, and call it a benchmark?
Yes, they could have. The purpose of the benchmark is not to provide pretty pictures; it's to subject video cards to the exact same stress to see how they perform relative to each other. NV cheated by avoiding some of the benchmark's workload. What NV did is akin to a marathon runner taking an illegal shortcut instead of running the whole race.
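A silly toy model in Python (hypothetical numbers, nothing measured from real cards) shows why a fixed workload is the whole point of a benchmark: a driver that detects the test and quietly skips part of the work posts a higher score without being any faster at the real thing.

```python
# Two "cards" with identical raw speed. One driver does all the work the
# benchmark issues; the other detects the benchmark and skips the 20% of
# shading hidden behind static clip planes (work a real game could not skip,
# because the player can point the camera anywhere).
SHADER_OPS_PER_FRAME = 1_000_000      # hypothetical work the benchmark issues per frame
CARD_SPEED = 50_000_000               # hypothetical ops per second, same for both

honest_fps = CARD_SPEED / SHADER_OPS_PER_FRAME            # runs the full course
cheating_fps = CARD_SPEED / (SHADER_OPS_PER_FRAME * 0.8)  # takes the shortcut

print(f"honest driver:   {honest_fps:.1f} fps")    # 50.0 fps
print(f"cheating driver: {cheating_fps:.1f} fps")  # 62.5 fps from identical hardware
```

Same hardware, a 25% higher score, and zero extra performance in anything that isn't this one benchmark; that is the marathon shortcut.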
 