
Thoughts / ?s on driver 'cheating'


InThrees

Member
Joined
Feb 14, 2003
Location
Southeast US
Before you get too far into this, know first that I consider the methods nVidia employed strictly for 3dmark03.exe (excluding all non-visible data) to be closer to 'cheating' than anything else, but the whole thing raised some questions in my mind.
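To make that parenthetical concrete, here is a purely hypothetical sketch of the 'skip what the camera never sees' idea (every name and number in it is invented for illustration): because the benchmark's camera path is fixed, a driver can decide per frame what will never reach the screen and simply not render it.

Code:
# Hypothetical sketch only - the frame numbers, object names, and skip table
# are made up. The point is that the skip list is keyed to the benchmark's
# canned camera path, not computed from the actual camera.
PRECOMPUTED_SKIP = {
    100: {"sky_back", "terrain_far"},
    101: {"sky_back"},
}

def draw_frame(frame_no, scene_objects, draw):
    skipped = PRECOMPUTED_SKIP.get(frame_no, set())
    for obj in scene_objects:
        if obj in skipped:
            continue  # work never submitted - "free" speed on the scripted path
        draw(obj)

# Move the camera off the scripted path and the skipped geometry is simply
# missing from the frame, which is how this sort of thing gets caught.
draw_frame(100, ["sky_back", "terrain_far", "player"], print)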

Note, this thread is NOT a bashing thread in either direction, if you're spoiling for more fights or a soapbox, go away.

Most of us remember 'quack.exe'. We remember how Quake 3 was a very popular enthusiast benchmark, and we remember the distaste we all experienced when the whole quack/quake thing was revealed, and ATI was 'busted'.

Recently, ATI admitted to doing something similar for 3dmark03, specifically, reordering shader instructions to take advantage of their architecture and make shader operations more efficient.

Now I'm wondering if this is a bad thing for games.

The ATI dude said "These are exactly the sort of optimizations that work in games to improve frame rates without reducing image quality and as such, are a realistic approach to a benchmark intended to measure in-game performance."
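If it helps to picture what that sort of optimization can look like, here is a toy sketch of reordering: the results are identical, only the order of independent work changes so nothing sits idle waiting on a slow texture fetch. This is not ATI's actual scheduler - the instruction format is invented for illustration.

Code:
# Toy illustration only - real shader scheduling is far more involved.
program = [
    ("tex", "r0", "t0"),        # texture fetch, high latency, writes r0
    ("mul", "r1", "r0", "c0"),  # depends on r0
    ("add", "r2", "v0", "c1"),  # independent of r0
]

def hoist_independent(prog):
    """Same results, just schedule independent work before dependent work."""
    fetch = prog[0]
    written = {fetch[1]}
    independent = [ins for ins in prog[1:] if not set(ins[2:]) & written]
    dependent = [ins for ins in prog[1:] if set(ins[2:]) & written]
    return [fetch] + independent + dependent

for ins in hoist_independent(program):
    print(ins)  # tex, add, mul - the add now hides some of the fetch latency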

It seems to me that some of the 'cheats' are merely reactions to different methods employed by games to get an image on the screen. If, say, MOH:AA used a slightly different algorithm than, say, Jedi Knight 2, wouldn't you want optimizations for each if you played both?

Personally, I'd want 'intelligent' drivers that recognized that each game handled certain tasks a little differently, and compensated for it.

Note, I do not condone nVidia's or ATI's 'tweaks' of their drivers to get inflated results from 3dmark03, and here's why:

1. Optimizing a driver for a bona fide application people will buy is a good thing, because people can go home and duplicate the results. By switching drivers, they can get better performance from that application.

2. Optimizing a driver for a benchmark gives people erroneous expectations. People assume they can switch to the new driver and get similar gains in their applications, not just in the benchmark. That's because a benchmark is supposed to be an indication of all-around performance (even though frequently it isn't), not an application unto itself. Not too many people eagerly await quitting time to go home and spend a marathon 'benchmarking session.' (There are some, but they're not the norm. ;))

So here are my questions:

If nVidia or ATI had done a similar 'tweak' to a different engine, say the Unreal II engine or the Quake III engine, that resulted in a 25% performance boost with no IQ drop in that application ONLY, would you find that fishy? Or good?

After my explanation above, does anyone think that what nVidia did is not bad, or at least harmful to the consumer?

Personally, I have to confess that I went along with the bandwagon back in the quack days and lambasted ATI. Now I think they were just onto something. I think they should have done the same thing with more popular titles to solidify their position, and then someone at ATI could have said:

"But Kyle, you can do similar things with these other popular PC titles, and get similar results. So what?

Instead, the appearance really was bad.
 
I wouldn't have a problem with what Nvidia did if there were no IQ drop. Unfortunately, what ATI did only gave a 2.5% increase, while what Nvidia did gave a 25% increase.

I don't consider what Ati did with 3dmark to be cheating. But I do consider what they did with Quake to be cheating and what Nvidia did with 3dmark to be cheating.
 
I'm very much with you. The problem I have is that you KNOW their goal isn't to make games faster, it's to make benchmarks faster.

I wasn't around (in computers) when ATI "cheated", but considering 3dmark has been the main benchmark for many years, I can presume this was at the point where Nvidia was king. So what did ATI do? Apparently, they cheated.

The FX was...not up to par, I think we can all agree on that. Thus Nvidia appeared to be de-crowned. So what did they do? They cheated.

And for what, you ask? So I'm more willing to buy their product. I can see your theory here, yes, it may help in games. But they didn't do it like that; from what I understand, the only thing they improved was a benchmark. And from what I also understand, it came with a loss in image quality. Now, I'm only running a Ti4600, and I don't have frame rate issues, so if they want to improve anything...give me some better-looking graphics.

That is my two cents...
 
InThrees said:
After my explanation above, does anyone think that what nVidia did is not bad, or at least harmful to the consumer?

Of course it's bad for the consumer. A lot of people use a series of benchmarks to evaluate video cards. If we as consumers allow this, then video cards might as well be like PSUs, where companies can put any ridiculous number on the side and say it's true.

A 550W PSU that really only puts out an average of 150W, to go along with the video card that says it gets 500 FPS in DIII when it really only gets 25 at normal quality.

Benchmarks aren't perfect, but they are all we have and if everyone starts doing these kinds of things then we won't even have that.

Whether it's Quake III or 3DMark, it hurts consumers.
 
Games

Great, bring it on - I want my games to run as fast as possible. You may argue that this doesn't show you how the card actually performs, but what do you know...there's that very card doing it with its optimizations.

Let me stop you there, though - I want them to look good too...

I already have an option within my driver letting me pick IQ/performance on a handy little slider - what use is that if the driver is making these decisions anyway?

benchmark

I think this totally undermines benchmarks and makes their value to the community very fickle. Sure, many may already say that benchmarks don't really provide accurate results and that they would rather see a plethora of in-game benchmarks instead, but they are still there, and for their part they are still benchmarking systems fairly.

Because they are such a controlled environment, we have seen how easy it is for certain companies to exploit them to their own ends.

conclusion

I have been trying to draw the line where I think optimisations and cheating truly separate.

My own personal view is that if you are using features of the card that the game is not using, and that boosts performance...you are optimising.

If you are just sacrificing image quality, without the user's consent, to get that extra competitive edge - for me, that's a cheat.

ahem,
Dan
 
timmyqwest said:
I wasn't around (in computers) when ATI "cheated", but considering 3dmark has been the main benchmark for many years, I can presume this was at the point where Nvidia was king. So what did ATI do? Apparently, they cheated.

The FX was...not up to par, I think we can all agree on that. Thus Nvidia appeared to be de-crowned. So what did they do? They cheated.

When ATI 'cheated', the Radeon 8500 was months late, and it still got beaten in Quake III and other benchies by the GeForce3. It was much like the intro of the FX 5800, actually.
 
funnyperson1 said:
When ATI 'cheated', the Radeon 8500 was months late, and it still got beaten in Quake III and other benchies by the GeForce3. It was much like the intro of the FX 5800, actually.

Like I said, I didn't know the history of it all, but hearing that doesn't surprise me.
 
If only benchmarks would measure image quality instead of the almost meaningless (but oh-so-easy-to-measure) framerate. Anybody could make a video card that put out a thousand frames of black for the whole 3DMark demo.
 
Until someone comes up with a mathematical process for evaluating things like hue and blur in screenshots, all IQ evaluations will be by eye.

Problem with that is, some reviewers have dollars obscuring their vision.
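For what it's worth, a crude version of that mathematical process isn't hard to sketch: compare a reference screenshot against the driver's output and report a single number like PSNR. This is only a rough, hypothetical example (the file names are invented), and it can't capture everything a trained eye would catch - but it would at least flag frames worth a closer look.

Code:
# Rough sketch: per-pixel comparison of two screenshots of the same frame.
import numpy as np
from PIL import Image

def psnr(reference_path, test_path):
    ref = np.asarray(Image.open(reference_path).convert("RGB"), dtype=np.float64)
    test = np.asarray(Image.open(test_path).convert("RGB"), dtype=np.float64)
    if ref.shape != test.shape:
        raise ValueError("screenshots must be the same resolution")
    mse = np.mean((ref - test) ** 2)
    if mse == 0:
        return float("inf")  # bit-identical frames
    return 10 * np.log10(255.0 ** 2 / mse)

# A sudden PSNR drop between driver revisions on the same scene would be the
# cue to go and inspect the frames by eye.
# print(psnr("reference_frame100.png", "driver_frame100.png"))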

I have to agree with ninthebin - optimizations that result in more efficient execution of code, etc, are good things.

"Optimizations" that result in the driver ignoring IQ selections, or taking the easy road solely to boost framerates are not good things.
 
Ununquadium114 said:
the whole cheating thing is getting real old

That sounds just like something somebody who wanted to brush it under the carpet would say.

For me, it's a very real issue. ATI did nothing wrong with Quake 3 - if they hadn't tried to hide it, and had told people upfront, it would have been accepted, and all would be well.

Nvidia, however, have cheated and lied their way through this, and all people want to do is quietly forget about it.

Are Nvidia nice people?
Like hell they are.

Do they put the interests of their customers before themselves?
NO. Selling an FX 5200 for more than a GF4 Ti 4200 - is that putting the customers first, or is it pulling the wool over their eyes?

Do they try to get the most out of every game they can WITHOUT reducing IQ?
That's a big no.

This shouldn't be dropped. If it is, it will keep happening.
 
InThrees said:
Before you get too far into this, know first that I consider the methods nVidia employed strictly for 3dmark03.exe (excluding all non-visible data) to be closer to 'cheating' than anything else, but the whole thing raised some questions in my mind.

Most of us remember 'quack.exe'. We remember how Quake 3 was a very popular enthusiast benchmark, and we remember the distaste we all experienced when the whole quack/quake thing was revealed, and ATI was 'busted'.

[snip]

A few things you should know.

1) Application detection is widely used. Driver tweaks are a normal part of the industry. John Carmack has commented on this as well, along with several other big-name game programmers. They state that what ATI did is exactly the type of optimization used in a game, and that Nvidia's actions, if used in a game, would be deplorable.
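For anyone unclear on the mechanism, application detection at its simplest just keys driver behavior off the executable's file name - which is exactly why renaming quake3.exe to quack.exe was a meaningful test. A purely illustrative sketch (the profile table and its settings are invented):

Code:
# Illustrative sketch only - no real driver is this simple.
import os
import sys

APP_PROFILES = {
    "quake3.exe":   {"texture_lod_bias": 1, "reorder_shaders": True},
    "3dmark03.exe": {"reorder_shaders": True},
}

def pick_profile(exe_path):
    name = os.path.basename(exe_path).lower()
    # Rename quake3.exe to quack.exe and this lookup misses, so any change in
    # speed or image quality after the rename came from the profile, not the game.
    return APP_PROFILES.get(name, {})

if __name__ == "__main__":
    print(pick_profile(sys.argv[0]))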

The Quake optimization is still used, and it had been in the (6) driver revisions before its "discovery." A lot of people only know about half the story, so I'll tell it again, briefly.

The optimization caused no texture blurring on the R100, and when the R200 was released, it caused a total of (5) textures to become blurred on the R200 ONLY. ATI didn't notice this, but Nvidia did, and leaked (behind closed doors) the tools and info on how to expose the "cheat" to anyone who would listen, to ruin the release of the R200.

Several websites saw it for what it was, a bug and not a cheat, and would not run the story as Nvidia wanted, but some took the bait. The story was run, and first impressions seem to last. In the following few days, much damage was done to ATI. As for the facts that followed, no one seemed to care; the damage had already been done, and no one was really interested in what had actually happened.

The subsequent driver release from ATI still contained the application detection (and still does to this day, though now it's been expanded to cover any game that uses that engine), fixed the whopping (5) blurry textures, and also DIDN'T SLOW the fps.

Now, if it was a cheat, don't you think they would have sacrificed quality for speed, and not simply lowered the IQ for no reason? It's pretty obvious to anyone who knows what happened that it was a simple bug....a bug that didn't show up on the R100, but, because of the different architecture of the R200, went unnoticed there.

If nVidia or ATI had done a similar 'tweak' to a different engine, say the Unreal II engine or the Quake III engine, that resulted in a 25% performance boost with no IQ drop in that application ONLY, would you find that fishy? Or good?
Of course, it's not only good, it's GREAT!!
 