Thoughts / ?s on driver 'cheating'

Old 06-07-03, 02:04 PM Thread Starter   #1
InThrees
Member

 

Join Date: Feb 2003
Location: Southeast US

 
Thoughts / ?s on driver 'cheating'


Before you get too far into this, know first that I consider the methods nVidia employed strictly for 3dmark03.exe (excluding all non-visible data) to be closer to 'cheating' than anything else, but the whole thing raised some questions in my mind.

Note: this thread is NOT a bashing thread in either direction. If you're spoiling for more fights or a soapbox, go away.

Most of us remember 'quack.exe'. We remember how Quake 3 was a very popular enthusiast benchmark, and we remember the distaste we all experienced when the whole quack/quake thing was revealed, and ATI was 'busted'.

Recently, ATI admitted to doing something similar for 3dmark03, specifically, shuffling instructions around to take advantage of their architecture to make shader operations more efficient.

Now I'm wondering if this is a bad thing for games.

The ATI dude said "These are exactly the sort of optimizations that work in games to improve frame rates without reducing image quality and as such, are a realistic approach to a benchmark intended to measure in-game performance."

It seems to me that some of the 'cheats' are merely reactions to different methods employed by games to get an image on the screen. If, say, MOH:AA used a slightly different algorithm than, say, Jedi Knight 2, wouldn't you want optimizations for each if you played both?

Personally, I'd want 'intelligent' drivers that recognized that each game handled certain tasks a little differently, and compensated for it.
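To make that concrete, here is a rough sketch of the kind of per-application lookup I have in mind. This is just illustrative Python, not how any real driver is written, and the executable names and settings in it are made up:

Code:
# Illustrative sketch only: a per-application profile lookup of the kind a
# driver *could* use. The game executables and settings below are invented.

DEFAULT_PROFILE = {
    "texture_filtering": "trilinear",
    "shader_reordering": False,
    "occlusion_culling": "normal",
}

# Hypothetical per-game overrides, keyed by the executable the driver detects.
APP_PROFILES = {
    "mohaa.exe": {"shader_reordering": True},      # MOH:AA-style engine
    "jk2sp.exe": {"occlusion_culling": "high"},    # Jedi Knight 2-style engine
}

def profile_for(exe_name):
    """Return the settings the driver would apply for this executable."""
    profile = dict(DEFAULT_PROFILE)
    profile.update(APP_PROFILES.get(exe_name.lower(), {}))
    return profile

print(profile_for("MOHAA.EXE"))    # per-game tweaks applied
print(profile_for("unknown.exe"))  # unknown game falls back to the defaults

The only point of the sketch is the lookup itself: the tweaks are keyed off which executable is running, which is also exactly why the same mechanism can be pointed at a benchmark's .exe.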

Note, I do not condone nVidia's or ATI's 'tweaks' of their drivers to get inflated results from 3dmark03, and here's why:

1. Optimizing a driver for a bona fide application people will buy is a good thing, because people can go home and duplicate the results. By switching drivers, they can get better performance from that application.

2. Optimizing a driver for a benchmark gives people erroneous expectations. People assume they can switch to the new driver and get similar gains in their applications, not just in the benchmark. That's because a benchmark is supposed to be an indication of all-around performance (even though frequently it isn't), not an application unto itself. Not too many people eagerly await quitting time to go home and spend a marathon 'benchmarking session.' (There are some, but they're not the norm.)

So here are my questions:

If nVidia or ATI had done a similar 'tweak' for a different engine - say, the Unreal II engine, or the Quake III engine - that resulted in a 25% boost to performance with no IQ drop, in that application ONLY, would you find that fishy? Or good?

After my explanation above, does anyone think that what nVidia did is not bad, or at least harmful to the consumer?

Personally, I have to confess that I went along with the bandwagon back in the quack days and lambasted ATI. Now I think they were just onto something. I think they should have done the same thing with more popular titles to solidify their position; then someone at ATI could have said:

"But Kyle, you can do similar things with these other popular PC titles, and get similar results. So what?

Instead, the appearance really was bad.

Old 06-07-03, 02:37 PM   #2
funnyperson1
Senior Member
Join Date: Jun 2001
Location: Northern VA

I wouldn't have a problem with what Nvidia did if there were no IQ drop. Unfortunately, what ATI did only gave a 2.5% increase, while what Nvidia did gave a 25% increase.

I don't consider what ATI did with 3DMark to be cheating. But I do consider what they did with Quake to be cheating, and what Nvidia did with 3DMark to be cheating.

__________________

"Strong with the fold this boy is, fold you must"-NASSoccer
its a shame you cant ban people for being ignorant fanboys....
My Heatware
[Gigabyte MA790X-UDP4][PHII 940BE][Thermalright Ultra90][2x2GB OCZ Reaper HPC][Palit HD4850][Enhance 5150GH][Windows 7,Ubuntu 12.04]

Old 06-07-03, 03:11 PM   #3
timmyqwest
Disabled
Join Date: Nov 2002
Location: illinois

I'm very much with you; the problem I have is that you KNOW their goal isn't to make games faster, it's to make benchmarks faster.

I wasn't around (in computers) when ATI "cheated", but considering 3DMark has been the main benchmark for many years, I can presume this was at a point when Nvidia was king. So what did ATI do? Apparently, they cheated.

The FX was... not up to par; I think we can all agree on that. Thus Nvidia appeared to be de-crowned. So what did they do? They cheated.

And for what, you ask? So I'm more willing to buy their product. I can see your theory here - yes, it may help in games. But they didn't do it like that; from what I understand, the only thing they improved was a benchmark. And from what I also understand, it came with a loss in image quality. Now, I'm only running a Ti 4600, and I don't have frame rate issues, so if they want to improve anything... give me some better-looking graphics.

That is my two cents...

Old 06-07-03, 05:01 PM   #4
OC Noob
Member
Join Date: Jun 2002
Location: Phoenix, AZ USA

Re: Thoughts / ?s on driver 'cheating'


Quote:
Originally posted by InThrees
After my explanation above, does anyone think that what nVidia did is not bad, or at least harmful to the consumer?


Of course it's bad for the consumer. A lot of people use a series of benchmarks to evaluate video cards. If we as consumers allow this, then video cards might as well be like PSUs, where companies can put any ridiculous number on the side and say it's true.

A 550W PSU that really only puts out an average of 150W, to go along with the video card that says it gets 500 FPS in DIII when it really only gets 25 at normal quality.

Benchmarks aren't perfect, but they are all we have, and if everyone starts doing these kinds of things, then we won't even have that.

Whether it's Quake III or 3DMark, it hurts consumers.

__________________
Hail to the King:
Opteron 165 w/ DFI Ultra-D w/ BBA 1900XT 512 mb GSkill PC4200 1 GB x 2
74 gb WD Raptor x 2 Raid 0 MSI (ATI550) Tuner
w/ Windows XP Media Center Edition & OCZ Powerstream 520
DD TDX H2O block w/ Maze4 GPA block DD D4 pump
single 120mm Fan Heater core w/ shroud In Lian-Li fish tank window case

RIP (Rest In Pieces):
P4 3.0 ghz @ 3.75 ghz Aerocool HT-101 IC7-G
Radeon 9800 Pro 430 W Antec True Power

Old 06-07-03, 05:34 PM   #5
ninthebin
Member
Join Date: Mar 2002
Location: Liverpool, UK

Games

Great, bring it on - I want my games to run as fast as possible. You may argue that this doesn't show you how the card actually performs; well, what do you know... there is that very card doing it, thanks to its optimizations.

Let me stop you there, though - I want them to look good too...

I already have an option within my driver letting me pick IQ/performance on a handy little slider - what use is this if the driver is making these decisions anyway?

Benchmarks

I think this totally undermines benchmarks and makes their value to the community very shaky. Sure, many may already say that benchmarks don't really provide accurate results and that they would rather see a plethora of in-game benchmarks instead, but they are still there, and for their part they are still benchmarking systems fairly.

Because they are a very controlled environment, we have seen how easy it is to exploit them for certain companies' own ends.

Conclusion

I have been trying to draw the line on where I think optimisations and cheating truly separate.

My own personal view is that if you are using features of the card that the game is not using, and that boosts performance... you are optimising.

If you are just sacrificing image quality without the user's consent to get that extra competitive edge - for me, that's a cheat.

ahem,
Dan

Old 06-07-03, 05:39 PM   #6
funnyperson1
Senior Member
Join Date: Jun 2001
Location: Northern VA

Quote:
Originally posted by timmyqwest
I wasn't around (in computers) when ATI "cheated", but considering 3DMark has been the main benchmark for many years, I can presume this was at a point when Nvidia was king. So what did ATI do? Apparently, they cheated.
When ATI cheated, the Radeon 8500 was months late and still got beaten in Quake III and other benchies by the GeForce3. It was much like the introduction of the FX 5800, actually.

__________________

"Strong with the fold this boy is, fold you must"-NASSoccer
its a shame you cant ban people for being ignorant fanboys....
My Heatware
[Gigabyte MA790X-UDP4][PHII 940BE][Thermalright Ultra90][2x2GB OCZ Reaper HPC][Palit HD4850][Enhance 5150GH][Windows 7,Ubuntu 12.04]

Old 06-07-03, 05:49 PM   #7
timmyqwest
Disabled
Join Date: Nov 2002
Location: illinois

Quote:
Originally posted by funnyperson1
When ATI cheated, the Radeon 8500 was months late and still got beaten in Quake III and other benchies by the GeForce3. It was much like the introduction of the FX 5800, actually.
Like I said, I didn't understand the history of it all, but hearing that doesn't surprise me.

Old 06-07-03, 05:57 PM   #8
Ununquadium114
Disabled
Join Date: Apr 2003

the whole cheating thing is getting real old

Old 06-07-03, 08:06 PM   #9
Ugmore Baggage
Member
Join Date: Feb 2002

If only benchmarks would measure image quality instead of the almost meaningless (but oh so easy to measure) framerate. Anybody could make a video card that put out a thousand frames of black for the whole 3DMark demo.

Old 06-08-03, 04:52 AM Thread Starter   #10
InThrees
Member
Join Date: Feb 2003
Location: Southeast US

Until someone comes up with a mathematical process for evaluating things like hue and blur in screenshots, all IQ evaluations will be by eye.

Problem with that is, some reviewers have dollars obscuring their vision.
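
For what it's worth, even a crude mathematical comparison would be a start. Below is a sketch of about the simplest version I can think of, assuming you have a reference screenshot and a driver-under-test screenshot of the same frame; it uses Python with the Pillow and NumPy libraries, and the file names are only placeholders. It measures raw per-pixel error and nothing as subtle as hue shifts or blur, so treat it as a starting point, not a real IQ metric:

Code:
# Crude image-quality comparison: per-pixel error between two screenshots
# of the same frame. File names below are placeholders.
import math

import numpy as np
from PIL import Image

def mse(reference_path, test_path):
    """Mean squared per-pixel error between two same-sized screenshots."""
    ref = np.asarray(Image.open(reference_path).convert("RGB"), dtype=np.float64)
    test = np.asarray(Image.open(test_path).convert("RGB"), dtype=np.float64)
    if ref.shape != test.shape:
        raise ValueError("screenshots must be the same resolution")
    return float(np.mean((ref - test) ** 2))

def psnr(reference_path, test_path):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    err = mse(reference_path, test_path)
    if err == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10((255.0 ** 2) / err)

print(psnr("reference_frame100.png", "driver_frame100.png"))

It wouldn't catch every trick, but at least reviewers could put a number next to "the picture changed" instead of arguing over screenshots by eye.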

I have to agree with ninthebin - optimizations that result in more efficient execution of code, etc, are good things.

"Optimizations" that result in the driver ignoring IQ selections, or taking the easy road solely to boost framerates are not good things.

Old 06-08-03, 06:44 AM   #11
james.miller
Member
Join Date: Jun 2002
Location: Dunstable, uk

Quote:
Originally posted by Ununquadium114
the whole cheating thing is getting real old
That sounds just like something somebody who wanted to brush it under the carpet would say.

For me, it's a very real issue. ATI did nothing wrong with Quake 3 - if they hadn't tried to hide it, and had told people upfront, it would have been accepted, and all would be well.

Nvidia, however, have cheated and lied their way through this, and all people want to do is quietly forget about it.

Are Nvidia nice people?
Like hell they are.

Do they put the interests of their customers before themselves?
NO. Selling an FX 5200 for more than a GF4 Ti 4200 - is that putting the customers first, or is it pulling the wool over their eyes?

Do they try to get the most out of every game they can WITHOUT reducing IQ?
That's a big no.

This shouldn't be dropped. If it is, it will keep happening.

__________________
The HTPC

E2160 @ 2.7ghz | Thermaltake The Orb cooler | Asus p5w-dh | 4Gb DDR-II pc6400| 1.00Tb of WD AAKS storage | Asus Xonar D2| 360 HD-DVD drive
Pioneer bd204 bluray sata drive | Corsair HX620w PSU | BFG GeForce 8800GTX | Silverstone SST-LC17b
| 22" dell e228wfp | 40" 1080p SONY 40w2000

ONKYO TX-SR805 | bi-amped mission m71i fronts | eltax center & bipolar rears
2x 12" custom built subwoofers powered by a denon pma-100m | PS3 with linux and a 250gb 3.5" internal drive | Nintendo wii



Old 06-08-03, 10:26 AM   #12
PreservedSwine
Member
Join Date: Jun 2002
Location: Ft. Myers, Fl.

Re: Thoughts / ?s on driver 'cheating'


Quote:
Originally posted by InThrees
*snip - full opening post quoted above*
A few things you should know.

1) Application detection is widely used. Driver tweaks are a normal part of the industry. John Carmack has commented on this as well, along with several other big-name game programmers. They state that what ATI did is exactly the type of optimization used in a game, and that Nvidia's actions, if used in a game, would be deplorable.

The Quake application detection is still used, and it had been there in the (6) driver revisions before its "discovery." A lot of people only know about half the story, so I'll tell it again, briefly.

The detection caused no texture blurring on the R100, and when the R200 was released, it caused a total of (5) textures to become blurred, on the R200 ONLY. ATI didn't notice this, but Nvidia did, and leaked (behind closed doors) the tools and info on how to expose the "cheat" to anyone who would listen, to ruin the release of the R200.

Several websites saw it for what it was, a bug and not a cheat, and would not run the story as Nvidia wanted, but some took the bait. The story was run, and first impressions seem to last. In the following few days, much damage was done to ATI. As for the facts that followed, no one seemed to care; the damage had already been done, and no one was really interested in what had actually happened.

The subsequent driver release from ATI still contained the application detection (and still does to this day, though now it's been expanded to run on any game that uses that game engine), fixed the whopping (5) blurry textures, and also DIDN'T SLOW the fps.

Now, if it was a cheat, don't you think they would have sacrificed quality for speed, and not simply lowered the IQ for no reason? It's pretty obvious to anyone who knows what happened that it was a simple bug... a bug that didn't show up on the R100, but because of the different architecture on the R200, went unnoticed.

Quote:
If nVidia or ATI had done a similar 'tweak' for a different engine - say, the Unreal II engine, or the Quake III engine - that resulted in a 25% boost to performance with no IQ drop, in that application ONLY, would you find that fishy? Or good?
Of course, it's not only good, it's GREAT!!