Before you get too far into this, know first that I consider the methods nVidia employed strictly for 3dmark03.exe (excluding all non-visible data) to be closer to 'cheating' than anything else, but the whole thing raised some questions in my mind.
Note, this thread is NOT a bashing thread in either direction. If you're spoiling for more fights or a soapbox, go away.
Most of us remember 'quack.exe'. We remember how Quake 3 was a very popular enthusiast benchmark, and we remember the distaste we all experienced when the whole quack/quake thing was revealed, and ATI was 'busted'.
Recently, ATI admitted to doing something similar for 3dmark03, specifically, shuffling instructions around to take advantage of their architecture to make shader operations more efficient.
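To make the 'shuffling instructions' idea concrete, here's a rough sketch in C (my own hypothetical example, not anything ATI actually ships): reordering independent operations changes when the work happens, not what the final value is.

#include <stdio.h>

/* Hypothetical illustration only -- not ATI's actual shader code or driver
 * behavior.  Both functions compute the same lighting-style value; the
 * second simply reorders two independent multiplications, the way a
 * driver's shader compiler might reschedule instructions to better fit
 * its hardware.  The output is bit-for-bit identical either way. */

float shade_original(float diffuse, float specular, float ambient, float intensity)
{
    float d = diffuse  * intensity;   /* multiply diffuse first   */
    float s = specular * intensity;   /* then specular            */
    return d + s + ambient;
}

float shade_reordered(float diffuse, float specular, float ambient, float intensity)
{
    float s = specular * intensity;   /* independent ops swapped  */
    float d = diffuse  * intensity;
    return d + s + ambient;           /* summed in the same order */
}

int main(void)
{
    printf("%f\n", shade_original(0.6f, 0.3f, 0.1f, 0.9f));
    printf("%f\n", shade_reordered(0.6f, 0.3f, 0.1f, 0.9f));
    return 0;
}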
Now I'm wondering if this is a bad thing for games.
The ATI dude said "These are exactly the sort of optimizations that work in games to improve frame rates without reducing image quality and as such, are a realistic approach to a benchmark intended to measure in-game performance."
It seems to me that some of the 'cheats' are merely reactions to different methods employed by games to get an image on the screen. If, say, MOH:AA used a slightly different algorithm than, say, Jedi Knight 2, wouldn't you want optimizations for each if you played both?
Personally, I'd want 'intelligent' drivers that recognized that each game handled certain tasks a little differently, and compensated for it.
Note, I do not condone nVidia's or ATI's 'tweaks' of their drivers to get inflated results from 3dmark03, and here's why:
1. Optimizing a driver for a bona fide application people will buy is a good thing, because people can go home and duplicate the results. By switching drivers, they can get better performance from that application.
2. Optimizing a driver for a benchmark gives people erroneous expectations. People assume they can switch to the new driver and get similar gains in their applications, not just in the benchmark. That's because a benchmark is supposed to be an indication of all-around performance (even though frequently it isn't), not an application unto itself. Not too many people eagerly await quitting time to go home and spend a marathon 'benchmarking session.' (There are some, but they're not the norm.)
So here are my questions:
If nVidia or ATI had done a similar 'tweak' for a different engine, say the Unreal II engine or the Quake III engine, that resulted in a 25% boost to performance with no IQ drop in that application ONLY, would you find that fishy? Or good?
After my explanation above, does anyone think that what nVidia did is not bad, or at least not harmful to the consumer?
Personally, I have to confess that I went along with the bandwagon back in the quack days, and lambasted ATI. Now I think they were just onto something. I think they should have done the same thing with more popular titles to solidify their position, then someone at ATI could have said:
"But Kyle, you can do similar things with these other popular PC titles, and get similar results. So what?"
Instead, the appearance really was bad.