
what are your thoughts on ATI's new AI feature?

The whole thing just smells funny to me. A long time ago, many people were at NVIDIA's throat for their "driver optimizations," which selectively turned features on and off in the presence of various applications and tests. Throughout that whole time, ATI maintained that they did not use any sort of cheat/optimization/technology and went out of their way to paint NVIDIA as the "evil company."

Now, we have [what I feel] is a comparable equivalent on ATI's part. I am in no way trying to defend NVIDIA for their past actions, but I find it very smug to blast a company for doing something and then do the exact same thing, only with some modifications. Honestly, it seems like the main difference is that ATI gives the user control over this function. If I were part of Futuremark or any other 3D benchmarking company, I would not endorse such a product given the skepticism NVIDIA received for the same tactic. Honestly, this is the second time I can recall ATI resorting to such methods (the other being quack.exe), but it appears this one is a bit less surreptitious. It will be interesting to see how the benchmarking and computing communities take to this "feature," especially given NVIDIA's similar turmoil in the past.

deception``
 
increased framerates with no noticeable difference (to me) = a good thing in my book

since it's (now?) an option, it's good for the people that need the extra 5% and for the people with killer rigs who can destroy Doom 3 with full quality.
 
well, it seems that ATI is once again hearing its consumers loud and clear, and giving us the option to turn "off" all of the game specific optimizations, bug fixes and generic optimizations.
the only problem I see is that once you turn it off, you can run into bugs or problems with a game....
as many of the game specific optimizations are bug related.
still, unlike with nVidia, we are now able to remove all optimizations if we want.
to me, this is a good thing...even if it can hurt us with poorer IQ or performance.
choice is always best....and now we will have it.

DH said:
For those of you interested in specifics, Catalyst AI uses ATI's Texture Analyzer technology (R9600 Series and R4xx series) to optimize performance in any Game/3D application. ATI believe that whilst doing this they maintain the correct image quality and in some cases can even improve IQ. Cat AI does this by analyzing individual textures as they are loaded in order to choose the best and fastest way for them to be displayed. Settings available to the user are Off, Standard and Advanced. By default the drivers are set to Standard. In most cases Standard should be sufficient as it uses less CPU overhead than Advanced, and it's up to the end user to decide which option works best on their system in each particular game or application. There should be no IQ difference between the two settings; however, in games where there are frequent texture loads, or when using a slower system, the extra computations may cancel out the performance increases gained by using the Advanced algorithm.

As well as the texture optimization algorithm there is a second aspect to Catalyst AI, and that is the application specific optimizations and tweaks. Examples of these application specific items are forcing Anti Aliasing off in the driver for Splinter Cell or Prince Of Persia because AA doesn't work in those titles. ATI have informed us that they will never specifically detect a synthetic benchmark with Catalyst AI optimizations; however, some benchmarks may see improved scores due to using game engines that have improvements within the driver. ATI have also guaranteed that they will only optimize if they can do so without any reduction in Image Quality.

Just to be clear, disabling Catalyst AI disables application specific optimizations, bug fixes and generic optimizations.

http://www.driverheaven.net/articles/atiop/
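
To make the Standard vs. Advanced trade-off DH describe a bit more concrete, here is a toy back-of-envelope sketch in C. Every number in it is made up by me for illustration (ATI haven't published any); the point is only that a fixed analysis cost per texture load can eat the rendering gain on a slower CPU or in a game that streams a lot of textures.

Code:
/* Toy model of the Standard vs. Advanced trade-off described above.
 * All numbers are hypothetical; the point is only that per-texture
 * analysis overhead can cancel out the rendering time it saves. */
#include <stdio.h>

int main(void)
{
    const double base_frame_ms    = 25.0;  /* hypothetical frame time with AI off    */
    const double advanced_gain_ms = 2.0;   /* hypothetical GPU time saved per frame  */
    const double analysis_cost_ms = 0.05;  /* hypothetical CPU cost per texture load */

    for (int loads = 0; loads <= 80; loads += 20) {
        double frame_ms = base_frame_ms - advanced_gain_ms
                        + analysis_cost_ms * loads;
        printf("%2d texture loads/frame -> %.2f ms (%.1f fps)\n",
               loads, frame_ms, 1000.0 / frame_ms);
    }
    return 0;
}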



deception`` said:
The whole thing just smells funny to me. A long time ago, many people were at NVIDIA's throat for their "driver optimizations," which selectively turned features on and off in the presence of various applications and tests. Throughout that whole time, ATI maintained that they did not use any sort of cheat/optimization/technology and went out of their way to paint NVIDIA as the "evil company."

Now, we have [what I feel] is a comparable equivalent on ATI's part. I am in no way trying to defend NVIDIA for their past actions, but I find it very smug to blast a company for doing something and then do the exact same thing, only with some modifications. Honestly, it seems like the main difference is that ATI gives the user control over this function. If I were part of Futuremark or any other 3D benchmarking company, I would not endorse such a product given the skepticism NVIDIA received for the same tactic. Honestly, this is the second time I can recall ATI resorting to such methods (the other being quack.exe), but it appears this one is a bit less surreptitious. It will be interesting to see how the benchmarking and computing communities take to this "feature," especially given NVIDIA's similar turmoil in the past.

deception``


deception,

I think you have this all wrong.

first, nVidia is still using application detection to enhance benchmark results, while at times lowering the IQ of the game.
in other words, when a DX game (like Far Cry) gets a boost in performance, we rarely ever see other games getting the same or a similar boost.
this wouldn't be bad if 9 out of 10 times we found that IQ was the same or better....yet this doesn't happen often, and mostly it's worse (IQ).

now let's take a look at what ATI has done....

to this date, ATI has used an algorithm that improves trilinear filtering performance by default.
there never was any way to turn this off....now there is.
(what may I ask smells fishy to you at this point?)


then there is the fact that some games will need application detection (for both ATI and nVidia) to remove bugs or problems.
I don't see anything wrong with ATI or nVidia using app detection to help its consumers with better IQ or compatibility.
yet only with ATI can you now turn it off if you want to.

beyond3d said:
The applications ATI are currently detecting are: Doom3, UT2003, UT2004, Half Life 2 engine, Splinter Cell, Race Driver, Prince of Persia and Crazy Taxi 3 - some of these titles were actually already detected by ATI's drivers since they have a bug when operating with AntiAliasing, and ATI have disabled AA on these titles.

http://www.beyond3d.com/misc/catai/index.php?p=3


now let's take a look at Doom 3 for a moment.
it seems that with the new Cat 4.10 beta drivers, we now have the "Humus tweak" not only added to the driver, but enabled by default.
this tweak now does many of the shaders using math, instead of the default "texture look-up" that seems to only benefit nVidia's older cards.

I like what is written at THG...
THG said:
What looks like a great and pragmatic idea at first sight gets right to the heart of the shader-replacement problem. ATi speaks of a mathematically equivalent result, not an exactly identical one. The trouble is that only the company in question knows what their definition of "equivalent" is and what the result is in a real-world scenario. Since application detection uses only specific optimizations, i.e. only for certain games, the result can sometimes be more and sometimes less "correct." This makes it impossible to draw any conclusion that would apply to all games.

what Lars Weinand would like you to believe is that the IQ can be worse with this type of tweak.
the real problem is that Lars Weinand doesn't know what he is talking about.
a texture look-up is always an approximation of what the light sources should do or look like.
when doing it in math, you now have what it should always be....
in other words, a correct value of what the light source should look like.
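just to illustrate the point (this is my own toy C example, not ATI's or id's shader code): a look-up texture bakes the falloff curve at a fixed resolution, so any value between the table entries is only approximated, while doing the math gives the exact value for every input.

Code:
/* toy comparison of a quantized look-up table vs. direct math for a
 * specular falloff -- my own illustration of the general idea behind the
 * "Humus tweak", not the actual Doom 3 or Catalyst shader code */
#include <math.h>
#include <stdio.h>

#define TABLE_SIZE 256        /* like a 256-entry look-up texture  */
#define SPEC_POWER 16.0f      /* hypothetical specular exponent    */

static float table[TABLE_SIZE];

/* bake pow(x, n) at TABLE_SIZE points, as a look-up texture would */
static void build_table(void)
{
    for (int i = 0; i < TABLE_SIZE; i++)
        table[i] = powf((float)i / (TABLE_SIZE - 1), SPEC_POWER);
}

/* "texture look-up" path: fetch the nearest baked entry (quantized) */
static float spec_lookup(float n_dot_h)
{
    return table[(int)(n_dot_h * (TABLE_SIZE - 1) + 0.5f)];
}

/* "math" path: evaluate the falloff exactly for this input */
static float spec_math(float n_dot_h)
{
    return powf(n_dot_h, SPEC_POWER);
}

int main(void)
{
    build_table();
    float n_dot_h = 0.8731f;  /* a value that falls between table entries */
    printf("look-up: %.6f   math: %.6f\n",
           spec_lookup(n_dot_h), spec_math(n_dot_h));
    return 0;
}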

when a tweak like this is used, it enhances the IQ as well as the performance...what is the problem with that?
better yet, it can be turned off, unlike what nVidia is doing with their drivers.
again, I see nothing wrong with ATI's AI so far.

DH said:
Of all our tests Doom3 shows the largest change in performance for ATI's drivers, this is mainly due to the additional game specific optimizations within the driver. This includes tweaks such as replacing the lighting shader, which is based on a look-up table, with a mathematically precise lighting shader that not only significantly improves performance but also renders a more mathematically correct scene.

as for 3dmark tests....
beyond3d said:
ATI are also keen to stress that only applications where the end user will get some benefit will be optimised - benchmark-only applications should not be targeted under Catalyst AI.
let's wait and see on this one.


some more quotes, since many people will not read all the articles:

DH said:
The first thing that strikes me about the results we gathered is the small performance increases which are achieved by ATI with their optimizations. On one hand it seems so small that it's not really worth talking about...so small in fact that it could just be down to natural margin of error in some cases. On the other hand it could be looked at in the way that ATI's optimized setting is in fact much closer to Nvidia's non-optimized results in terms of comparison articles than everyone thought up to this point. With a margin so small it almost made us consider leaving our review of Nvidia's no-optimizations vs ATI's with optimizations and not updating it. Overall though it's great that ATI now offer the user control over optimizations, even if it's not worth their while disabling them. Looking at the IQ shots this certainly seems to be the case; there is no difference in IQ seen by the end user between off and standard, yet you get some excellent tweaks – like the improved calculations (and therefore performance increases) in Doom3.

DH said:
It does seem to us that Nvidia's optimizations are aimed more at gaining raw performance across the board whereas ATI's are aimed at improving all aspects of the gaming experience. (We'd welcome detailed information on app-specific optimizations from Nvidia though, if it shows they are improving things in the same way.) It's also worth mentioning that ATI give you the option to disable all optimizations, whereas when we disabled the Nvidia optimizations it was texture/detail optimizations only. Any application-specific optimizations cannot be disabled in the Nvidia drivers and are therefore causing the Nvidia results to be higher. It would be nice to see Nvidia take steps to allow all optimizations to be turned off in future drivers. In a way, as things stand we still can't get an apples-to-apples comparison between competing cards; however, we are much closer than before...and that will remain the case as long as Nvidia don't allow the option to disable all optimizations.

just my thoughts, as well as a few others.

mica
 
To me it's like a cheat; to others, it's a so-called optimization that helps people with non-uber rigs. In the Bit-tech review of the X700 XT, when the AI was turned to High, it got a 1k higher score in Aquamark.
 
So far AI is a very good feature, since unlike with Nvidia we are told what is going on in the optimisations and always have the option to turn it off or tweak it. But it is still early in the feature's life, so I'll be watching.
 