Filtering Or Cheat?

I had some views and questions on ATI’s new adaptive filtering method (now called "trilinear") and initially started this thread: https://www.overclockers.com/forums/showthread….light=brilinear. It later became clear to me this was a good thing, and I contentedly withdrew from wasting time in debates over it.

In the last week, several hardware sites have decided to ‘educate’ the masses on the issue. Some do a good job, while others are so blinkered that it just isn’t funny. So I decided to post my views on the matter after looking through the now-considerable wealth of reviews, editorials and tech papers on filtering.

Is this a cheat, or a good optimisation?

A Result, Not A Method

First, contrary to what has been posted at some sites, there is no defined standard for how trilinear filtering must be performed.

What is a reasonable definition of “trilinear filtering?”

A trilinearly filtered 3D scene is one in which the scene’s mip map boundaries are not visible, either statically or in motion; it should also remove the wave/moving-line effect of bilinear filtering. The way this additional filtering is layered on top of bilinear filtering is, as far as I can see, fairly standard.

What is important to note is that trilinear filtering is an effect, an end state, a result, not a method. Furthermore, there are few definitions to be found that tie you to a given method (e.g. full-scene or not).

To make this easy, just remember this: Visible mip map boundaries or transitions in gameplay = No trilinear filtering.

The “what” is one thing. The “how” is quite another. How do you “honestly” achieve the effect?

A “classic” display driver would filter the entire scene, not just the relevant mip map boundaries. That would incur a performance hit.
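For illustration, here is a minimal sketch of that classic full-scene approach, assuming a simple per-pixel LOD blend; bilinear_sample() and its flat grey "texture" are stand-ins of my own, not any driver’s actual code:

```python
import math

def bilinear_sample(mip_level, u, v):
    """Stand-in for a real bilinear fetch (4 texel reads): returns a
    flat grey that darkens with each mip level, enough to show the blend."""
    shade = max(0.0, 1.0 - 0.2 * mip_level)
    return (shade, shade, shade)

def trilinear_sample(lod, u, v):
    mip_lo = math.floor(lod)               # nearer (sharper) mip
    frac = lod - mip_lo                    # how far we are toward the next mip
    a = bilinear_sample(mip_lo, u, v)      # 4 texel reads
    b = bilinear_sample(mip_lo + 1, u, v)  # 4 more reads
    # Full linear blend for every textured pixel in the scene:
    # roughly twice the texel traffic of plain bilinear.
    return tuple(x + frac * (y - x) for x, y in zip(a, b))

print(trilinear_sample(1.25, 0.5, 0.5))  # 25% of the way from mip 1 to mip 2
```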

Could it be that what is being called “honest” processing is actually just an old, dumb way of doing something, a lot of work for no purpose? Bear in mind that trilinear filtering already costs roughly twice the texture reads of bilinear filtering, so there’s a desire/need to optimize the extra work to get the most bang for the GPU buck.

Given that, what then becomes the “honest”, as opposed to the “dishonest”, way of handling this?

The nVidia Way…

Many people have placed nVidia’s much-maligned adaptive method in the same boat as ATI’s, but I think the two are quite distinct, and I will deal with those differences now.

nVidia’s method, called "brilinear" filtering by many, offers a good performance boost and decent enough image quality for most gamers/reviewers, but its main flaw is that it often doesn’t produce the trilinear effect. That is, brilinear filtering often doesn’t properly blend or filter the mip map boundaries.

That means there are several games/situations where there is perceptibly decreased image quality. This decreased image quality varies from barely noticeable to downright distracting in actual gameplay.

Visible mip map boundaries or transitions in gameplay = No trilinear filtering.

That is why I and others have no problem refuting nVidia’s initial statements that it was true trilinear. Remember, trilinear is an effect/result/endpoint. No effect, no feature.
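To make the contrast with true trilinear concrete, here is a minimal sketch of the brilinear idea as it is commonly described: blend only inside a narrow band around each mip transition, and fall back to plain bilinear everywhere else. BAND is a made-up tuning constant of mine, not a published nVidia value, and bilinear_sample() is the same stand-in as in the earlier sketch:

```python
import math

BAND = 0.15  # half-width of the blend region (assumed tuning value)

def bilinear_sample(mip_level, u, v):  # same stand-in as above
    shade = max(0.0, 1.0 - 0.2 * mip_level)
    return (shade, shade, shade)

def brilinear_sample(lod, u, v):
    mip_lo = math.floor(lod)
    frac = lod - mip_lo
    if frac < 0.5 - BAND:                         # well inside mip_lo:
        return bilinear_sample(mip_lo, u, v)      #   plain bilinear, 4 reads
    if frac > 0.5 + BAND:                         # well inside mip_hi:
        return bilinear_sample(mip_lo + 1, u, v)  #   plain bilinear again
    # Blend only across the narrow strip around the transition.
    # Cheaper on average, but if the band is too narrow the mip
    # boundary can still show - which is the whole complaint.
    t = (frac - (0.5 - BAND)) / (2 * BAND)
    a = bilinear_sample(mip_lo, u, v)
    b = bilinear_sample(mip_lo + 1, u, v)
    return tuple(x + t * (y - x) for x, y in zip(a, b))
```

Note that nothing in this sketch adapts to the content: the band is fixed, which is why the shortfall shows up clearly in some games and hardly at all in others.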

Is brilinear a bad thing? No, at least not intrinsically. It could be a fantastic speed optimisation for gamers who:

  1. just want maximum framerate no matter what.
  2. can’t see the differences between brilinear and trilinear, or consider brilinear’s quality good enough for them.
  3. have a game, computer, or both that cannot run at their preferred resolution or settings with ‘full’ trilinear enabled at their preferred framerate.

The last is currently very important for the low end, and will matter more in the future as today’s FX cards move from high-end to midrange to low-end. Brilinear can therefore prolong the useful life of such cards at a quality above plain bilinear filtering.

However, there are a few major problems with this approach:

  1. nVidia’s driver gave you “bri,” not “tri.”

  2. nVidia’s drivers happily told you that you got tri when you were actually getting bri.

  3. nVidia and some reviewers initially denied there was a difference in image quality between bri and tri.

  4. Eventually, nVidia admitted that the quality was lower and promised us via Kyle at [H] that true trilinear would be enabled soon.

The truth is that, up to now, all GeForce FX users are stuck with this reduced quality. Even when the cards can handle it, trilinear is not an option!

I am certain there are many games that a 5800-to-5950U-level card could happily handle with trilinear on, but the option is not there. You must then ask, “Is this there to benefit the gamer, or nVidia’s benchmarketing?”

The ATI Way…

The approach ATI takes towards "trilinear" filtering is pretty simple.

Common sense dictates that you’ll only see the boundary between mip maps A and B when there are fairly big differences or gaps between the two. What ATI claims to do is calculate those differences and blend/filter the boundaries differently depending on how different the maps are from each other.

So if A and B are grossly different and the transition would be visible to the gamer, maximum filtering gets applied. If, however, maps D and E in the same scene are quite similar, less filtering is needed to hide the edges, so less filtering is done.

So far, ATI’s method has been shown to result in properly blended mip map boundaries in gameplay. One more time: No visible mip map boundaries or transitions in gameplay = Trilinear filtering.
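As a rough illustration of how such an adaptive scheme could work (ATI’s actual algorithm is undisclosed, so everything here is my guess), you can scale the blend band by a measured difference between adjacent mip levels; mip_difference() is an assumed metric of my own, and bilinear_sample() is the same stand-in as before:

```python
import math

def bilinear_sample(mip_level, u, v):  # same stand-in as above
    shade = max(0.0, 1.0 - 0.2 * mip_level)
    return (shade, shade, shade)

def mip_difference(mip_lo):
    """Assumed metric comparing mip level mip_lo with mip_lo + 1:
    0.0 when the upsampled lower level matches exactly, 1.0 when
    every texel differs. Placeholder value for a typical texture."""
    return 0.2

def adaptive_sample(lod, u, v):
    mip_lo = math.floor(lod)
    frac = lod - mip_lo
    # The blend band grows with the measured difference: 0.5 is the
    # full classic trilinear ramp, smaller is a narrower (cheaper)
    # band used only where the transition could not show anyway.
    band = 0.5 * mip_difference(mip_lo)
    lo_edge, hi_edge = 0.5 - band, 0.5 + band
    if frac <= lo_edge:
        return bilinear_sample(mip_lo, u, v)
    if frac >= hi_edge:
        return bilinear_sample(mip_lo + 1, u, v)
    t = (frac - lo_edge) / (hi_edge - lo_edge)
    a = bilinear_sample(mip_lo, u, v)
    b = bilinear_sample(mip_lo + 1, u, v)
    return tuple(x + t * (y - x) for x, y in zip(a, b))
```

The point is that the endpoint is unchanged: where the mips differ enough for the transition to show, a sampler like this does exactly what the classic one does.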

What about coloured mip maps? Some titles, like COD, Quake3 and UT2003, show X800 performance losses with coloured mip map boundaries.

For those of you who don’t know about this, the colouring of mip maps looks like this: http://hardocp.com/images/articles/…MOmeutS_3_6.jpg http://www.tomshardware.com/graphic…mages/pic12.jpg.

A properly filtered (trilinear) border will merge with the next, making it very difficult to see where one colour or mip map ends and the next begins.

Coloured mip maps helped prove the brilinear nature of nVidia’s filtering last year.

Many surmised that ATI’s performance loss was due to the driver detecting that the gamer/reviewer had turned on coloured mip maps to check filtering quality, then switching off the optimisation and rendering the older ‘full scene’ trilinear, to cheat the public into thinking that this is how it worked all the time.

What is really happening is that sharply and distinctly coloured mip maps present to the ATI driver as a situation where the boundaries have the maximum possible differences. As a result, maximum filtering is needed all the time, not just sometimes. A more challenging scene means more work, so, logically, performance must drop.
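Plugging coloured mip maps into the adaptive sketch above shows why no ‘detection’ is needed: the assumed difference metric simply saturates, and the adaptive path degenerates into the classic full ramp, with the matching cost:

```python
def mip_difference(mip_lo):
    return 1.0  # coloured mips: every texel of level N differs from N + 1

band = 0.5 * mip_difference(0)
print(band)  # 0.5 -> the full trilinear blend everywhere, hence the slowdown
```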

I see no big conspiracy here, though others see this differently.

What I do see is ATI being less than forthcoming about what they did:

  1. They didn’t properly explain to reviewers or even developers the differences between the X800’s (and 9600’s) filtering method and older methods. In fact, you would have to dig very deep to find the few statements by ATI employees indicating that the filtering method would be changing with this new generation.
  2. ATI says they were less than forthcoming because they want to patent the technology. That’s a plausible explanation, but we will have to wait and see.
  3. Some of their subsequent statements on the issue have not been sufficiently clear.

ATI does not come out of this smelling like a rose. Just as with the Quack issue, ATI has, in my opinion, failed to handle the situation well, although they may have been in one of those damned-if-you-do, damned-if-you-don’t situations. Communication is key.

(Ed. note: I’ll have some comments on this issue tomorrow. –Ed.)

Corporate communications notwithstanding, I think ATI’s new adaptive filtering method is a good thing, a positive, not a cheat. Unless and until someone shows instances where the algorithm visibly fails, I cannot wait to use such tech this year (if the price is right).

If it does fail in certain software/situations, though, then, as with nVidia’s brilinear filtering, I will join others in demanding that ATI provide a ‘turn it off and use full tri’ option.

This would allow anyone to choose the older ‘full scene’ method in instances where it looks better and/or the performance loss isn’t an issue. I believe that’s a reasonable approach for all cards and all gamers. As for the rest, they either just have too much time on their hands or too many axes to grind.

I’m going now to put on my asbestos suit and await replies.

I expect people to differ, argue and even flame. In all areas, especially technical ones, I welcome any criticism or, better yet, education.

This is not necessarily an official view of Overclockers.com.

Cowboy X
