
ATi X800 Pro vs. nVidia GeForce 6800 GT

I'm just trying to figure out where he is going with this..

There is almost no visual difference between the two cards; can we agree on that? Most if not all websites come to this conclusion..

Now on to the optimized trilinear deal you are stuck on.. I ran 3DMark for you with certified drivers and got within 100 points of my previous score with it enabled.. the card runs great, and I couldn't care less if the other card beats it by 5-10 fps; who the **** cares.. they are both very good this generation and there is really nothing to complain about... if you're going to call people fanboys, why don't you take a step back and think about why you are fighting this so hard..
 
Also, like I and others have said..

The reviews that put ATI and nVidia even in graphical quality are doing so with nVidia using these optimized trilinear settings, right? And ATI is using Quality, the best it can do, right? But at a performance cost?
So what are the optimized settings doing, then? Making the graphics sub-par to increase fps? These stupid reviewers that keep leaving it enabled come to the conclusion that the two have equal graphical quality and show us side-by-side images to prove their point.. so why are the optimizations bad when you cannot tell the difference??

Or am I missing your argument altogether?
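For perspective on how small those numbers actually are, here's a quick back-of-the-envelope check. The ~12,000-point 3DMark score and the 60 fps figure are made up purely for illustration, not anyone's real results:

Code:
# Hypothetical numbers, only to put "100 points" and "5 fps" in percentage terms.
def percent_diff(old, new):
    return (new - old) / old * 100.0

score_off, score_on = 12000, 12100      # assumed 3DMark scores, optimization off/on
print(f"3DMark gap: {percent_diff(score_off, score_on):.2f}%")   # about 0.8%

fps_a, fps_b = 60.0, 65.0               # an assumed "5 fps" lead at 60 fps
print(f"Frame-rate gap: {percent_diff(fps_a, fps_b):.1f}%")      # about 8%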
 
micamica1217 said:
now you understand why I call that site...nVidia biased.
next thing you'll hear from them is that the 6800u has better 2D "desktop" IQ...LOL

What they are really saying is that they just ran both cards at default settings and didn't change anything but vsync.

mica
All sites have a bias really, but only Xbit seems to have a problem with wording. Also, it seems the AnandTech graphs didn't change much: half show ATI winning, half show nVidia leading.


You're dealing with folks who have nVidia cards in their sigs and 6800s on order... I'd give up. If the XT's win in the (obviously nVidia-biased) Xbit Labs article (part 2) I linked above didn't convince them of the X800's ability under max AA and AF (quality visuals on max), I doubt you'll change their minds. I mean seriously, even after it was made clear that the 9800 Pro and XT were far superior cards to the 5900 series in every way, people still bought 5900s for more than 9800 Pros cost... blame it on brand loyalty, I guess.
Ah yes, the old "Look what cards they have in their sig, while I have ATI in mine" post. Ironic I'd say....

I might be wrong, but I believe the ATi cards all default to "Quality", meaning trilinear is on by default. With the nVidia 6800s I believe it's off by default. So when a reviewer writes that they set everything to default, it's not apples to apples... so to speak. I could be wrong though
ATI only turns on trilinear when colored mip-maps are run. So technically, I wouldn't call that the default. Should reviewers force ATI to run trilinear all the time? Oh wait... you can't; there's no option in the control panel...
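(For what it's worth, ATI has never published exactly how the driver decides this. The usual explanation is that it checks whether each mip level looks like a downscaled copy of the level above it, and only falls back to full trilinear when the levels don't match, which is exactly what colored mip-maps trigger. Here's a rough guess at that kind of check, purely illustrative and not ATI's actual code:)

Code:
# Purely a guess at a "do the mip levels look auto-generated?" heuristic.
# A real driver inspects GPU texture memory, not PIL images.
from PIL import Image, ImageChops, ImageStat

def mips_look_generated(mip_upper, mip_lower, tolerance=8.0):
    """True if mip_lower looks like a downscaled copy of mip_upper."""
    expected = mip_upper.convert("RGB").resize(mip_lower.size, Image.BILINEAR)
    diff = ImageChops.difference(expected, mip_lower.convert("RGB"))
    return sum(ImageStat.Stat(diff).mean) / 3 < tolerance

# If every pair of levels passes, the adaptive filter can kick in; colored
# mip-maps fail the check, so the driver shows full trilinear to the tester.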

Also, like I and others have said..

The reviews that put ATI and nVidia even in graphical quality are doing so with nVidia using these optimized trilinear settings, right? And ATI is using Quality, the best it can do, right? But at a performance cost?
So what are the optimized settings doing, then? Making the graphics sub-par to increase fps? These stupid reviewers that keep leaving it enabled come to the conclusion that the two have equal graphical quality and show us side-by-side images to prove their point.. so why are the optimizations bad when you cannot tell the difference??

Or am I missing your argument altogether?
That's what I keep wondering... if there's no IQ difference, who cares? If we rid cards of all optimizations, then we'd have to write our own video card drivers and games, because games optimize for certain cards as well.
 
Dragonprince said:
I might be wrong, but I believe the ATi cards all default to "Quality", meaning trilinear is on by default. With the nVidia 6800s I believe it's off by default. So when a reviewer writes that they set everything to default, it's not apples to apples... so to speak. I could be wrong though :D
I don't know what the driver defaults are.
nVidia has options for either full or angle-dependent AF, and full or adaptive trilinear ("brilinear"). Both are set to full with the High Quality setting, while the latter can also be set separately with a checkbox somewhere.
ATi has angle-dependent AF and adaptive trilinear. Neither can be turned off (barring a registry hack, but I haven't heard of anyone figuring out a way to do it).
Regarding IQ, I highly doubt any of the optimizations would be noticeable in actual gameplay* (i.e. without colored mipmaps, 10x zoom, etc.). Go here, for example. Clicking on the images opens a lossless PNG version. Open those in windows/tabs side by side in the exact same position, then switch between them. There is no difference.

* Especially since ATi has been using it since the 9600s, and no one noticed :). Also, the original incarnation of nVidia's brilinear *was* noticeable, and was discovered by means of "...why does ut2003 look strange?". From what I've heard, it has been vastly improved since then.
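If flipping between tabs feels too subjective, a quick script can diff the two lossless PNGs pixel by pixel and tell you how much actually differs. This is just a rough sketch: the filenames are placeholders, it assumes both shots were captured at the same resolution and camera position, and it needs the Python Imaging Library installed.

Code:
# Rough sketch: quantify the difference between two same-size screenshots.
# "ati_x800.png" / "nv_6800.png" are placeholder names for your own captures.
from PIL import Image, ImageChops

a = Image.open("ati_x800.png").convert("RGB")
b = Image.open("nv_6800.png").convert("RGB")
assert a.size == b.size, "screenshots must be the same resolution"

diff = ImageChops.difference(a, b)      # per-pixel absolute difference
hist = diff.histogram()                 # 256 bins per channel (R, G, B)

total = a.size[0] * a.size[1] * 3
identical = hist[0] + hist[256] + hist[512]   # bin 0 of each channel = no change
print(f"{(total - identical) / total:.2%} of channel values differ at all")

# Save an exaggerated difference image so tiny deviations become visible.
diff.point(lambda v: min(255, v * 16)).save("diff_amplified.png")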
 
ATI only turns on trilinear when colored mip-maps are run. So technically, I wouldn't call that the default. Should reviewers force ATI to run trilinear all the time? Oh wait... you can't; there's no option in the control panel...

I think this is what that review somebody quoted was talking about when it said 'nvidia optimisations were turned on to match the ATI 'quality' setting'. It would hardly be fair to turn the nvidia optimisations off when it is not possible to do so for the ATI version. If the nvidia optimisations 'look worse', then it becomes an image quality discussion, not a performance one.

I could well be wrong, but it seems to me that at this stage, there is no way to do a test where both ATI and nvidia cards are running the exact same filtering/AA, because it is not possible to turn off the ATI optimisation, and the nvidia optimisations are different to the ATI ones. Is this correct?

Every review I read concerning the new PS 3.0 patch for FarCry came out on balance putting ATI and nvidia about even. The situation between ATI and nvidia 'fans' right now seems very similar to the situation between AMD and intel 'fans' - AMD/ATI people always seem to me to want to try and make out that the competing product is horribly worse than the one they like, and go as far as to think anybody who buys the alternative product is stupid! I think the important thing to remember is that overall, the situation is fairly close, and that everybody will prefer one product or another for their own reasons. If you like a different one best, that's great and up to you, but it doesn't seem very productive to me to try and convince people that they are blind, stupid, wasting money, a 'noob', or are being sucked into 'corporate lies', just because they happen to prefer a different product than you do. If it was so clear cut, reviews would be concluding that nvidia has made a huge mistake and totally screwed up their new product line.

Why can't the 'fans' make like the reviewers, and be happy that we have two closely matched product lines? Surely everybody can see that this is a good thing for the market... [when all the stuff finally gets to market :)]
 
Illissius said:
I don't know what the driver defaults are.
nVidia has options for either full or angle-dependent AF, and full or adaptive trilinear ("brilinear"). Both are set to full with the High Quality setting, while the latter can also be set separately with a checkbox somewhere.
ATi has angle-dependent AF and adaptive trilinear. Neither can be turned off (barring a registry hack, but I haven't heard of anyone figuring out a way to do it).
Regarding IQ, I highly doubt any of the optimizations would be noticeable in actual gameplay* (i.e. without colored mipmaps, 10x zoom, etc.). Go here, for example. Clicking on the images opens a lossless PNG version. Open those in windows/tabs side by side in the exact same position, then switch between them. There is no difference.

* Especially since ATi has been using it since the 9600s, and no one noticed :). Also, the original incarnation of nVidia's brilinear *was* noticeable, and was discovered by means of "...why does ut2003 look strange?". From what I've heard, it has been vastly improved since then.

Thanks for the link. That's the best and least biased review and explanation yet. :thup:
 
Illissius said:
I don't know what the driver defaults are.
nVidia has options for either full or angle-dependent AF, and full or adaptive trilinear ("brilinear"). Both are set to full with the High Quality setting, while the latter can also be set separately with a checkbox somewhere.
ATi has angle-dependent AF and adaptive trilinear. Neither can be turned off (barring a registry hack, but I haven't heard of anyone figuring out a way to do it).
Regarding IQ, I highly doubt any of the optimizations would be noticeable in actual gameplay* (i.e. without colored mipmaps, 10x zoom, etc.). Go here, for example. Clicking on the images opens a lossless PNG version. Open those in windows/tabs side by side in the exact same position, then switch between them. There is no difference.

* Especially since ATi has been using it since the 9600s, and no one noticed :). Also, the original incarnation of nVidia's brilinear *was* noticeable, and was discovered by means of "...why does ut2003 look strange?". From what I've heard, it has been vastly improved since then.

thanks for the link....
here's a quote from the next page:

the Tech Report's texture filtering test said:
Here are NVIDIA's trilinear optimizations at work. Mip-map boundary transitions aren't as smooth as they are on the Radeon X800 XT and on the GeForce 6800 Ultra with "brilinear" disabled. Notice the odd mixture of filtering going on in the 61.11 drivers with optimizations disabled (edit by mica: they really meant "optimizations enabled".. typo, lol). The floor and angled surface look like they should, but the wall's mip map boundaries show the dark banding indicative of the "brilinear" method.

anyway, I agree with scary_jeff, when he says:
I think the important thing to remember is that overall, the situation is fairly close, and that everybody will prefer one product or another for their own reasons. If you like a different one best, that's great and up to you, but it doesn't seem very productive to me to try and convince people that they are blind, stupid, wasting money, a 'noob', or are being sucked into 'corporate lies', just because they happen to prefer a different product than you do. If it was so clear cut, reviews would be concluding that nvidia has made a huge mistake and totally screwed up their new product line.

I even think the FX5900 cards are great.

yet I don't agree with the FUD that Pake is trying to spread:
ATI only turns on trilinear when colored mip-maps are run. So technically, I wouldn't call that the default. Should reviewers force ATI to run trilinear all the time? Oh wait... you can't; there's no option in the control panel...

Pake is well aware that ATI doesn't do brilinear filtering at any time.
he is well aware that the cards do bilinear filtering when set at "performance" and FULL trilinear when running at "quality".

a good read on filtering is located here....LOOK.

I'm sorry that Pake might have misled scary_jeff into thinking that ATI doesn't always do full trilinear... we tried to teach him.

mica
 
It's the same crap, only different wording. All I'm saying is that NVidia allows you to turn stuff off, but ATI doesn't, so how can you call it unfair to force NVidia to use no optimizations but say it's OK for ATI?

What 'trylinear' filtering does is determine when and where full trilinear filtering is and isn't needed, and thus use only bilinear filtering on the parts of the scene judged not to need full trilinear
No matter what you think, this is an optimization. You say that NVidia should have their optimizations off; then so should ATI. If NVidia has to run full trilinear across the whole frame all the time, then ATI should too, meaning no "trylinear" or "brilinear". This is where we collide in belief. I feel that if ATI is allowed to use their optimization because it can't be turned off, then so should NVidia, as long as the same IQ is present, which it is, judging from how most reviewers are talking.

Either turn off both NVidia's brilinear and ATI's trylinear or leave them on. Otherwise, who cares, the same IQ is being accomplished.

So pick what you want:
1) IQ comparisons with benchmarks, regardless of optimizations.
2) All optimizations off, brilinear or trylinear, who cares. Neither card should have an optimization running, no matter how little of a difference it makes.
3) Only turn off NVidia's.

I'll choose option 1. What's your choice? 3?
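To pin down what that quoted "trylinear" description amounts to in practice, here's a toy, software-renderer-style sketch of the idea. The blend cutoff and the helper function are invented for illustration; nobody outside ATI (or nVidia, for brilinear) knows the real heuristics.

Code:
def bilinear_sample(mip, u, v):
    """Minimal bilinear fetch from one mip level stored as a 2D list of floats."""
    h, w = len(mip), len(mip[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * mip[y0][x0] + fx * mip[y0][x1]
    bot = (1 - fx) * mip[y1][x0] + fx * mip[y1][x1]
    return (1 - fy) * top + fy * bot

def sample_adaptive_trilinear(mips, u, v, lod, blend_cutoff=0.15):
    """Toy 'adaptive' trilinear: only blend two mip levels when the blend
    weight is big enough to matter; otherwise take one bilinear sample."""
    level = max(0, min(int(lod), len(mips) - 2))
    frac = lod - level
    if frac < blend_cutoff:
        return bilinear_sample(mips[level], u, v)        # close to one mip: bilinear is enough
    if frac > 1.0 - blend_cutoff:
        return bilinear_sample(mips[level + 1], u, v)
    lo = bilinear_sample(mips[level], u, v)
    hi = bilinear_sample(mips[level + 1], u, v)
    return (1.0 - frac) * lo + frac * hi                 # classic full trilinear blend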
 
Pake said:
It's the same crap, only different wording. All I'm saying is that NVidia allows you to turn stuff off, but ATI doesn't, so how can you call it unfair to force NVidia to use no optimizations but say it's OK for ATI?


No matter what you think, this is an optimization. You say that NVidia should have their optimizations off; then so should ATI. If NVidia has to run full trilinear across the whole frame all the time, then ATI should too, meaning no "trylinear" or "brilinear". This is where we collide in belief. I feel that if ATI is allowed to use their optimization because it can't be turned off, then so should NVidia, as long as the same IQ is present, which it is, judging from how most reviewers are talking.

Either turn off both NVidia's brilinear and ATI's trylinear or leave them on. Otherwise, who cares, the same IQ is being accomplished.

So pick what you want:
1) IQ comparisons with benchmarks, regardless of optimizations.
2) All optimizations off, brilinear or trylinear, who cares. Neither card should have an optimization running, no matter how little of a difference it makes.
3) Only turn off NVidia's.

I'll choose option 1. What's your choice? 3?

what you fail to realize is that ATI's trilinear filtering on the X800 cards is NOT an optimization.
I repeat, it is NOT an optimization.
it is a different algorithm than what they used in the 9800 cards.
the NV40 uses a different algorithm than, say, the 4200 cards, and the 4200s use a slightly different algorithm than the GF3 Ti cards.
this is NOT an optimization.
it does NOT show MIP MAP BOUNDARIES like when the 6800 cards use "brilinear" filtering.

the 6800 cards can do full trilinear filtering, showing no mip map boundaries, when the optimization is turned off......
an option that the 5900 cards did not have.

there is not one game today that shows mip map boundaries with the ATI X800 cards.

please stop spreading FUD.

thank you.

mica
 
It's only not an optimization to you. Not only that, but you really can't even see the mip-map boundaries on the 6800s. A little more tweaking and it'll be just right. So your choice is "3", which means it's basically unfair to let NVidia cards use optimizations when ATI cards can have theirs on.
 
Pake said:
It's only not an optimization to you. Not only that, but you really can't even see the mip-map boundaries on the 6800s. A little more tweaking and it'll be just right. So your choice is "3", which means it's basically unfair to let NVidia cards use optimizations when ATI cards can have theirs on.

nope sorry I think you have things confused...
it's only an optimization to you.

everyone in the whole world knows that ATI doesn't do any kind of optimizing on trilinear.
yet nVidia still does when turned on.

I'm sorry that we disagree, yet please don't spread rumors or misleading info when you can't show any brilinear filtering on the X800 cards.
in fact, nobody has... because it's not an option with the X800 cards.

Oh, and please don't speak for me...I'll make my own choice, thank you.

mica
 
Let's take a poll then.

Who here thinks "trylinear" is an optimization?

Keep in mind what optimization means: http://dictionary.reference.com/search?q=optimization

BTW, where's the brilinear proof on the 6800 cards? I'm not seeing any differences in image quality... Not only that, but how about we discard all reviews thus far and say "Until we can FORCE trilinear (meaning no bri or try), no card review should even be considered a fair review." I'd say this would be the most fair review you could get, because it removes all questionable things.
 
OK Mica, looking at the article Illissius posted, the optimizations you are screaming about don't show up noticeably in regular gameplay very often, if at all. Yes, when you turn on different colors for different filters you can see differences. However, if anyone here plays games with options like that on (i.e. the different colors for different filters), I must ask why you play games at all. I play them for personal enjoyment, so if I get bored then yeah, I might try using FRAPS or crank up settings as far as they will go for a while. Yes, I understand people buy these cards for absolute maxes while playing, but hey, if it looks the same and gets somewhat better FPS then why not.

Finally, did anyone else who read the Tech Report article get a warning from their anti-spyware software about the page trying to load something called Advance A inc?
 
this thread is getting grossly off topic...
it was supposed to be about what card to get, and whether he could see some benchies from real world users.

as for ATI filtering, it has been discussed to death.
here is one of my many quotes from one of the links below:

micamica1217 said:
what I'd like to add is:

why are some people putting trilinear filtering on such a high pedestal???

the good and bad of common filtering:

bilinear - sharper textures than point sampling, yet visible mip map transitions are the bad point.

trilinear - mip map transitions are gone, yet texture edges are now softer or unsharp... lowering the IQ of the texture in order to avoid banding.

brilinear - a term used to describe what nVidia is displaying when trilinear is asked for.
it's better than full bilinear, yet it still has mipmap transitions... looking like bilinear.
I must say they are doing brilinear filtering better now than when it was first tested in benchmarks.

ATI's "smart trilinear" or adaptive trilinear - mip map transitions are still gone (meaning full trilinear), yet some texture edges are no longer soft or unsharp.
giving improved IQ (if anything) if you are willing to enlarge the image by 80 times.
most likely, you'll notice no improvement to IQ while playing any game.

I can't see why anyone would call an improvement in tech a cheat....
what is going to happen when trilinear is obsolete?

better yet, why would anyone want to stick with the old filtering methods?

some really good reading....

ONE

TWO

THREE

FOUR

I have to agree with speed bump: as I've said in the past, nVidia's brilinear filtering "IS" getting better...
yet mip boundaries are still noticeable from time to time.

to be honest, brilinear doesn't bother me that much.
it's nothing compared to my hatred for jaggies.

anyway, I hope that more peeps who have the newer cards start showing up and posting their benchies/thoughts/ideas.

mica
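To make the "mip map transitions" point in that glossary a bit more concrete, here's a tiny numerical toy. The LOD values walking away from the camera are invented; the only point is that bilinear snaps between whole mip levels (the visible band) while trilinear ramps the coarser mip in gradually.

Code:
# Toy illustration of why bilinear shows mip boundaries and trilinear doesn't.
def coarse_weight_bilinear(lod):
    """Bilinear picks exactly one mip level: the coarser one gets weight 0 or 1."""
    return 1.0 if (lod - int(lod)) >= 0.5 else 0.0

def coarse_weight_trilinear(lod):
    """Trilinear blends: the coarser mip's weight ramps smoothly from 0 to 1."""
    return lod - int(lod)

for step in range(10):
    lod = 2.0 + step * 0.1          # made-up LOD values receding from the camera
    print(f"lod={lod:.1f}  bilinear={coarse_weight_bilinear(lod):.1f}"
          f"  trilinear={coarse_weight_trilinear(lod):.1f}")
# bilinear jumps straight from 0.0 to 1.0 at lod 2.5 -- that jump is the visible
# mip boundary. trilinear ramps 0.0 -> 0.9, hiding the band but mixing in the
# blurrier mip sooner, which is the "softer textures" trade-off described above.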
 
Ok, back on topic: Get a 6800 Ultra, 6800 GT, or X800XT. Avoid the X800 Pro, it simply is not a big enough jump in performance. If the cost of the X800 Pro was lower, then it would be a viable card as well. No flames please, it's just my opinion.

My 6800 Ultra is awesome and I can highly recommend one! :)
 
Well, despite being a hardcore Nvidiot, I will have to disagree about the X800 Pro: it is a huge leap over the previous gen cards, and from what I see, depending on what you are doing with it, it is just as good as the GT and sometimes better.
 
scott_55X said:
Ok, back on topic: Get a 6800 Ultra, 6800 GT, or X800XT. Avoid the X800 Pro, it simply is not a big enough jump in performance. If the cost of the X800 Pro was lower, then it would be a viable card as well. No flames please, it's just my opinion.

My 6800 Ultra is awesome and I can highly recommend one! :)



As said above, the X800 Pro has been ahead of the 6800 at times and is $100 cheaper at $399 MSRP - so how can that not be worth it.....


:D

Just my opinion
 
micamica1217 said:
anyway, I hope that more peeps who have the newer cards start showing up and posting their benchies/thoughts/ideas.

mica


*Hopefully* in the next week I will be buying an X800 Pro - I was going to go XT, but I figure I will save that buy for when I get the funds for my dual Opteron and will go with the PCIe flavor - or am now considering the NVIDIA SLI route :D


But we will see - once I get this card it will be running in a system with these specs:


Asus P4P800
P4 2.6C (once it arrives)
512MB PC3200 to start (will order better RAM when I see how far the CPU will O/C)


Will be happy to bench whatever people like - time permitting of course :D
 
The X800 series are in fact powerful cards, but they are not quite as future-proof as the 6800 series. The fact of the matter is that the X800 lineup is more of a revised 9800 lineup (though it is indeed fast), while the NV40 is based on a completely new and independent architecture. I'd say this is the one thing that makes NVIDIA's new lineup so promising. As I stated in a previous post, PS 3.0 is indeed a reality, as it is making its way into some of the newest and most anticipated games this year.

Also, the sad truth for some is that both NVIDIA and ATI use optimizations for their cards in some way or another. This is not opinion but fact, and no colored screenshot will change my view on this. I won't pretend to be a fanboy of either as I own several video cards from both companies, but it really strikes me how some people attack NVIDIA for its drivers/optimizations. Granted, their methods may not be as evident as others', but NVIDIA has made no egregious errors. Remember quack.exe? That was ATI. I'm not saying this to try to sway people from ATI to NVIDIA or vice versa, but I would like to see people take a more objective approach to these types of issues, whether it be AMD or Intel, or, in this case, ATI and NVIDIA. There's more I'd like to type but I'm tired as f*** so in the mean time...


deception``
 
Wow, lol. I remember when I noticed that the benchmarks for both lineups of cards had been posted on many sites, and I went through and read several of them (AnandTech, Tom's Hardware, HardOCP). Out of those, I preferred the HardOCP and Tom's Hardware benches, because AnandTech seemed to be getting very different results than the other two, and it didn't make any sense. However, after I did that, I logged onto the Overclockers forums to see what the debate was saying about which card was better (as I really wasn't sure on my own) and, well, no one else was either! Now, months later, people still aren't sure.

Personally, I want to see a couple of things before I make a decision whether or not to go out and buy one of these, or even decide which is definitively better.

I am very interested in the various SLI technologies coming out from nVidia and Alienware (the Alienware one is useless to me by itself, but other motherboard manufacturers may incorporate something similar in their boards; imitation is the highest form of flattery).

Next, I want to see how both card lineups overclock, and look at the performance then. This IS Overclockers Forums, not "Joe Sixpack Forums" after all.

I want to see if ATI will produce anything to counter nVidia's SLI, because if they don't, things are going to get very interesting indeed in the enthusiast arenas. (Perhaps I should say they'll get far less interesting, and we'll have a clear-cut winner.)

Last, I want to see how much of a difference SM3.0 makes. I know that some future games are using it, such as the aforementioned Far Cry and S.T.A.L.K.E.R., and Unreal Engine 3 (which, if you've seen the tech demos, looks pretty impressive). It reminds me of the Doom 3 thing, though: people saw how Doom 3 ran on their 9700s compared to previous cards and just had to have them. (Unreal Engine 3 is much hyped as paired with nVidia.)

Personally, I'm interested in SLI not because I can get insane performance now (now being the near future, once all the pieces solidify in the channels), but because I can buy one card now, like the 6800 or GT, and another one when it starts to look like I need a performance boost. (No one is going to tell me that those cards are too slow for the current environment.) That would probably be a year to a year and a half down the road, and the prices would have come down a lot on the cards, so it would be a much cheaper upgrade.

Oi, that was a lot longer than I had originally planned, and I'm not exactly sure how on-topic it all is, but I just wanted to add my own personal angle to the discussion.
 