

JeffP said:
Mica, just post your screenshots of the same pics as Sentential already. You're beating around the bush on this issue more than John Kerry. :clap:

I have clearly posted IQ pics from the 6800 and the X800 using the same frame number, both at the default test and at 4xAA and 8xAF.
Both IQ test pics are clearly labeled and marked by an independent source, so no fudging on my part can be made.

If you fail to see them (and they are only three posts above your reply), then clearly you will not notice the differences in IQ. :p j/k


Since the beginning of this thread I have clearly given reasons why we can no longer use this test as an "apples to apples" comparison across all cards.
Those reasons have been backed up by pics from websites as well as Sent's IQ pics with the shadows.

BTW,
never underestimate the power of the mica when I do an investigation like this....I already know the results and can back them up. :D

mica
 
From the pics above it looks more like one image is brighter than the other. That's an easy contrast change: take the original image from ATI, add a little brightness and contrast, and you will see it's nearly identical to the Nvidia pic.

What I really wanted to see is whether the X800 looks as horrible as the 6800GT in 3DMark05. My guess is yes, it does. I was hoping you would provide some full-size pics like Sent. did, just for comparison's sake. I don't care if they are the same frame or not, as long as it's close.

I think that leaves everyone the opportunity to draw their own conclusions, and hopefully no doctoring of the pictures.
 
JeffP said:
From the pics above it looks more like one image is brighter than the other. That's an easy contrast change: take the original image from ATI, add a little brightness and contrast, and you will see it's nearly identical to the Nvidia pic.

What I really wanted to see is whether the X800 looks as horrible as the 6800GT in 3DMark05. My guess is yes, it does. I was hoping you would provide some full-size pics like Sent. did, just for comparison's sake. I don't care if they are the same frame or not, as long as it's close.

I think that leaves everyone the opportunity to draw their own conclusions, and hopefully no doctoring of the pictures.

I have looked again at the IQ pics from the site I used as a comparison, and I find no brightness or contrast change or difference between the two (when viewed at full size, 1024x300; the size of all three pics is 1024x900).

The bots, boxes, door, bridge, pipes and poles are all the same in brightness and contrast.....
One of the bots' arms does look slightly lighter, but it is such a small difference that maybe I'm just trying to find something that is different in color, contrast, or brightness.

THERE ARE CHANGES IN THE HIGHLIGHTS....
such as on the beam that holds up the vat.
It's brighter, almost "over bright", in the nVidia pic.
(If 16-bit PS3.0 or partial precision was used, that would explain it.)
This has everything to do with SM3.0 vs SM2.0 in this test, and I'll leave it up to you to decide what you like better.
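(For anyone wondering what "partial precision" means in practice, here's a quick toy sketch — made-up numbers, nothing taken from the actual drivers — of how FP16 collapses closely spaced highlight values that FP32 keeps distinct:)

```python
import numpy as np

# Toy illustration only: FP16 ("partial precision") carries ~10 mantissa bits,
# so near a value of 4.0 its step size is about 0.004. Closely spaced
# highlight intensities collapse to one level, while FP32 keeps them apart.
highlights = np.array([4.0001, 4.0002, 4.0003, 4.0004], dtype=np.float32)

fp32 = highlights                     # full-precision path
fp16 = highlights.astype(np.float16)  # partial-precision path

print(np.unique(fp32).size)  # 4 distinct highlight levels
print(np.unique(fp16).size)  # 1 -- all four merged into the same level
```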

Now, if the nVidia image were truly brighter and more contrasty, two things would happen.
First, when making it brighter, all parts of the image would start to be "washed out"....that is clearly not the case.
Next, when adding contrast to turn the blacks of the washed-out pic black again, you'd lose more detail in the highlights, but in the other areas as well.
You lose tonal values when adding contrast, and I don't see this loss.
The nVidia image at 1024x300 (the same pixel width as Sentential's, as only the lower part of the pic is cut off...do we really need to see more of the blackness like in Sent's pics?) shows great "tonal" detail throughout the whole scene.
Only the "highlights" seem brighter to me....IMHO

This doesn't change the fact that in the default test the nVidia pic, like Sent's pic, shows DST shadows (or lower-quality shadows) and texture loss,
as well as no FSAA in the high-quality tests.

I have to agree that once you start playing at 1280x1024 with 4xAA and 8xAF in basically all your games, then look at a benchmark at 1024x768 with no AA and no AF....what we are seeing sucks.

Sorry, I have no hosting for large files, and any pics I post myself will be far too small.

mica
 
Sentential said:
Yea mica, I must say that I'm quite pissed off about this shader issue. It gets even worse on "Performance".

Bear with me, I'll get a screenie of it.


We have discussed this; the 6800s aren't bad cards. They can beat the X800s in some games, but you pay for it with worse IQ, and that's a NO NO for me. I just saw how jagged your shadow is in Far Cry :( Also, nvidia cheats in 3dmark 2005 and STILL can't beat the X800s! PS3.0 is not a really big deal; ATI runs PS2.0 and is still ahead. PS2.0 and PS3.0 are the same: PS2.0 can do anything PS3.0 can with more passes, but ATI can do PS2.0 fast. FX cards suck so bad; I've been telling everyone to get a 9700 Pro over any FX5900, period. Now they are paying for it, when my little softmodded 9500np at near 9800 Pro speeds will be like 50% faster than their FX5900 Ultra LOL.
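(The "more passes" claim is easy to picture. Below is a toy model, in plain Python rather than real shader code, of summing three lights' contributions at one pixel: an SM3.0-style shader can loop over the lights in a single pass, while an SM2.0-style renderer draws one additive pass per light. Same result, different cost; all numbers are invented.)

```python
lights = [0.25, 0.5, 0.125]  # each light's intensity at some pixel (made up)

def single_pass(lights):
    """SM3.0 style: one pass, a dynamic loop over lights inside the shader."""
    color = 0.0
    for intensity in lights:
        color += intensity
    return color

def multi_pass(lights):
    """SM2.0 style: one full render pass per light, additively blended."""
    framebuffer = 0.0
    for intensity in lights:      # each iteration stands in for a whole pass
        framebuffer += intensity  # additive blend into the framebuffer
    return framebuffer

assert single_pass(lights) == multi_pass(lights)  # same image either way
print(single_pass(lights))  # 0.875 -- identical result, but one pass vs three
```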


I am gonna stick with my 9500np for the time being and save up for one of the following: 9800 Pro (must overclock godly), 6800nu, 6800GT, or X800 Pro. I might wait till I have a Winchester first, since my CPU's gonna bottleneck those cards.
 
There are people who host pics in the classifieds who may be able to help you with hosting screenshots. I would like to see some as well, out of curiosity.

Down with Futuremark on the optimizations, though. Can they release a 3DMark program without messing up, or without the card manufacturers messing up? You'd think SOMEONE would learn after a while.
 
I agree. I will tell you that the website also says the Forceware 66.51 drivers are the only ones "approved" by Futuremark....they suck. The IQ difference between the 61.77s and the 66.51s is huge, in a bad way. The 61.77s gave me a 4333 score; the 66.51s gave me just over 5k, and I can see the missing textures while 05 runs with the 66.51s....I can't believe Futuremark would sanction something so blatantly bogus.....
 
Dragonprince said:
I agree. I will tell you that the website also says the Forceware 66.51 drivers are the only ones "approved" by Futuremark....they suck. The IQ difference between the 61.77s and the 66.51s is huge, in a bad way. The 61.77s gave me a 4333 score; the 66.51s gave me just over 5k, and I can see the missing textures while 05 runs with the 66.51s....I can't believe Futuremark would sanction something so blatantly bogus.....

Futuremark showed signs of sanity only in the first release of 3DMark2k3, where no IQ-crippled nvidia drivers were allowed... but then, mysteriously, a patch was released which allowed them...

Now this: a default nvidia-biased bench that still doesn't help them, lol... Futuremark are a bunch of wide-open, public sellouts. We, the gamer community, shouldn't use their benches anymore... but with so many companies backing them, I rarely see that happening.
/linux fanboy mode on
perhaps a GNU benchmark will come
/linux fanboy mode off

ROFL
 
The more I read about nvidia cheats and bad IQ, the less I care for them. So what if they're faster in Doom 3; ATI would be faster if nvidia didn't cheat, or if ATI did. I bet a 9800XT would be equal to or better than a 6800nu if nvidia didn't cheat.
 
It's still a very good card. Either way it doesn't bother me. It's still a helluva upgrade from my 9800PRO, and I sincerely doubt any game will challenge it for at least a year.

As for the "overbright"-ness, that's due to their "digital vibrance". I personally like it. It gives games a surreal look and drastically helps with visual clarity.
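(Nobody outside nVidia publishes the exact digital vibrance formula; a common guess is that it's essentially a saturation boost. A minimal sketch of that idea — the vibrance() helper and the 1.3 factor are my own placeholders, not the driver's math:)

```python
import colorsys

def vibrance(r, g, b, boost=1.3):
    """Rough stand-in for a 'digital vibrance' control: scale saturation.
    The real driver formula is nVidia's own; this is only the usual
    HSV-space approximation."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    s = min(1.0, s * boost)  # push colors further from gray
    r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
    return round(r2 * 255), round(g2 * 255), round(b2 * 255)

print(vibrance(150, 120, 100))  # a muted brown comes out noticeably punchier
```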
 
micamica1217 said:
This doesn't change the fact that in the default test the nVidia pic, like Sent's pic, shows DST shadows (or lower-quality shadows) and texture loss,
as well as no FSAA in the high-quality tests.

mica, can you do a run through 05 again and look closely at the shadows in the blimp test? I wanna know if you are seeing what I am...

On my 9700 Pro, I get the EXACT same jaggy shadows. They are worse from some angles than others, but if you look closely you can see them throughout.
 
hUMANbEATbOX said:
mica, can you do a run through 05 again and look closely at the shadows in the blimp test? I wanna know if you are seeing what I am...

On my 9700 Pro, I get the EXACT same jaggy shadows. They are worse from some angles than others, but if you look closely you can see them throughout.

I just reran the test twice...at default, and at 4xAA and 8xAF.

Yes, like in Sent's pic of the blimp, there are jaggies on the shadows from the ropes.
Yet the ropes themselves are not smooth, as you can see, so I'm not surprised.

Yet the rest of my shadows are far smoother and almost jaggy-free at default.
Once I used 4xAA, the harsh shadows cast by straight-line forms are smooth as silk.
Soft shadows were always smooth, even at default, so that's not a big deal.
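(Why 4xAA smooths those straight-line shadow edges: each pixel is built from four sub-samples, so a pixel the edge passes through comes out part-way between lit and shadowed instead of snapping from one to the other. A toy one-row sketch, with the edge position invented:)

```python
EDGE = 2.3  # the shadow boundary sits 30% of the way into pixel 2 (made up)

def coverage(pixel, samples):
    """Fraction of a pixel's sub-samples that fall inside the shadow."""
    hits = sum(1 for i in range(samples)
               if pixel + (i + 0.5) / samples < EDGE)
    return hits / samples

for pixel in range(5):
    no_aa = coverage(pixel, 1)  # one sample: hard 1 -> 0 jump (jaggies)
    aa_4x = coverage(pixel, 4)  # four samples: the edge pixel blends
    print(pixel, no_aa, aa_4x)
# Pixel 2 prints 0.0 without AA but 0.25 with 4x -- the in-between tone
# that reads as a smooth edge.
```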

Again, look at the door in the "Return to Proxycon" pics I posted up....at 4xAA and 8xAF
you'll see that the nVidia pics are never soft, and are full of jaggies.
That's the difference with DST on or off.

Sorry for not posting about the ropes being jagged from the start; I thought it was well known.

(I'm also not sure just how well AA and AF are working in Sent's pics, as I'm hearing that forcing AA and AF in the CP doesn't work with nVidia cards.
I also said that I noticed far too many jaggies in Sent's pics, but I'll have to look more into this when I have time.)

mica
 
Well the DST differences you pointed out can be disabled.

But while you are on 3dmark05 differences, check out some of the IQ problems with 3dmark05 and ATI cards:

3DMARK05 ATI X800 ARTIFACTS: FOG RENDERING

*** PLEASE OPEN THESE TWO IMAGES IN SEPARATE BROWSER WINDOWS AND SWITCH BACK AND FORTH USING ALT-TAB AND EXAMINE THE AREA IN THE RED OUTLINE

NOTE the failure to render all of the fog in the ATI screenshot

EXAMPLE 1
Nvidia GeForce 6800 Ultra 66.70 Beta
ATI X800XTPE 4.9
Reference Rasterizer image

EXAMPLE 2
Nvidia GeForce 6800 Ultra 66.81 Beta
ATI X800XTPE 8.07
Reference Rasterizer image

Source: Beyond3D.com & NvNews.net forums

*********

3DMARK05 ATI X800 ARTIFACTS: FILTERING/MOIRE

*** PLEASE OPEN THESE TWO IMAGES IN SEPARATE BROWSER WINDOWS AND SWITCH BACK AND FORTH USING ALT-TAB AND EXAMINE THE AREAS IN THE RED OUTLINES

NOTE the obvious moire patterns in the ATI screenshots
Nvidia GeForce 6800 Ultra 66.51 Beta Application AF
ATI X800XTPE 4.9 w/ hotfix Application AF

Source: Bjorn3d.com

********

3DMARK05 ATI X800 ARTIFACTS: BANDING

*** PLEASE OPEN THESE TWO IMAGES IN SEPARATE BROWSER WINDOWS AND SWITCH BACK AND FORTH USING ALT-TAB AND EXAMINE THE AREA IN THE RED OUTLINE

NOTE the increased banding in the sky in the ATI screenshots
Nvidia GeForce 6800 Ultra 66.51 Beta
ATI X800XTPE 8.07 Beta
Closeup of ATI's banding from above shot

Source: Driverheaven.net
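(Rather than Alt-Tabbing between browser windows, you can diff the two screenshots directly. A sketch using Pillow and NumPy; the filenames are placeholders for whatever you saved the shots as:)

```python
from PIL import Image, ImageChops  # pip install pillow numpy
import numpy as np

# Placeholder filenames -- point these at the actual saved screenshots.
a = Image.open("nvidia_6800u.png").convert("RGB")
b = Image.open("ati_x800xtpe.png").convert("RGB")

diff = np.asarray(ImageChops.difference(a, b))
print("max per-channel delta:", diff.max())
print("pixels differing by more than 8/255:", int((diff.max(axis=2) > 8).sum()))

# An exaggerated difference map: missing fog, moire, or banding shows up
# as bright blobs in the affected region.
Image.fromarray(np.clip(diff.astype(np.int32) * 8, 0, 255).astype(np.uint8)).save("diff_x8.png")
```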
 
I was following that stuff over on Beyond3D, but while ATI's issues are more of a driver bug (or bugs), the reason for this thread was to discuss the somewhat unfair practice of allowing DST for Nvidia but not 3Dc for ATI, and therefore the condoning by Futuremark of Nvidia's 'lower IQ, increase framerate' tactics.

But as I said, I will be following the whole decreased-fog issue etc. with ATI in 3DMark 2005.
 
Cowboy X said:
I was following that stuff over on Beyond3D, but while ATI's issues are more of a driver bug (or bugs), the reason for this thread was to discuss the somewhat unfair practice of allowing DST for Nvidia but not 3Dc for ATI, and therefore the condoning by Futuremark of Nvidia's 'lower IQ, increase framerate' tactics.

But as I said, I will be following the whole decreased-fog issue etc. with ATI in 3DMark 2005.

I don't really know how "unfair" you can call it. See any support for Nvidia's OpenEXR HDR lighting? How about for the extensive SM3.0 feature set, with stuff like dynamic flow control? Nope. 3DMark05 seems to be written with SM2.0 in mind and then ported over to SM2.0b/SM3.0 with little code rewrite, judging by the benchmark results. In other words, it's not taking advantage of Nvidia hardware either.
 
So basically 3DMark05 is a piece of **** that probably allows both companies to cheat on their benchmarks? This is why I don't like artificial benchmarks...

Also: what does this mean for the mid-range cards? Seeing as this would be the meat of the market, I would think both companies would have the greatest incentive to cheat with these... Does all the stuff you guys are talking about apply to them (i.e. the X700 and 6600)?
 
tranCendenZ said:
I don't really know how "unfair" you can call it. See any support for Nvidia's OpenEXR HDR lighting? How about for the extensive SM3.0 feature set, with stuff like dynamic flow control? Nope. 3DMark05 seems to be written with SM2.0 in mind and then ported over to SM2.0b/SM3.0 with little code rewrite, judging by the benchmark results. In other words, it's not taking advantage of Nvidia hardware either.

Is that not the way of all 3DMark versions??? For example, 2001SE has far more DX7 than DX8, and 2003 is mainly DX8 (it even includes a DX7 game test) with hardly any DX9. So it is no small wonder that this new 3DMark is mainly plain SM2.0 and very little 2.0b or 3.0.

The issue here is a situation where a specific ATI feature (3Dc) was blocked because it isn't part of the DX9.0 spec. Now, I consider that to be reasonable and sensible. ATI would argue that it is a DX9+ or DX10 feature, and Microsoft has agreed and will include it in future releases. But 3Dc was prevented from being part of this DX9 benchmark. Again, I see nothing wrong with that at all.

But then how is it good for Nvidia's DST to be included in the benchmark? DST is not a DX9 feature, and it appears it never will be! Yet this Nvidia performance-enhancing (and IQ-lowering) feature has been allowed by Futuremark.

That is the point that I am making.

The issue is not about why certain official Nvidia DX9.0c features aren't present (if they include them, no problem, even though the NV4x is currently too poor at several of them to run them properly), since this is typically Futuremark's way of doing things, but rather that a non-DX feature (DST) has been included.

Finally, responding to the initial premise of this thread with areas where ATI or Futuremark or ATI's drivers have fallen down is not relevant, since that still doesn't deal with the issue at hand....... the inclusion of DST. The info you posted would however do very well as a separate thread, since it is something that can generate good discussion and is a situation that needs monitoring. But in this thread it just fuels the already-bad ATI vs Nvidia fan wars.
 
tranCendenZ said:
Well the DST differences you pointed out can be disabled.

But while you are on 3dmark05 differences, check out some of the IQ problems with 3dmark05 and ATI cards:

3DMARK05 ATI X800 ARTIFACTS: FOG RENDERING

*** PLEASE OPEN THESE TWO IMAGES IN SEPARATE BROWSER WINDOWS AND SWITCH BACK AND FORTH USING ALT-TAB AND EXAMINE THE AREA IN THE RED OUTLINE

NOTE the failure to render all of the fog in the ATI screenshot

EXAMPLE 1
Nvidia GeForce 6800 Ultra 66.70 Beta
ATI X800XTPE 4.9
Reference Rasterizer image

EXAMPLE 2
Nvidia GeForce 6800 Ultra 66.81 Beta
ATI X800XTPE 8.07
Reference Rasterizer image

Source: Beyond3D.com & NvNews.net forums

*********

3DMARK05 ATI X800 ARTIFACTS: FILTERING/MOIRE

*** PLEASE OPEN THESE TWO IMAGES IN SEPARATE BROWSER WINDOWS AND SWITCH BACK AND FORTH USING ALT-TAB AND EXAMINE THE AREAS IN THE RED OUTLINES

NOTE the obvious moire patterns in the ATI screenshots
Nvidia GeForce 6800 Ultra 66.51 Beta Application AF
ATI X800XTPE 4.9 w/ hotfix Application AF

Source: Bjorn3d.com

********

3DMARK05 ATI X800 ARTIFACTS: BANDING

*** PLEASE OPEN THESE TWO IMAGES IN SEPARATE BROWSER WINDOWS AND SWITCH BACK AND FORTH USING ALT-TAB AND EXAMINE THE AREA IN THE RED OUTLINE

NOTE the increased banding in the sky in the ATI screenshots
Nvidia GeForce 6800 Ultra 66.51 Beta
ATI X800XTPE 8.07 Beta
Closeup of ATI's banding from above shot

Source: Driverheaven.net


When you're going to post facts about this test, please include ALL the FACTS.

Like how there is not one driver from any IHV, at this time, that is official for anything but the default tests.
How about the fact that all the test pics you show are using the app settings and not the CP of the cards....
BTW, when using the CP for ATI cards, all the banding and IQ problems are gone.
(I'm still wondering about the fog issue, as I don't seem to have this problem...
it looks like the FUD that was started about the AquaMark bench, and I don't have problems with missing smoke in that either.)

Then why don't you talk about how even Microsoft says you will never get an image the same as the reference rasterizer on any card in all parts...it's just not possible with all the filtering going on in the cards anyway.
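(That's also why real conformance testing compares against the reference rasterizer with a tolerance rather than bit-for-bit. A minimal sketch; the filenames and the ~5% threshold are my own placeholders, not numbers Microsoft or Futuremark publishes:)

```python
from PIL import Image
import numpy as np

# Placeholder filenames; int16 avoids uint8 wraparound when subtracting.
card = np.asarray(Image.open("card_frame.png").convert("RGB"), dtype=np.int16)
ref  = np.asarray(Image.open("refrast_frame.png").convert("RGB"), dtype=np.int16)

tolerance = 13  # ~5% of 255 -- an assumed threshold for illustration
within = (np.abs(card - ref) <= tolerance).all(axis=2)
print(f"{100 * within.mean():.2f}% of pixels within tolerance")
# Expecting 100% bit-for-bit identity is hopeless once each card's own
# texture filtering, precision, and gamma enter the picture.
```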

Then there are your thoughts on DST....
DST is not now, and never will be, part of DX9.0 in any way, shape, or form, because it is a vendor-specific extension and not part of SM2.0 or SM3.0.

So in turn, because of DST, 3DMark05 is no longer a DX9.0 benchmark, and is now a game benchmark that does not use any game engine....lame.
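(For what DST actually changes: with depth stencil textures the hardware does the depth compare at texture-fetch time and can average a 2x2 footprint (PCF), whereas a plain path does a single compare in the shader. A toy sketch with invented depth values — the point is just that the two paths filter shadow edges differently, so the output can't be apples-to-apples:)

```python
# A 2x2 corner of a shadow map (stored depths; all values invented).
shadow_map = [
    [0.40, 0.40],
    [0.90, 0.90],
]

def plain_compare(u, v, pixel_depth):
    """No DST: the shader fetches one depth texel and compares it itself."""
    return 1.0 if pixel_depth <= shadow_map[v][u] else 0.0  # hard 0-or-1 edge

def dst_pcf_fetch(u, v, pixel_depth):
    """DST path: hardware compares a 2x2 neighborhood and averages (PCF)."""
    texels = [shadow_map[v][u],     shadow_map[v][u + 1],
              shadow_map[v + 1][u], shadow_map[v + 1][u + 1]]
    return sum(1.0 for t in texels if pixel_depth <= t) / len(texels)

print(plain_compare(0, 0, 0.6))  # 0.0 -- this pixel snaps to fully shadowed
print(dst_pcf_fetch(0, 0, 0.6))  # 0.5 -- the same pixel comes out half lit
```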
---------------------------

The reason I've even done this investigation thread is because I've seen tons of threads on this topic at places like nVnews, Bjorn3D, and others.
Many of them are not investigating anything, but spouting gossip and rumors,
even after they have been proven wrong.
BTW, Bjorn3D has already shown that they didn't use the same AA setting on the ATI and nVidia cards....really lame. :-/

You also failed to note that the author at Bjorn3D could not get 3DMark05 to work with AA when set in the CP, so he assumed it was the same way for ATI.
(Again, lower IQ will result for both cards when it's set in the GUI instead of the CP in this test...again, isn't that crazy?)

You also don't note that not one person could find any more banding in the sky pic with an ATI card than with an nVidia card...both have really, really, really small amounts of banding...just in different spots.
(Again, you'll get no banding if it's done in the CP of the card(s)...and no driver is verified for anything but the default test.)

I will repeat myself: nVidia is not cheating in this test due to DST...this is a problem that Futuremark needs to answer for, if you ask me.

I could go on and on, but at least you show where you got your information from...(or the lack of it.)
I would post links or quotes from the threads on those forums, but it would be a waste of time....anyone here can easily look it up and see how truthful I've been.

I agree with Cowboy X; you could start another thread, as it seems you just want to start something, and not post all the facts in order to help others.


But I at least understand why Futuremark wants to use DST as a feature...
you see, many games that will come out will use it.
In fact, Far Cry uses it too.

[four Far Cry screenshots; the original image-host links are broken]
Guess which pics are ATI and which are nVidia.

oooopps,
BTW, WELCOME TO THE FORUMS. :D

mica
 
Maybe I'm pointing out the obvious, but the only people I see complaining and whining are the ATI folks. I've never seen so much "well, you got this and I don't" crap since kindergarten, and believe me, that was a long time ago.

Maybe ATI should invent some new technologies that enhance the experience. If a new feature is incorporated and can be used by game developers, then why not incorporate it in a 3D benchmark? That's very real-world and practical.
 
JeffP said:
Maybe I'm pointing out the obvious, but the only people I see complaining and whining are the ATI folks. I've never seen so much "well, you got this and I don't" crap since kindergarten, and believe me, that was a long time ago.

Maybe ATI should invent some new technologies that enhance the experience. If a new feature is incorporated and can be used by game developers, then why not incorporate it in a 3D benchmark? That's very real-world and practical.

Maybe you missed it....

Futuremark says that 3Dc will not be added because it is not part of DX9.0!!!!
DST is not part of DX9.0 either....!!!

Shadow IQ has now been reduced to crap....(not what most people bought a 6800GT/U for, if you ask me.)
(If all you want is framerate, then why even buy a new high-end card?)

Now care to comment?

mica
 
mica, I kept it nice and simple. Is the ATI camp going to cry and ***** every time a new game is released that has DST or OpenGL or whatever other unique features that enhance the experience for Nvidia owners and not ATI? I just don't see nearly as much complaining from the Nvidia guys; maybe that's because they are too busy actually using their cards in games. :bday:

I think people get way too wrapped up in benchmark results and just look for something wrong. Who cares, really? Do you enjoy your ATI card and the gameplay it offers? If so, what's the fuss? :beer:

3DMark is just one of many silly synthetic benchmarks. All of them, in one way or another, are going to be biased or tweaked for some particular hardware.
 