
and the IQ is different again on 3DMark05


micamica1217

Member
Joined
Jun 1, 2002
dst_diff.jpg

this pic is with and without DST.
DST is on by default for all nVidia cards in 3DMark05.
but the developers used an nVidia-only format in coding it....so ATI cards can't do it, and it's off by default for them.

this not only gives nVidia a higher score by up to 16%, but also gives lower IQ than the ATI results.
it also looks like the same result you get in Far Cry with the 1.2 patch and its harsh shadows (look below at the shadow on the gun...note the jaggy shadow rendered on a 6800GT, which on ATI cards will be nice and soft (no jaggies)).

pic by Sentential
FarCry0005.jpg
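for anyone wondering what the "nVidia-only" part actually is: as far as I understand it, DST just means the card lets you create a depth-buffer format as a texture, render the shadow map into it, then sample it back in the lighting pass. an app typically probes for that support roughly like this (a rough C++ sketch based on my reading of the DX9 docs, NOT Futuremark's actual code):

Code:
// Hypothetical sketch of how a D3D9 app can detect "DST" support
// (a depth format that can be created as a texture). Not from 3DMark05.
#include <d3d9.h>

bool SupportsDepthStencilTextures(IDirect3D9* d3d)
{
    // Ask whether a D24S8 depth format can be created as a *texture*, so
    // the shadow pass can render depth into it and the lighting pass can
    // sample it. nVidia parts of this era report yes, and the sampled
    // value comes back already PCF-filtered in hardware; ATI parts say no.
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,       // current display mode format (assumed)
        D3DUSAGE_DEPTHSTENCIL, // we want to render depth into it
        D3DRTYPE_TEXTURE,      // ...and bind it as a texture later
        D3DFMT_D24S8);         // 24-bit depth, 8-bit stencil
    return SUCCEEDED(hr);
}

if that check fails (as it does on ATI cards), the benchmark presumably falls back to its fragment-program shadow path, which is why the option is off for them.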


here are some thoughts on the matter....

Hexus said:
Hopefully the difference is clear. The shadow umbra (the shadow's edge) is much softer with DSTs turned off, hinting that the sampling done in the fragment program when testing the depth value in the shadow map produces better results than the PCF sampling in compatible NVIDIA hardware. The image quality is worse with DSTs, the shadow edges are harsher than they should be, and the staircase stepping issue due to depth map resolution is exacerbated by the filtering method. It's not giving the results needed for good umbra rendering, in this reviewer's eyes.

So while DSTs improve performance on compatible hardware, to the tune of anywhere between 10% and 16% in our tests, they most certainly do not give good output in 3DMark05's case, compared to the case where they're turned off. Again, they're ON by default on compatible hardware, increasing the chances of a non apples-to-apples comparison between hardware that can do DST acceleration, and hardware that can't.

Hexus said:
Futuremark's use of DST acceleration, in combination with single-cycle PCF filtering that's used for sampling of the DST, a method that has no part in PC-based DirectX and lies far outside Futuremark's previous claims that they'd not use vendor-specific extensions to render their 3D benchmarks on the PC, is the sticking point for me. That it's enabled by default on hardware that supports it is even more disagreeable. Combine that with the fact it reduces image quality (in my eyes) in the quest for performance, and we're back to a point we had during 03's lifetime, but this time it's supported by Futuremark and so it's valid in some respects.

When you also understand that Futuremark declined to put 3Dc support into 3DMark05, something that'd help reduce vertex load on a vertex-fetch-limited benchmark, for the precise reason that it's not a part of DirectX, is galling. Neither is NVIDIA's DST+PCF method of PSM acceleration, so why's that in there?

3Dc is a ratified part of WGF 1.0. It'll at least be in DirectX at some point in the future. The same can't be said of the NVIDIA acceleration of PSMs in 3DMark05.
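(a quick aside on the jargon Hexus uses: "PCF" is percentage-closer filtering. instead of one in-shadow/not-in-shadow test per pixel, several nearby shadow-map texels are compared and the results averaged, which is what softens the shadow edge. here's a rough CPU-side C++ illustration of the idea, purely my own sketch and nothing taken from the benchmark itself:)

Code:
// Plain-C++ illustration of shadow-map tests (not shader code).
#include <algorithm>

// Single tap: the pixel is fully lit (1.0f) or fully shadowed (0.0f),
// which is where hard, stair-stepped shadow edges come from.
float HardShadowTest(const float* depthMap, int w, int x, int y, float fragDepth)
{
    return fragDepth <= depthMap[y * w + x] ? 1.0f : 0.0f;
}

// 2x2 PCF: compare four neighbouring shadow-map texels and average them,
// roughly what the hardware returns in a single DST fetch.
float Pcf2x2ShadowTest(const float* depthMap, int w, int h, int x, int y, float fragDepth)
{
    float sum = 0.0f;
    for (int dy = 0; dy < 2; ++dy)
        for (int dx = 0; dx < 2; ++dx)
        {
            int sx = std::min(x + dx, w - 1);
            int sy = std::min(y + dy, h - 1);
            sum += fragDepth <= depthMap[sy * w + sx] ? 1.0f : 0.0f;
        }
    return sum * 0.25f; // 0, 0.25, 0.5, 0.75 or 1.0 -> a softer edge
}

Hexus's complaint, as I read it, is that the single-cycle 2x2 filter the nVidia hardware applies on a DST fetch is coarser than the filtering 3DMark05 does itself in the fragment program when DST is off, hence the harsher edges in the DST shots.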

Beyond3d said:
Here's something I wrote Futuremark yesterday:

Reverend wrote:

Finally have some time to think about this in a "serious" manner.

It's too late for any comment of mine to have any effect -- it's gold. So I'll instead stick to talking about how bad I think the BDP program is.

You guys have been gathering feedback on the basics from way back. You collate it, discuss internally, ask for some more feedback, and then go ahead and decide how to make 3DMark05. For example, "Smoothie" was tossed around early on, but no feedback was forthcoming from FM about what kind of shadows had been decided upon. Essentially, as far as B3D is concerned, me and Dave had absolutely no idea what FM had decided. A better way, though perhaps much more time-consuming and likely to delay 3DMark, is for FM to tell BDP members "OK, after getting the feedback from you guys, we think this is what we're doing with the game tests... blah, blah, explaining what you want to do... and then WHAT DO YOU BDP MEMBERS THINK OF WHAT WE'RE DOING WITH EACH GAME TEST?" Sure, you asked the BDPs for opinions on certain "default settings", but you never asked the BDPs (well, you didn't ask B3D; not sure if you asked other BDP members) about this DST thing.

It's just not satisfactory. I realize there must come a time when FM has to make a decision (otherwise you'll never get 3DMark05 done if you keep asking for feedback and opinions), but I feel more can be done to involve the BDP members without delaying a project. Progressively providing information to BDP members about certain crucial areas is what I'm saying.

I just learned the intricacies of DST yesterday. I have never played with it on NVIDIA hw. My knowledge is based on what DX tells me, and that was simply the ability to query, set the texture format and away we go. I had no idea how NVIDIA hw implements this. In a big way, I blame MS for this obscurity (and there are other obscurities and non-defined stuff in DX).

But on to the main topic -- 3DMark is not a game. It is a benchmark meant to show how different hardware performs on standard, platform-wide API features that the benchmark author can control.

While FM can implement features that FM thinks will be "standard practice" in games, FM needs to ensure that everything they implement in every 3DMark favours no IHV or specific hw without having obscurities in the API influencing their decisions.

Taking another parallel -- depth bounds checking is good, no arguments. But should it be implemented as a DEFAULT, thereby penalizing hw that does not support it?

I think FM has made the wrong decision wrt this particular DEFAULT setting (DST). Sure, we can disable DST (and this is in fact what you recommend for better hw comparisons), but the basics are wrong. The ORB itself will be tainted, IMO.

I reiterate: 3DMark is not a game. If FM wants to make a game, make a game. If FM wants to make a benchmark, the considerations are a whole lot different than those associated with making games. (end of note)


The point is that if this feature is basically an IHV-specific extension (I believe there's an NVIDIA-specific extension for it in OpenGL), then it should NOT be enabled by default. Dave and I have no argument with it actually being used; we just have issues with the "core values" (Dave's words) of Futuremark and what they constitute, simply by virtue of Futuremark deeming that games would enable this too by default -- 3DMark CANNOT use IHV-specific extensions/features/whatnots. It Is Not A Game.


let me also say that we are not being duped by nVidia this time around; we are being duped by Futuremark "again".

mica
 
I am not quite sure I understand why people are complaining. This benchmark is designed to run with an option because Futuremark put something in that they deem will reflect the future of gaming. Nvidia happens to have something going for them by adding extra features, and since last time I checked an X800 (with hotfix) is the leader in 05, this is a somewhat moot point.

just my .02
 
LOL, who is complaining about this? ATI users that are on top? Do they think they should be a lot higher? Heh, didn't we just see an X800 XT PE beat two 6800s in SLI mode =) Or nVidia users (who already hate their scores, jk guys)? I could see people complaining if the 6800s were ruling the land, but that doesn't seem to be the case.
 
speed bump said:
I am not quite sure I understand why people are complaining. This benchmark is designed to run with an option because Futuremark put something in that they deem will reflect the future of gaming. Nvidia happens to have something going for them by adding extra features, and since last time I checked an X800 (with hotfix) is the leader in 05, this is a somewhat moot point.

just my .02

the points are: is this a DX9.0c benchmark, or is it a gaming benchmark?
and if it's a gaming benchmark, why no 3Dc?
and why are two different IQ results now considered OK?
who gets better framerates has nothing to do with this.
Futuremark needs to reply to this, and quick.

the Hexus article says it all....

When you also understand that Futuremark declined to put 3Dc support into 3DMark05, something that'd help reduce vertex load on a vertex-fetch-limited benchmark, for the precise reason that it's not a part of DirectX, is galling. Neither is NVIDIA's DST+PCF method of PSM acceleration, so why's that in there?

3Dc is a ratified part of WGF 1.0. It'll at least be in DirectX at some point in the future. The same can't be said of the NVIDIA acceleration of PSMs in 3DMark05.
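(worth adding on the 3Dc point: as far as I know, 3Dc isn't even a named format in the DX9 headers; an app has to probe for it as a FourCC format, roughly like the sketch below. this is just my own illustration of how ATI exposes it, nothing taken from 3DMark05.)

Code:
// Hypothetical sketch: probing for ATI's 3Dc normal-map compression
// under D3D9. It is exposed as the FourCC format 'ATI2', not as a
// core DirectX format enum. The constant name below is defined here
// for clarity; it is not part of the DX9 headers.
#include <d3d9.h>

const D3DFORMAT D3DFMT_ATI2 = (D3DFORMAT)MAKEFOURCC('A', 'T', 'I', '2');

bool Supports3Dc(IDirect3D9* d3d)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,  // current display mode format (assumed)
        0,                // plain texture usage
        D3DRTYPE_TEXTURE,
        D3DFMT_ATI2));    // the 3Dc two-channel compressed format
}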


mica
 
Kah! I know no one cares about the older cards, but nVidia cards (59xx) that can match ATI (9800 Pro) in 3DMark03, or lose by a few hundred points, are getting blown awaaayyy....1200 marks to 3000 marks, even beaten badly by 9500 Pros and crappy 128-bit ATI cards!....I know this is about the X800 vs 6800s, but so far there's like 1 nVidia card in the top 20....so I don't see the big deal here; if you ask me, so far 3DMark05 is crap and doesn't prove a damn thing.
 
Drec said:
Kah! I know no one cares about the older cards, but nVidia cards (59xx) that can match ATI (9800 Pro) in 3DMark03, or lose by a few hundred points, are getting blown awaaayyy....1200 marks to 3000 marks, even beaten badly by 9500 Pros and crappy 128-bit ATI cards!....I know this is about the X800 vs 6800s, but so far there's like 1 nVidia card in the top 20....so I don't see the big deal here; if you ask me, so far 3DMark05 is crap and doesn't prove a damn thing.

if 3DMark05 doesn't prove that the 5900s can't do SM2.0 well, as I've been saying for about 1 1/2 years, then I don't know what else (that is out now) does.

but this is highly off topic, as this thread is all about IQ comparisons, and how we now can't do an "apples to apples" comparison across all cards....AGAIN!!!

thanks for nothing, Futuremark.

mica
 
||Console|| said:
LOL, who is complaining about this? ATI users that are on top? Do they think they should be a lot higher? Heh, didn't we just see an X800 XT PE beat two 6800s in SLI mode =) Or nVidia users (who already hate their scores, jk guys)? I could see people complaining if the 6800s were ruling the land, but that doesn't seem to be the case.

It's not about who is on top -- it's about *how* you get that score. It's like giving nVidia some extra things it is optimized to do for some extra points, whilst leaving ATi high and dry to render every bit the hard way.

It's like giving someone a 10m head start in the 100m sprint -- but ATi have it in the bag again and are running better even with nVidia 10m ahead!

They got (very publicly) caught before, and now that we have people who fully understand this pulling it apart, we find similar stuff, just buried deeper so it's harder to spot.

If I hadn't already settled my confusion over getting an X800 or a 6800, this just put the final nail in the coffin. PS3.0 might be the 6800's saving grace, but ATi didn't include it, so trusting ATi (I have no reason not to) I don't think PS3.0 will matter for a while, and if it is going to arrive in a year's time, expect ATi's new cards to have the FULL WORKING instruction set.

Sorry nVidia, your GF4 did me well, but this is the second generation you are behind, and instead of correcting your mistakes you are just being deceitful.

And don't start about Far Cry using PS3.0 -- who do you think funded the re-scripting to PS3.0?

Sorry about the rant, it's off the topic of the IQ stuff, but I don't really understand that :p It's on the topic of 05 being dodgy... again

~t0m
 
Yeah mica, I must say that I'm quite pissed off about this shader issue. It gets even worse on "Performance".

Bear with me, I'll get a screenie of it.
 
First round. Also Mica, before you bash them, I want your pix at the same location; shouldn't be THAT hard to do.

(NO AA, NO ANISO, Quality Settings, with hotfix)
Unacceptable1.jpg

unacceptable2.jpg

unacceptable3.jpg
 
Sentential said:
OK, Mica...let's see yours. Then we will see if there are IQ issues beyond that shadow bug.

(copied and pasted from my investigation of FSAA on the 6800 cards in the nVidia section.)


I'm not quite sure how your IQ pics show that the 6800 cards do FSAA properly...???
after all, I'm still seeing some parts of your pics (the 2xAA and 4xAA) where the AA is doing a great job, and some parts where it is doing a poor job.

I also understand that the images are not of the same frame, so I'm not surprised that you are getting less AA on the boxes at the front of the image in the 4xAA pic than in the 2xAA pic.
(but as funny as it sounds, your 2xAA pic has better AA than your 4xAA pic in some parts...take another look, it's a night and day difference.)

but I am still seeing parts of each image (the 2xAA and the 4xAA) that are better AAed than the rest. (now that is not a night and day diff, and I understand that most won't notice)

here is an IQ comparison of AA on both the X800 and 6800 cards.
the top is ATI, the bottom is nVidia.


this pic is at the default benchmark settings, 1024x768, no AA or AF
IQ-returntoproxycon-default.png


note the lack of detail in the textures of the nVidia image behind the bots, and on the bridge to the door.
it looks like your pic on another thread.


1024x768, 4xAA and 8xAF
IQ-returntoproxycon-large.png


note a few things....
first, the spots that had some texture loss in the above pic are now at full detail in the nVidia pic.

next, let's look at 3 places on the image for FSAA detail:

1st, the two guardrails on the bridge....
note how the guardrail closest to us is AAed quite nicely, yet the far one is NOT in the nVidia pic.
starting at this point (the far guardrail) is where what I call "AA fall-off" begins.

2nd, look just above the guardrail next to the doorway and you'll see the number "2" on the wall (behind the pipes).
the lines in the wall are also not AAed as well as in the ATI image.

3rd, the beam that holds up the bottom of the vat that falls in the demo and benchmark.
it too is not AAed as well as the rest of the nVidia pic.

(I wish the examples showed even more of the bottom of the tests, since you will see that in some games or benchmarks this seems to be a "near to far" problem.
yet as noted by people who have the cards....
some games are more noticeable than others, and some only seem to be a problem at certain angles.)

let me be clear, I AM NOT BASHING NVIDIA AT THIS TIME FOR ANY LOSS OF TEXTURE IQ OR SHADOWS, AS IT IS FUTUREMARK'S CHOICE TO ADD VENDOR-SPECIFIC OPTIMIZATIONS THAT HAVE NOTHING TO DO WITH DX9.0c.
(yet if you look at the ATI shadows on the door, they are much softer than the nVidia ones...DST, you gotta love it. :/ )

but while you have tried to show how nVidia does FSAA, in 3DMark05 at least, I see things quite differently.
the ATI pics, taken at the same frame, show FSAA in all parts of the image...
I can't say the same for nVidia.

mica
 
I think ATI is doing this "jaggy shadowing" too, because every one of my games on my 9700 Pro has jaggy shadows. This never started until the past, I dunno, 5-7 driver releases.
 
yes, I just reran 3DMark05, and I can confirm the bench with the flying ship has the zigzag shadows as well. I didn't grab a screen, but at the same spot Sent posted, I had the same thing.

about the AA, I couldn't tell you. I ain't turning that on; it would take me all night to finish all the tests. ;)
 
Mica, just post your screenshots of the same pics as Sentential already. You're beating around the bush on this issue more than John Kerry. :clap:
 