
and the IQ is different again on 3dmark05

JeffP said:
mica I kept it nice and simple. Is the ATI camp going to cry and ***** every time a new game is released that has DTS or OpenGL or whatever other unique features that enhance the experience for Nvidia owners and not ATI? I just don't see nearly as much complaining from the Nvidia guys, maybe that's because they are too busy actually using their cards in games. :bday:

ok, no one is crying about any game or games that use DTS in any way.
farcry seems to use it, and I couldn't care less.
now Doom 3 seems to have tons of optimizations for nVidia cards and basically none for ATI.
again, I don't care.
I enjoyed both games, and clearly so did others.



I think people get way too wrapped up in the benchmark results and just look for something wrong. Who cares, really? Do you enjoy your ATI and the gameplay it offers? If so, what's the fuss. :beer:

the fuss is that we are not talking about a game.


3dMark is just one of many silly synthetic benchmarks. All of them, in one way or another, are going to be biased or tweaked for some particular hardware.

and that's the point, 3dmark was never a game benchmark, it was a DX benchmark....and IQ was always, in most ways, comparable from card to card.


am I making sense or what?

mica
 
micamica1217 said:
when you're going to post facts about this test, please include ALL the FACTS.

like how there is not one driver from any IHV at this time that is official for anything but the default tests.
how about that all the test pics you show are using the app and not the CP of the cards....

heh my post clearly says "Application AF" right next to the pics, it's even in your quote of my post

btw, when using the CP for ATI cards, all banding and IQ problems are gone.

And performance drops by about 10% with the 8.07 drivers using CP-AF. Therefore, in order to maintain IQ parity in terms of AF quality with Nvidia, it must be considered that scores run using Application AF (most) are inflated by about 10%.
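(A rough sketch of that adjustment, with made-up numbers rather than anything from actual runs: take an Application-AF score and deflate it by the claimed ~10% overhead to estimate its CP-AF equivalent. The helper name and the sample score are hypothetical.)

Code:
# hypothetical helper: deflate a 3DMark05 score run with Application AF by the
# ~10% CP-AF overhead claimed above, to estimate an IQ-parity score
def cp_af_equivalent(app_af_score, cp_af_overhead=0.10):
    return app_af_score * (1.0 - cp_af_overhead)

print(cp_af_equivalent(5000))  # 4500.0 -- a made-up score adjusted down by 10%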

then why don't you talk about how even Microsoft says you will never get an image the same as the reference rasterizer on any card in all respects...it's just not possible with all the filtering going on in the cards anyway.

Microsoft actually states that NV40's filtering is superior to refrast, to quote:
Microsoft said:
The DX9 reference rasterizer does not produce an ideal result for level-of-detail computation for isotropic filtering, the algorithm used in NV40 produces a higher quality result. Remember, our API is constantly evolving as is graphics hardware, we will continue to improve all aspects of our API including the reference rasterizer.

then there are your thoughts on DTS....

Man will you stop saying DTS? DTS is a home theater sound format. DST is Nvidia's shadow technology.

DTS = Digital Theater Systems
DST = Depth Stencil Textures

And my thoughts on DTS are that it kicks Dolby Digital's butt :)

since DTS is not now, and never will be, part of DX9.0 in any way, shape, or form, because it is a vendor-specific extension and not part of SM2.0 or SM3.0

We could say the same about 3Dc, as nothing official has been announced for it for the next DirectX and it won't be incorporated until DX10 at least. Like 3Dc, it's currently under discussion whether DST will make it into the next DirectX. Futuremark claims they use PCF+DST because it is a part of DirectX for Microsoft's XBOX. Anyway, DST can be disabled.

so in turn, because of DTS, 3dmark05 is no longer a DX9.0 benchmark, and is now a game benchmark that does not use any game engine....lame.

I also question the usefulness of 3dmark05, because I think it is a technologically outdated benchmark. I snipped the rest of your post because I am also disappointed with 3dmark05. It claims to test SM3.0, and fails, not using most of the featureset that would increase performance - heck it doesn't even use geometry instancing, nevermind the longer instruction lengths or dynamic branching. Heck it doesn't even take advantage of SM2.0B. It doesn't use OpenEXR HDR Lighting that current Nvidia and future ATI cards will use.
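(For anyone unsure what geometry instancing buys you: it collapses many per-object draw calls into one call with per-instance data, which is mostly a CPU/driver-overhead win. A toy sketch of the idea only; FakeGPU and its methods are made-up stand-ins, not a real graphics API.)

Code:
class FakeGPU:
    def __init__(self):
        self.draw_calls = 0

    def draw(self, mesh, transform):
        self.draw_calls += 1            # every call costs CPU/driver time

    def draw_instanced(self, mesh, transforms):
        self.draw_calls += 1            # one call; the GPU walks the instance data

gpu = FakeGPU()
trees = ["transform_%d" % i for i in range(1000)]

for t in trees:                          # without instancing: 1000 separate draw calls
    gpu.draw("tree_mesh", t)

gpu.draw_instanced("tree_mesh", trees)   # SM3.0-style instancing: 1 draw call for all trees
print(gpu.draw_calls)                    # 1001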

When you are arguing about this, think about what you are arguing for. You are complaining that it uses DST, but what you should really be complaining about is that it doesn't use *enough* of the new features. It doesn't truly use SM2.0B (dx9.0c), it doesn't truly use SM3.0 (dx9.0c), it doesn't use OpenEXR HDR (dx9.0c), it doesn't use 3Dc (nonstandard atm), it doesn't use virtually any technology that was put on a GPU this year. If you want to complain about anything, complain that 3dmark05 measures the potential performance of last year's cards, not this year's cards. 3dmark05 is a good benchmark for the FX 5950 versus the 9800XT, but fails as a useful benchmark for the 6800 Ultra versus the X800XT as it lacks the shader technology to fully take advantage of either of the latter cards. It isn't even a useful game benchmark, again unless you are measuring last year's games, as this year's most popular DX9 games (i.e. FarCry, Half-Life 2, Painkiller: Battle out of Hell, LOTR: Battle for Middle-earth) and next year's most promising games (i.e. Unreal 3) are using technology that 3dmark05 simply does not incorporate. That should really be the issue - that it fails to predict potential performance for this year's hardware or even this year's most popular DX9 games due to its outdated shader code.

BTW, WELCOME TO THE FORUMS. :D

mica

thx ;)
 
Sorry to step in but bashing ATI fanboys opened the war path...

tranCendenZ said:
Microsoft actually states that NV40's filtering is superior to refrast, to quote:

Superior to what? What facts did they show to back it up? Unless you post them it sounds like a mere Microsoft -> Nvidia buttkiss
tranCendenZ said:
We could say the same about 3Dc, as nothing official has been announced for it for the next DirectX and it won't be incorporated until DX10 at least. Like 3Dc, it's currently under discussion whether DST will make it into the next DirectX. Anyway, DST can be disabled.

I perfectly agree with you, but mica already told you the fact is it's not DX9 so IT MUST NOT BE USED, same as happened to 3Dc. At least it should be OFF BY DEFAULT. Tell me, why is there that distinction???
what do you have to say about this?

I smell nothing but big $$, same as the patch in 3dmark03 that allowed wide open quality-raped nvidia drivers.

tranCendenZ said:
Futuremark claims they use PCF+DST because it is a part of DirectX for Microsoft's XBOX.

Very true and very questionable, it's part of the XBOX, THIS IS COMPUTER GAMING, period
 
Isn't ATI making the new Xbox graphics??? Couldn't ATI use DST in the future because of this?
 
tranCendenZ said:
heh my post clearly says "Application AF" right next to the pics, it's even in your quote of my post



<snip>

OK, see, now I know what you're truly getting at, and pardon me folks, as this is going to get really off topic here.

before I start, I'd like to thank you for correcting my typo on DST (DTS...lol).
I was speaking about audio all last night, and it was just funny to see your reply and realize what a silly mistake I'd made.

anyway, let's start at the top.
at 4xAA and 4xAF, I get 4614 marks with the app.
I get 4762 marks with the CP.
this is more like a 3% difference, not 10%, and did you notice just when I got this boost in scores?
I got the boost in scores when running the CP filter settings.
I repeated both tests to make sure all is good.
now apple740 also tested this, and did get a performance decrease using the control panel, but it was closer to 5%.
because I'm sooo OCed in all my parts, I'm quite sure that others will get different results than both of us, too.
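(just to put numbers on it, the quick arithmetic on those two scores looks like this)

Code:
# percentage gap between the two reported 3DMark05 runs above
app_af = 4614   # score with AF set in the application
cp_af  = 4762   # score with AF forced in the driver control panel (CP)

delta = (cp_af - app_af) / app_af * 100
print("%.1f%%" % delta)   # ~3.2%, and in the CP's favour -- nowhere near 10%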

I'm kind of wondering just WHO you are listening to?
and that brings me to the whole "this is not a current card test" claim of yours.
(what? because it doesn't use all SM2.0 and SM3.0 it's not current?)

first off, while SM3.0 might have been added at the last moment, it looks more like it was done, or redone, with SM3.0 in mind...
and that's just my thoughts, and no one else's.

but really, what are your thoughts on SM3.0?
do you think that you'll always get a performance boost when using SM3.0 features such as dynamic branching and geometry instancing?
and your thoughts on HDRL? do you think that you'll always get a performance boost along with that increased IQ?

most SM3.0 is not, I repeat, not, a performance enhancement over SM2.0.
and at times, you have to be real careful as to just when to use SM3.0-type rendering, because it could be a hindrance to framerate.

Farcry doesn't use dynamic branching, it uses static branching. why?
you might want to understand that, unlike most of the FUD you see each day in forums all around the world, SM3.0 can hurt the 6800 cards more than it helps them.
that's a fact.
there will be times when SM3.0 will help in both IQ and performance.
but if it's not used in the wisest of ways, it could drop framerates into the unplayable range.
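(a back-of-the-envelope way to see why, and this is just my own sketch of the idea, not anything from an IHV: pixels get shaded in SIMD batches, and a batch that straddles both sides of a dynamic branch ends up paying for both paths, so the branch only saves time when neighbouring pixels agree)

Code:
# toy cost model: a dynamic branch that skips expensive lighting for pixels in shadow
def batch_cost(in_shadow, full_cost=40, cheap_cost=5):
    if all(in_shadow):
        return cheap_cost                 # coherent batch: everyone skips the lighting math
    if not any(in_shadow):
        return full_cost                  # coherent batch: everyone runs the full shader
    return full_cost + cheap_cost         # divergent batch: both sides get executed

print(batch_cost([True] * 16))            # 5  -- big win on a fully shadowed batch
print(batch_cost([True, False] * 8))      # 45 -- a shadow edge makes it slower than no branch (40)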

since many people don't clearly understand just how poorly SM3.0 will perform on the latest cards, I advise that you do a search on my name and look for a thread on SM3.0 and Farcry, to see why adding more of the latest tech, just for the sake of adding it, could HURT your results in both games and Futuremark's 3dmark05.

adding some parts or features to Futuremark's 3dmark05, when it does not benefit from them in any way, is not what this benchmark needs, nor what any game, including Farcry, should do.....IMHO.

maybe in a year or two, both ATI and nVidia will have cards fast enough to really use SM3.0 like SM2.0.
and then there are games, that may use only tiny amounts of SM3.0 in the next 3 years.
don't expect a game to truly use large amounts of SM3.0 till 2006-2007.
(and even then it will be only used when it helps, not hurts older cards.)
even games such as EQII will only use small amounts of SM3.0, just what makes you think SM3.0 is a "be all, end all" in framerate performance as compared to SM2.0???

it is clear that the NV40s are hindered by longer shader instructions.
if, at times, SM3.0 could have longer instructions, how could it be quicker than SM2.0?

now I know we have gone really off topic here, and you might not like 3dmark05 for different reasons than me.
yet I'd still strongly suggest that you start a new thread, instead of telling me I should argue in the way you want me to.
after all, again, that is not the point of this thread.

I do want to thank you for your time and thoughts.

(oh, and one more thing. no, you can't turn "OFF" DST in the default test, so it is really an issue that needs to be discussed and fought to correct in this benchmark, IMHO.
I was hoping that we could be able to do an "apples to apples" comparison with this test, and it seems that we might not be able to....that's sad, if you ask me.)

mica
 
micamica1217 said:
maybe in a year or two, both ATI and nVidia will have cards fast enough to really use SM3.0 like SM2.0.
and then there are games, that may use only tiny amounts of SM3.0 in the next 3 years.
don't expect a game to truly use large amounts of SM3.0 till 2006-2007.
(and even then it will be only used when it helps, not hurts older cards.)
even games such as EQII will only use small amounts of SM3.0, just what makes you think SM3.0 is a "be all, end all" in framerate performance as compared to SM2.0???

i agree with mica completely... just look at ati r9600 and r9700, they do support dx9 but not fast enough for 1280x960 or so... i dont want to talk about fxs ROFL

which comes down to: buy for what you can use now... not future features that, when truly implemented, will make your hardware perform and feel old
 
PhobMX said:
... just look at ati r9600 and r9700, they do support dx9 but not fast enough for 1280x960 or so... i dont want to talk about fxs ROFL

which comes down to: buy for what you can use now... not future features that, when truly implemented, will make your hardware perform and feel old

well said, and was my point exactly.


OT: btw, this was the link I was talking about with Farcry and SM3.0.

http://www.ocforums.com/showthread.php?t=309362&page=1&pp=30

mica
 
tranCendenZ .................. I just want to mention again that the lack of SM3.0 features which Nvidia supports and which are missing in 3DMark 2005 is not relevant to this thread. There are 3 reasons:

1/ The already stated initial reason for this thread ........... Why is a non-DX9 NV feature (DST) allowed in 3DMark 2005, when 3Dc was blocked for being non-DX9?

2/ Several of NV's SM3.0 features cannot be reasonably used with their current hardware, and NV staff have said the same. In fact, when standard DX9 code is used their newer cards still lose to ATI, even when NV is doing fewer passes with PS3.0. Do you really believe that the architecture will suddenly do much better when more is asked of it??

3/ Futuremark has always built a benchmark based on what developers and hardware vendors tell them will be used in the next 18 months or so. I'm sorry, advanced SM3.0 just is not on the cards (no pun) due to the installed hardware base. Thus look at the history and see, as I said here:

"Is that not the way of all 3DMark versions??? For example 2001SE has more DX7 than DX8 by far, and 2003 is mainly DX8 and even includes a DX7 game test but hardly any DX9. And therefore it is no small wonder that this new 3DMark is mainly plain SM2.0 and very little 2.0b or 3.0."

So expecting advanced SM3.0 in ver 2005 is not realistic.
 
so you're saying PS3.0 and SM3.0 don't matter right now, and also the PS3.0 the current 6800gt cards have is "premature" and lacking some parts, so to speak. In this case I could view the x800pro and 6800gt as equal, with the 6800gt faster in Doom 3 and the x800pro ahead in HL2 (if it ever comes) and Farcry
 
I think that's pretty much what most reasonable people have stated all along. The X800 and the 6800 cards both have strong and weak points and will perform differently because, well quite frankly, they are different products.
 
exactly. you can't lose either way if you choose a 6800gt vs. x800pro. some people happen to like what the 6800gt has, others the x800pro, and I've given the x800pro serious thought. I might buy one of those for Xmas in 2005 when prices drop to $200 and when I have a Winchester at like 3.5GHz; if x800pros are still over $200 I might take the x800se then
 
tranCendenZ, thank you very much. Mica and the fanboys often get on their pulpit and resort to the IQ issue. I was unaware of the things you mentioned until now. Thank you very much.

Sorry to step in but bashing ATI fanboys opened the war path...
.....Well put :mad: . No wonder this has turned into a nV bash fest.

i agree with mica completely... just look at ati r9600 and r9700, they do support dx9 but not fast enough for 1280x960 or so... i dont want to talk about fxs ROFL
Just keep digging....

________________

Like I said before. The 6800 series is a fantastic grafix card. DESPITE what Mica and others say.

A lot of its issues are driver bugs and will get ironed out. Finally, I'd like to once again thank tranCendenZ for giving some facts to help balance out the ATi fanboys.

In short, ATi is no better than nV, so stop with the bashing. They both cheat for their own ends. Get over it...nV has just taken it a step further :D

___________________

I'd also like to agree that 3d05 is a **** poor benchmark. It looks inferior and reminds me too much of 3d01 with DX9 effects. I'm actually quite disappointed by this bench and agree with [H]ard|OCP's stance on it.
__________________________

Finally, I'm not posting in this thread anymore, because I'm sick and tired of hearing how my 6800GT sux and how ATi is god. nV has issues, but the info being thrown around here is so horribly biased that it's no longer worth my time
 
Sentential ............. everyone is entitled to his or her own view; I certainly don't think that the GT is a bad card at all. I don't think anyone said so, at least not in this thread. I'm not 100% sure who in the thread is being referred to as being biased, and what info in particular is considered by you to be biased. What I really don't like is the fact that a member made a thread to address or discuss a particular issue, and some interesting but somewhat irrelevant info has now been posted and has derailed the thread. It has gone from a "Futuremark dropped the ball" thread into an ATI vs Nvidia thread. Exactly what we have too much of.

Why must someone think that the way to answer one issue is to attack on a totally different one? Why must there always be this rush to even out a scale in an ATI vs NV argument? Certainly I could give link after link of the same banding described above in the GT. There are many posts all over the web complaining of texture crawling and shimmering with the NV4x, but it is just not relevant. Members keep jumping into threads to defend their preferred company, but then fail to answer the issue by using irrelevancies.

Sigh.
 
I thought that 3Dmark was proven to be irrelevant as a GPU gauge with '01....
I play with it to see how high a score I can get comparing my non-OCed speeds to my OCs to see what happens, and to enjoy the eye candy. :)

There also remains the fact that Mica refuses to post his screenshots so that we can all directly compare his to Sent's. I don't dispute the facts between the cards, but some of us would love to see back-to-back screenies is all.
 
Sentential said:
tranCendenZ, thank you very much. Mica and the fanboys often get on their pulpit and resort to the IQ issue. I was unaware of the things you mentioned until now. Thank you very much.


<snip>


hmmmmm.....
let's see what I said in some of my replies:

post #1

micamica1217 said:
let me also say that we are not being duped by nVidia this time around, we are being duped by futuremark "again".

post #7

micamica1217 said:
as this thread is all about IQ comparisons, and how we now can't do an "apples to apples" across all cards....AGAIN!!!

thanks futuremark for nothing.

post #17...(my reply to you)

micamica1217 said:
let me be clear, I AM NOT BASHING NVIDIA AT THIS TIME FOR ANY LOSS OF TEXTURE IQ OR SHADOWS, AS IT IS FUTUREMARK'S CHOICE TO ADD VENDOR-SPECIFIC OPTIMISATIONS THAT HAVE NOTHING TO DO WITH DX9.0c.
lol...I can't believe I kept saying DTS instead of DST. lol

with all the posts that are listed by me (I will not speak for anyone else in this thread), I have clearly stated that nVidia is not at fault in this topic of discussion.

I've also stated that ATI will have some jaggies too, since we are testing at 1024x768 with no AA.....so it's not like there are no jaggies in these tests.
if you're mad that I've brought this up, why did you not say so earlier?

what changed from a friendly testing and discussion, to a rant on how I'm some fanboy standing on a pulpit trying to bash nVidia?

I've said it before, and I'll say it again....with the free version of 3dmark05, we now seem to be unable to do an "apples to apples" comparison, and it is clearly Futuremark's fault.
agree or disagree, it's all good by me.

I hope this clears the air.

mica
 
sentential, thought you wouldn't post anymore in this thread... now that you are back I'd like to quote my statements you didn't mention...

PhobMX said:
Superior to what? What facts did they show to back it up? Unless you post them it sounds like a mere Microsoft -> Nvidia buttkiss


I perfectly agree with you, but mica already told you the fact is it's not DX9 so IT MUST NOT BE USED, same as happened to 3Dc. At least it should be OFF BY DEFAULT. Tell me, why is there that distinction???
what do you have to say about this?

I smell nothing but big $$, same as the patch in 3dmark03 that allowed wide open quality-raped nvidia drivers.

Very true and very questionable, it's part of the XBOX, THIS IS COMPUTER GAMING, period

very clever of you picking the worst parts of my posts, but OK, I apologize about the fanboy stuff, but as all the above points say, I'm speaking about 3dmark05 here. I was just a little angry that tranCendenZ fails to see the problem here:

Most people, nvidiots and fanatics alike, just run the benchmark; they won't disable DST, thus the benchmark is unbalanced. It should be off by default

and about this:

PhobMX said:
i agree with mica completely... just look at ati r9600 and r9700, they do support dx9 but not fast enough for 1280x960 or so... i dont want to talk about fxs ROFL

I was bashing BOTH the R9xxxs and the FXs... but what I meant is that people who bought those cards to use DX9 for 2 years (gotta admit I was lured by the same reason) are the kind of people who look at "features" that you can't even use to their full extent and praise one product over another because of them (AKA fanboyism).

This is a problem with the ATI vs. nVidia wars: the features of a product don't prove it better. This is all about performance, and if you use newer features on a product that lacks the horsepower to run them then the point is moot... would you quote me here this time pls? ;) I'm cool dude, are you?
 
PhobMX said:
This is a problem with the ATI vs. nVidia wars: the features of a product don't prove it better. This is all about performance, and if you use newer features on a product that lacks the horsepower to run them then the point is moot... would you quote me here this time pls? ;) I'm cool dude, are you?

That statement is so very untrue it is not even funny. What makes you think that video cards aren't about the features they offer, as well? Let's take the Athlon 64 for example. The fact that it incorporates an on-die memory controller eliminates many performance issues that were once seen with their 32-bit lineup (say on NF2 vs. VIA motherboards). As a result, the 64-bit arena is not so much about performance as it is about features. Each and every Athlon 64 either possesses or lacks a number of features targeted at specific or general audiences, because the performance differences are negligible. Performance is important, but it is indeed not everything.

In terms of this entire thread, it is my opinion that such benchmark inconsistencies are only further proof that we as a community simply need to let go of such base forms of measurement. To be fair, I must admit that I run certain benchmarks from time to time, but it is usually to ensure that my computer is running optimally, especially in the midst of a newer driver release/uninstall. For the most part, however, we need not scrutinize every little issue here and there. The fact of the matter is that the recent ATI/NVIDIA lineups are much more competitive than the last release and, therefore, there is really no way someone can go wrong.

Besides, Futuremark has never been a good indicator of actual game performance, which in reality is what matters to most of us. In the end, FM can choose to include or exclude certain features of each card because that is their right as a software maker. All we can do is inform the public and decide whether or not to support such a product. The big problem I have with some of these threads is that they almost always turn out to lean toward some form of fanboyism EVEN WHEN THAT IS NOT THEIR ORIGINAL INTENT. I understand that this is a very hard thing to prevent, but I do feel that there are SOME (not all) individuals that would never be willing to accept a legitimate IQ advantage on NVIDIA's behalf (I am not currently stating that one exists, just simply making a comment). Conversely, there might be a time in which ATI will be in the same position. Neither aspect is right, and I think we should all concentrate a little less on FM benchmarks and a little more on games and anything else you run with your card.

deception``
 
deception`` said:
What makes you think that video cards aren't about the features they offer, as well?

Athlon64 has both features and performance. I said that the 9600s and 9700s getting down on their knees when you use true DX9 features, because of the lack of horsepower, is a case of features with no performance. Maybe I wasn't clear.
The same goes for the 6x00s and SM3.0: when games really incorporate SM3.0 a couple of years from now (at the very best), they will lack the horsepower needed to enjoy them.
So again this is not an ATI vs Nvidia statement, it's just a fanboy vs objective consumer statement.

And why is this not a thread hijack?

Because when you enable by default a non-standard feature you stop comparing "apples to apples" and the objective consumer gets a hard time, while the fanboy gets even more juice to cry out loud "my brand ownz j00".

I agree that a little research might point objective 6x00 owners to disable DST, but I would bet all my hardware that not even 1 out of 5 will ever do it. Sad but true :(
 
deception``,

I just read your last reply, and have to say, that I agree with you on many points.

I was even going to add to this thread the minor SM3.0-type IQ differences as compared to SM2.0, using a 6800u as a source.
most of the differences were with the water in the last test.
they were really minor, but they were there.

this would also give the x800s a lower IQ as well, since by default it was the only option.

since I feel that Futuremark should have the right to include such differences (SM2.0 vs SM3.0) in its game test, I really didn't include it here in this thread.

reading many of the (private) replies from Futuremark on this subject, I've never seen anything that suggests a statement about the Xbox.
only statements saying that other games that will come out will use DST, so they thought it would be fine to use it.

again, my biggest problems with this are:
DST doesn't give the same quality output shadows that turning off DST would give in the 6800 cards.
to include it would be fine as an option, to show the possible performance increase with its use, but it should be off by default.
(I would say the same for any ATI feature such as 3Dc)

could you imagine if 3Dc was used by default, showing slightly better IQ textures in all tests, and possibly running even faster?
now that would really confuse the joe-sixpack gamers even more.
such as in this thread....http://discuss.futuremark.com/forum...umber=4461141&page=&view=&sb=&o=&fpart=1&vc=1
clearly I could go on as to why this person's testing methods are soooo wrong, but that would really be way off topic here.
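(for context on what 3Dc actually does, and where the "slightly better IQ textures" would come from: it compresses just the X and Y channels of a tangent-space normal map and lets the shader rebuild Z. a minimal sketch of that reconstruction step, with made-up sample values)

Code:
import math

def reconstruct_normal(x, y):
    # rebuild Z from a 3Dc-style two-channel normal; the clamp guards against rounding error
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)

print(reconstruct_normal(0.30, -0.40))   # (0.3, -0.4, ~0.866)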

to be honest, if you like the fact that Futuremark went from a stance of testing hardware using DX only, to testing hardware using DX as well as some vendor-specific optimisations....then that's cool with me.

by the time the next 3dmark comes out I might have my next nVidia card in my computer....and it will most likely have crazy amounts of ATI optimisations in it. lol
then when I talk about the IQ differences, I'll be called an nVidia fanboy. lol

remember that all I'm saying here in this thread is that I'm deeply shocked at Futuremark's latest stance on how their software will test.
(and that they never used to think this way.)

it's all good, even if I don't like it.
after all, I only use this type of benchmark like you do, and just tweak my system with it.

mica
 
PhobMX said:
sentential, thought you wouldn't post anymore in this thread... now that you are back I'd like to quote my statements you didn't mention...

... would you quote me here this time pls? ;) I'm cool dude, are you?
Yea, I'm fine, lol :D. I'm just really touchy when it comes to nV bashing. Especially since I was a dumbass who bought a 5200FXu...So that's where it comes from.

I'd also like to say that I had the option of getting an x800XT PE for $479 :bang head (which I didn't take, instead getting a 6800GT for $354). Sure I saved $100, but now I'm beginning to wonder if that was the right choice.

So yea, I am pissed... at you guys? No. Myself (again)... yes, as usual :temper: :cry:

But forgive my ranting and raving. I tend to take things way too personally :bday: It still irritates me, cuz somehow in the back of my mind I can hear the sound of me getting shafted by nV once again.

As much as I liked ATi, their stability has been a real problem for me. I'm not exactly sure what I'm gonna do at this point. If nForce4 doesn't have AGP, that will be the final nail in the coffin and my permanent ban of nV products from my rig....

But yea, I apologize bro. It's not right for me to take out my anger on other people. I'll definitely keep this in mind from now on. Besides, I can always vent frustration in games, instead of against my OCing brethren.

- Sen

______

P.S.

yes I agree, the FX series sux. And if I find the mother****er who designed the 5200FXU I'm gonna give him a swift kick in the groin, so that we keep the gene pool clean of people such as him. May he lead an interesting life....
 