
Farcry PS3.0 test from nVidia in London

http://www.xbitlabs.com/articles/video/display/farcry30.html

Good lord, this is getting so ridiculous. Now X-Bit Labs is in on the bogus benchmarking, with their latest article supposedly showing SM3.0 advances in FarCry with unreleased drivers, unreleased game patches, and unreleased versions of DX.

"At this time we tested not using full-screen antialiasing and anisotropic filtering because of time issues. Once we complete the benchmarks using “eye-candy” modes, we will update our article."

No AA and no AF.....the whole article is useless from so many different perspectives....sigh.
 
Dragonprince said:
http://www.xbitlabs.com/articles/video/display/farcry30.html

Good lord, this is getting so ridiculous. Now X-Bit Labs is in on the bogus benchmarking, with their latest article supposedly showing SM3.0 advances in FarCry with unreleased drivers, unreleased game patches, and unreleased versions of DX.

"At this time we tested not using full-screen antialiasing and anisotropic filtering because of time issues. Once we complete the benchmarks using “eye-candy” modes, we will update our article."

No AA and no AF.....the whole article is useless from so many different perspectives....sigh.

It gets worse...

All the graphs on that site are wrong.

Looking at all the graphs, the X800 cards get an increase in frame rate when using the new patch in almost every test... completely different from every other site.

yet look at what they say at the end of the Conclusion:

For an unknown reason the RADEON X800-series graphics products’ performance slightly dropped in FarCry version 1.2 compared to the version 1.1. The reasons for the drop are not clear, but may indicate that ATI also has some speed headroom with its latest family of graphics processors and the final judgement is yet to be made…


Bogus is what I call this review.
With the possibility that they are also running with brilinear filtering turned on, and no AA and AF filtering tests, I call this an "nVidia-biased" web site.

mica
 
Hopefully the next patch will fix the 1.2 error, and hopefully Anandtech will not make such a mistake again.
 
Not to mention that in their tests the NV40 wasn't using AA despite the graphs saying so. With those drivers and the 1.2 patch, AA must be turned on in the game and the driver set to application preference, otherwise it will not work. That's why they and Anandtech have such phenomenal results for the NV40 with 4xAA/8xAF, while FiringSquad has the X800 XT winning whenever those options are on.
 
some more info on Farcry and SM3.0...

FiringSquad's interview with Yerli said:
FiringSquad: Are you using dynamic branching for the lighting or static branching?

Cevat Yerli: We’re using static branching. We’re using static because [with dynamic] there were a few performance problems there so we decided to use static branching ultimately to utilize the best features of 3.0. I was telling people previously at the launch event that the 3.0 shader model itself is great and all that but you have to use it [dynamic branching] wisely because it’s a very powerful subset and you can very easily get into a situation where branching is great but it will slow down your technology, and since that’s the case we’re using static branching to still get the performance up with our technology.


FiringSquad: Are there any plans to incorporate 3Dc into Far Cry?

Cevat Yerli: In fact yes. In the next patch, 1.3.

full read here...LOOK.
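
To make Yerli's static vs. dynamic branching point a bit more concrete, here is a rough CPU-side sketch in C++ (my own illustration, not Crytek's shader code): with static branching the condition is a constant known per draw call, so the unused path can effectively be stripped before any pixels are shaded, while with dynamic branching every pixel evaluates the condition at run time, which is where the performance problems he mentions come from.

Code:
#include <cstdio>

struct Pixel { float lightIntensity; };

// "Static" branch: the condition is a per-draw constant, so in practice the
// driver/compiler can pick one specialised path for the whole surface.
float ShadePixel_StaticBranch(const Pixel& p, bool useSpecularPath /* uniform constant */)
{
    if (useSpecularPath)                    // same answer for every pixel in the draw call
        return p.lightIntensity * 1.5f;     // "expensive" path
    return p.lightIntensity;                // cheap path
}

// "Dynamic" branch: the condition depends on per-pixel data, so the GPU has to
// evaluate it for every pixel (and 2004-era hardware pays a heavy penalty when
// neighbouring pixels take different paths).
float ShadePixel_DynamicBranch(const Pixel& p)
{
    if (p.lightIntensity > 0.5f)            // data-dependent, checked per pixel
        return p.lightIntensity * 1.5f;
    return p.lightIntensity;
}

int main()
{
    Pixel p{0.7f};
    std::printf("static: %.2f  dynamic: %.2f\n",
                ShadePixel_StaticBranch(p, true), ShadePixel_DynamicBranch(p));
    return 0;
}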



update from DH:
UPDATE 05/07/04

We have been in discussions with ATI/Crytek today over the issues we discovered with Radeon X800 rendering as detailed in the above article. Crytek have confirmed that the FarCry 1.2 build released today will no longer have these issues and that X800 IQ will be just as good in Patch 1.2 as it was in 1.1.

full read here....LOOK.

mica
 
PS3.0 is going to be good in the near future, with more and more games that are going to support it. I don't know why you are going to such lengths to try to discredit the technology; it is a needed step forward. If ATI was integrating it in the R420 you would be on here touting it as the end-all be-all. NVIDIA has one-upped ATI in the feature department and they are equal in speed, so to any unbiased person the choice is easy: when they are on equal footing you always go with the most features. Also HDR, which you conveniently left out of your selective posting, will not be possible for Radeon owners, and that is going to have a great visual impact. The texture compression technology is just going to be used to speed something up, like PS3.0; they are not redesigning the textures to take full benefit of it.
 
coldfusion71 said:
PS3.0 is going to be good in the near future, with more and more games that are going to support it. I don't know why you are going to such lengths to try to discredit the technology; it is a needed step forward. If ATI was integrating it in the R420 you would be on here touting it as the end-all be-all. NVIDIA has one-upped ATI in the feature department and they are equal in speed, so to any unbiased person the choice is easy: when they are on equal footing you always go with the most features. Also HDR, which you conveniently left out of your selective posting, will not be possible for Radeon owners, and that is going to have a great visual impact. The texture compression technology is just going to be used to speed something up, like PS3.0; they are not redesigning the textures to take full benefit of it.

I'm not discrediting SM3.0 at all...
I'm just saying that so far it has not shown that it does anything better than SM2.0 in FarCry when the X800 Pro/XT still uses patch 1.1 (and doesn't upgrade to patch 1.2).
In fact, almost no PS2.0 was used in the new patch... read closer to my findings.

As for HDR...
the ATI X800 cards can do HDR quite well.
To steal a quote from DB: "... Technically SM2.0 can support HDR effects, it's a point of detail that a developer is choosing not to support it."

And I'm not quite sure what you're talking about here... "The texture compression technology is just going to be used to speed something up, like PS3.0; they are not redesigning the textures to take full benefit of it."
Since 3Dc compression is an ATI-only feature, I don't see how this could help SM3.0 at all... are you talking about patch 1.3?
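
For anyone wondering what 3Dc actually is under the hood: it's a two-channel compression format aimed at normal maps, where the shader rebuilds the third component, so it's purely a storage/bandwidth saver and orthogonal to SM3.0. A rough sketch of that reconstruction step (my own illustration in C++, not ATI's code):

Code:
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Normal { float x, y, z; };

// 3Dc keeps only X and Y of a unit-length tangent-space normal; the pixel
// shader reconstructs Z, so the format saves memory without adding any
// new visual effect of its own.
Normal Decode3DcStyle(float x, float y)     // x, y in [-1, 1] from the texture
{
    float z = std::sqrt(std::max(0.0f, 1.0f - x * x - y * y));
    return {x, y, z};
}

int main()
{
    Normal n = Decode3DcStyle(0.3f, -0.2f);
    std::printf("reconstructed normal: (%.2f, %.2f, %.2f)\n", n.x, n.y, n.z);
    return 0;
}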

mica
 
^^^^ Can you not see? Their PS3.0 support is turning out to be as bad as their PS2.0 support, which was almost non-existent!


So why buy a card based on a feature that decreases performance? Why buy a product because of a feature that is hurting the card's performance, and who says it will perform well once the programs are out to use it?

Sorry, I think I would go with the card that is sticking with relatively recent technology and perfecting it, and not worry about something we won't see for some time. By the time we do, ATI will have new cards out and hopefully so will NVIDIA, and maybe with that round of cards they will have perfected PS3.0. NVIDIA decided to go with PS3.0 because they knew ATI was going to be right up there in performance, so they wanted to throw it in for marketing and for people like yourself who will buy it because it is one extra feature, even though it has no impact on current programs.

Personally I really do hope they get their PS3.0 in gear, because I want to purchase dual SLI cards with my new super system, so I don't want to be stuck with some future-proof crap feature :(
 
micamica1217 said:
I'm not discrediting SM3.0 at all...
I'm just saying that so far it has not shown that it does anything better than SM2.0 in FarCry when the X800 Pro/XT still uses patch 1.1 (and doesn't upgrade to patch 1.2).
In fact, almost no PS2.0 was used in the new patch... read closer to my findings.

As for HDR...
the ATI X800 cards can do HDR quite well.
To steal a quote from DB: "... Technically SM2.0 can support HDR effects, it's a point of detail that a developer is choosing not to support it."

And I'm not quite sure what you're talking about here... "The texture compression technology is just going to be used to speed something up, like PS3.0; they are not redesigning the textures to take full benefit of it."
Since 3Dc compression is an ATI-only feature, I don't see how this could help SM3.0 at all... are you talking about patch 1.3?

mica
I am talking about how the texture compression technology is going to give the same type of speed, not visual, enhancements as PS3.0. I find it very funny that you are touting that while trying to discredit PS3.0, when they are going to be used in Far Cry essentially the same way. NVIDIA used a higher-quality HDR which will probably be adopted by future games. NVIDIA has better features this time out, no matter how much you try to discredit them. NVIDIA implementing this kind of technology only helps them in the future; it is always hard at first to get the stuff performing right. They will have more experience and will be able to further perfect it. I am not buying the 6800 because it is future-proof; I upgrade my cards every 6-12 months. I am buying because of the features and the speed.
 
coldfusion71 said:
I am talking about how the texture compression technology is going to give the same type of speed, not visual, enhancements as PS3.0. I find it very funny that you are touting that while trying to discredit PS3.0, when they are going to be used in Far Cry essentially the same way. NVIDIA used a higher-quality HDR which will probably be adopted by future games. NVIDIA has better features this time out, no matter how much you try to discredit them. NVIDIA implementing this kind of technology only helps them in the future; it is always hard at first to get the stuff performing right. They will have more experience and will be able to further perfect it. I am not buying the 6800 because it is future-proof; I upgrade my cards every 6-12 months. I am buying because of the features and the speed.

What I'm getting at with some of my last quotes is that the 6800 cards may still be TOO SLOW to fully use SM3.0... that is why they are not using dynamic branching.

As for the 3Dc quote... it was pasted there just as a heads-up for peeps.

Now, while I didn't speak about HDR performance (or possible performance)...
you do understand the performance penalty for using it?
Think 1024x768 with no AA and possibly 8xAF... 40fps AVG (on a 6800U, OCed).
(the above is a guess, based on some rare testing of a beta FarCry patch, results that are no longer listed at another web site's forum)
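
To put a very rough number on why FP16 HDR hurts, here is a back-of-the-envelope sketch in C++ (my own numbers; the buffer sizes follow from the formats, but the pass count is purely an assumption):

Code:
#include <cstdio>

int main()
{
    const long long w = 1024, h = 768;
    const long long bytesLDR = w * h * 4;   // normal 32-bit (8 bits per channel) colour buffer
    const long long bytesHDR = w * h * 8;   // 64-bit FP16 colour buffer: twice the size
    std::printf("LDR buffer: %lld KB, FP16 HDR buffer: %lld KB\n",
                bytesLDR / 1024, bytesHDR / 1024);

    // Assumption: roughly 4 read+write passes over the HDR buffer per frame
    // (blending, downsampling, tone mapping).
    const long long passes = 4;
    const long long trafficPerFrame = bytesHDR * passes * 2;   // read + write
    std::printf("HDR traffic per frame: ~%lld MB, at 40 fps: ~%lld MB/s\n",
                trafficPerFrame / (1024 * 1024),
                trafficPerFrame * 40 / (1024 * 1024));
    return 0;
}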

mica
 
The thing I don't understand is why Nvidia is already moving ahead with PS 3.0 and still CANNOT fully support PS 2.0... Even if they are trying to move towards the future, isn't it just as important to support the current features fully?

I have been watching this and many other posts about some VERY touchy subjects (as well as the brilinear/trilinear stuff). I'm not trying to start the flamewar again, but it kind of confuses me that NVIDIA does not want to fully establish a PS2.0 setup before moving on to PS3.0.

I thought the whole idea of producing successful products was to hit the nail on the head with current technology and establish a good foundation on it. After doing so, take a LOOONG look at it. THEN start working on the hardware to support new features. From what I have seen here and everywhere else, it sounds like NVIDIA is setting up a mansion on a foundation of straw.

That's NOT going to be a smart way to create hardware people will be willing to buy.
 
dark_15 said:
The thing I don't understand is why Nvidia is already moving ahead with PS 3.0 and still CANNOT fully support PS 2.0... Even if they are trying to move towards the future, isn't it just as important to support the current features fully?

I have been watching this and many other posts about some VERY touchy subjects (as well as the brilinear/trilinear stuff). I'm not trying to start the flamewar again, but it kind of confuses me that NVIDIA does not want to fully establish a PS2.0 setup before moving on to PS3.0.

I thought the whole idea of producing successful products was to hit the nail on the head with current technology and establish a good foundation on it. After doing so, take a LOOONG look at it. THEN start working on the hardware to support new features. From what I have seen here and everywhere else, it sounds like NVIDIA is setting up a mansion on a foundation of straw.

That's NOT going to be a smart way to create hardware people will be willing to buy.

NVIDIA has fully supported beyond PS2.0 since the FX.
 
The only "issue" I know of regarding SM2.0 support and the NV30 (and NV40) is that it won't complete all ShaderMark tests. However, ShaderMark makes use of a floating point format that is not required by spec to have (spec is 8 bit integer amazingly), and which nVidia currently does not support. It's not a problem of supporting SM2.0 spec.

XbitLabs: http://www.xbitlabs.com/articles/video/print/fx5700ultra-9600xt.html said:
According to Thomas Bruckschlegel from Tommti-Systems – the developers of ShaderMark, NVIDIA’s GeForce FX-series does not support floating point textures and render targets under DirectX 9. Even though one of the company’s Developer Relations officers said that NVIDIA would add support for these capabilities into “future drivers”, floating point textures and render targets still do not function under DirectX 9.

Firing Squad: http://www.firingsquad.com/print_article.asp?current_section=Features&fs_article_id=1371 said:
Tim Sweeney: The DirectX9 spec defines a core feature set that all implementations must have, plus a set of optional capabilities. Unfortunately, the set of optional capabilities ranges from multiple render target to floating-point precision (some vendors use 24-bit, others use 32-bit, but potentially a vendor with some bizarre 19-1/2 bit format could claim DirectX9 compliance).


Besides, do you really think ATI would have let nVidia claim their card to be fully SM2.0 compatible if it weren't? You KNOW they'd have a field day with that :D
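
For anyone curious, this is roughly how an application asks D3D9 whether that optional floating-point render-target format exists at all (a quick sketch from memory, assuming the standard d3d9.h header; D3DFMT_A16B16G16R16F is just one example format, not necessarily the one ShaderMark uses):

Code:
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Ask whether the driver exposes a 16-bit floating point texture that can
    // also be used as a render target; this is an optional DX9 capability.
    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8,        // current display format
                                        D3DUSAGE_RENDERTARGET,  // we want to render into it
                                        D3DRTYPE_TEXTURE,
                                        D3DFMT_A16B16G16R16F);  // FP16 RGBA

    std::printf("FP16 render targets: %s\n", SUCCEEDED(hr) ? "supported" : "not supported");
    d3d->Release();
    return 0;
}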
JigPu
 
So I was right in a way??? Or did I miss the mark completely?? I remember reading something from X-bit about it... but then again I didn't build the NVIDIA cards either...
 
dark_15 said:
So I was right in a way??? Or did I miss the mark completely?? I remember reading something from X-bit about it... but then again I didn't build the NVIDIA cards either...

What JigPu is trying to say is that just because the 5900/6800 cards can't do the tests where floating point formats were used doesn't mean they are not compliant.

mica
 
The NV3x series can do enough of the DX9 spec and PS2.0 to be called compliant. The problem is that they left out several optional features which are now being used or are about to be used, such as high dynamic range lighting and centroid sampling, to name two. But a bigger problem is the fact that many of the features the NV3x series can handle, it cannot use at a reasonable speed at all! So in several game situations NV3x DX9 cards have to stick to DX8 features, or lower the precision, or both.
 