
What nVidia has to say about its... bad HL2 performance:


Fever

Member, joined Sep 16, 2002, Montreal, Canada
Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATI's Shader Day.

During the entire development of Half-Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed.

We're confused as to why Valve chose to use Release 45 (Rel. 45), because up to two weeks prior to Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware.

Regarding the Half-Life 2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half-Life 2 and other new games are included in our Rel. 50 drivers, of which reviewers currently have a beta version. Rel. 50 is the best driver we've ever built. It includes significant optimizations for the highly programmable GeForce FX architecture, along with feature and performance benefits for over 100 million NVIDIA GPU customers.

Pending detailed information from Valve, we are unaware of any issues with Rel. 50 and the drop of Half-Life 2 that we have, which is more than two weeks old. It is not a cheat or an over-optimization. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.

The optimal code path for ATI and NVIDIA GPUs is different, so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers.

In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel. 50 driver release. Many other improvements have also been included in Rel. 50, and these were all created either in response to, or in anticipation of, the first wave of shipping DirectX 9 titles, such as Half-Life 2.
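To put the 32-bit-to-16-bit conversion nVidia mentions in concrete terms, here is a minimal numpy sketch (my own illustration with made-up values, not anything from Valve or NVIDIA) of how much rounding error a chain of dependent half-precision operations can accumulate compared with single precision:

```python
import numpy as np

# Simulate a chain of dependent per-pixel shader math (e.g. lighting terms
# multiplied together) at different precisions. The values and the number of
# operations are invented purely for illustration.
rng = np.random.default_rng(0)
terms = rng.uniform(0.9, 1.1, size=64)        # 64 per-pixel factors

exact = np.prod(terms.astype(np.float64))     # high-precision reference
fp32 = np.float32(1.0)
fp16 = np.float16(1.0)
for t in terms:
    fp32 = np.float32(fp32 * np.float32(t))   # round to FP32 after every op
    fp16 = np.float16(fp16 * np.float16(t))   # round to FP16 after every op

print(f"FP32 relative error: {abs(fp32 - exact) / exact:.2e}")
print(f"FP16 relative error: {abs(fp16 - exact) / exact:.2e}")
```

Whether that difference is visible on screen depends on the shader in question, which is presumably the per-title judgement call the developer technology team is describing.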


Hrmm... We'll see I guess.
 
We're confused as to why Valve chose to use Release 45 (Rel. 45),

Wow, Nvidia's a quick one... I mean........ geeeeee, maybe it was the encoded cheats they've discovered already...... best experience possible, my A$$.
 
I wonder if it has anything to do with the millions ATi has pumped into Valve...

As for Valve saying they won't use beta drivers in a beta game: yeah, whatever.

Hopefully Det50 will contain more overall optimisation rather than *.exe-specific tweaks. I don't care about internal precision as both cards only output 8 bits in the end (despite having 10-bit RAMDACs). Anything over 8 bits is more than good enough.
 
HotKoala said:
I don't care about internal precision as both cards only output 8 bits in the end (despite having 10-bit RAMDACs). Anything over 8 bits is more than good enough.

Go play an old NES console and tell me if 8-bit precision is enough.

If you don't have access to one, change your display color quality to 256 colors and tell me you don't care about internal precision.

And this is only in regard to color precision, which you seem to be referring to...
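gokickrocks' 256-colour suggestion is easy to approximate numerically; this is just a rough Python sketch of my own (illustrative only, not from the thread) showing how few distinct shades of a smooth gradient survive at low bit depths:

```python
import numpy as np

# A smooth horizontal gradient stored at high precision.
gradient = np.linspace(0.0, 1.0, 1920)

def quantize(values, bits):
    """Round to 2**bits levels, as a low-precision output stage would."""
    levels = 2 ** bits - 1
    return np.round(values * levels) / levels

# 2-3 bits per channel is roughly an R3G3B2-style 256-colour mode;
# 8 bits per channel is the usual 24-bit "truecolor" output.
for bits in (2, 3, 8):
    steps = len(np.unique(quantize(gradient, bits)))
    print(f"{bits} bits per channel -> {steps} distinct shades across the screen")
```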
 
gokickrocks said:


Go play an old NES console and tell me if 8-bit precision is enough.

If you don't have access to one, change your display color quality to 256 colors and tell me you don't care about internal precision.

And this is only in regard to color precision, which you seem to be referring to...

256 colours is 6 bits on output.

Please do some research on the internals of video cards before you post.
 
As for the screen capture remark:

A feature requested by Valve and other game developers in the first place. The screen cap grabs the image off the back buffer, where anti-aliasing hasn't yet been applied. nVidia applies a filter to screenshots that is the same filter applied at runtime, so what you see in the screenshot is what you see on the screen.
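For what it's worth, the kind of resolve filter described there is conceptually simple. Here is a hypothetical Python sketch of a plain 2x2 supersample average; the actual filter nVidia applies isn't documented in this thread, so treat this as an assumption about the general idea:

```python
import numpy as np

def resolve_2x2(backbuffer):
    """Average each 2x2 block of samples, like a simple supersampling AA
    resolve. The real driver filter is unknown; this is only illustrative."""
    h, w, c = backbuffer.shape
    return backbuffer.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# A fake 8x8 RGB back buffer rendered at 2x2 samples per final pixel.
samples = np.random.rand(8, 8, 3)
screenshot = resolve_2x2(samples)   # 4x4 image, matching what ends up on screen
print(screenshot.shape)             # (4, 4, 3)
```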

Valve just happened to twist it against nVidia. *Valve* requested the feature.

Eh, whatever.
 
HotKoala said:


256 colours is 6 bits on output.

Please do some research on the internals of video cards before you post.
Please enlighten me on this... For as long as I have known, 256-color was interchangeable with "8-bit color" (technically, this was just a 256-color palette; you could choose any 256 of the 16,777,216 [24-bit] possible colors to display on screen at once).

If the RAMDAC was only capable of outputting 8 bits, then wouldn't there be no visual difference between 8, 16, 24, or 32-bit color??? The only way I can read your statement to make sense is if you mean the RAMDAC can only output 8 bits PER COLOR CHANNEL (e.g. 8 bits red, 8 bits green, 8 bits blue), meaning there would be no visual difference between 24 and 32-bit color (which I would believe, since the extra 8 bits in 32-bit color are alpha [and are 'invisible']).

JigPu
 
-as posted in another thread-

Enough with all the conspiracy theories. I think it's pretty safe to say that once shaders start being used more frequently, we are just going to keep seeing more of the same: nVidia getting their asses handed to them. Let's not forget that ATi is getting much better FPS while still maintaining better image quality. That's just sad. nVidia messed up badly, and no amount of driver *optimizations*, and I use that word very loosely, is going to overcome their inferior hardware design.

-edit-
And by the way, it really wouldn't make much sense for Valve to make the game perform so much better on ATi, because most of the market still uses nVidia. Get a clue. :rolleyes:
 
JigPu said:

Please enlighten me on this... For as long as I have known, 256-color was interchangeable with "8-bit color" (technically, this was just a 256-color palette; you could choose any 256 of the 16,777,216 [24-bit] possible colors to display on screen at once).

If the RAMDAC was only capable of outputting 8 bits, then wouldn't there be no visual difference between 8, 16, 24, or 32-bit color??? The only way I can read your statement to make sense is if you mean the RAMDAC can only output 8 bits PER COLOR CHANNEL (e.g. 8 bits red, 8 bits green, 8 bits blue), meaning there would be no visual difference between 24 and 32-bit color (which I would believe, since the extra 8 bits in 32-bit color are alpha [and are 'invisible']).

JigPu

256 colors is 8-bit color (2^8 = 256). In a direct-color mode, 3 bits go to red, 3 to green, and 2 to blue.

HotKoala was referring to modern cards outputting 8 bits per color channel. That makes 8 for red, 8 for green, and 8 for blue, giving a total of 24 bits, or 2^24 = 16,777,216 colors. Slight difference there ;)

That being said, high internal precision is actually a useful thing, since many shader operations are often performed on a single pixel; while 24 bits seems precise, the rounding errors add up and the card can output a color that is noticeably different from the intended one. High internal precision (24/32 bits per color) minimizes those errors.
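A quick way to see the accumulation described above is to round an intermediate value to 8 bits after every pass and compare it with keeping full precision. The per-pass math in this little Python sketch is arbitrary and purely illustrative:

```python
import numpy as np

def to_8bit(x):
    """Snap a 0..1 colour value to the nearest of 256 levels."""
    return np.round(x * 255) / 255

full, low = 0.5, 0.5
for _ in range(40):                  # 40 dependent blend/shader passes
    full = full * 1.02 - 0.005       # arbitrary per-pass maths
    low = to_8bit(low * 1.02 - 0.005)

print(f"full-precision result: {full:.6f}")
print(f"8-bit-per-pass result: {low:.6f}  (difference ~{abs(full - low):.4f})")
```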
 
How much will the 9800/Half-Life bundle cost? $350-$400? Does anyone know how much money the Valve/ATi deal is for? Millions, I'm sure. So if that's the case, it makes perfect (business) sense to do just what they're doing. Think about how many people have been waiting for DX9 games to ship before buying a new video card; their numbers play right into that. Wouldn't you do your best to make your business partner look better than their competitor?
 
I don't know, when I start to see all this hype it makes me wonder if the game isn't going to s**k; re: The Matrix, Unreal 2, etc. Might be pretty, but if the gameplay is nowhere, what's the point? I wish they (the developers) would stay out of the business end and concentrate on the development end! Funny how some of the better games lately have come out of nowhere?
 
Based on what I saw in the videos, I'm seriously considering buying HL2 (with help from my brother and possibly friends).... And this is coming from a person who has only ever seriously considered buying UT2K3. Nothing else is special enough for my extremely limited budget :D

JigPu
 
ColtIce said:
How much will the 9800/Half-Life bundle cost? $350-$400? Does anyone know how much money the Valve/ATi deal is for? Millions, I'm sure. So if that's the case, it makes perfect (business) sense to do just what they're doing. Think about how many people have been waiting for DX9 games to ship before buying a new video card; their numbers play right into that. Wouldn't you do your best to make your business partner look better than their competitor?

ATi has had Valve products bundled with their cards for years... this is nothing new, and it sure as hell isn't Valve making ATi look better. I remember someone posting about the new Tomb Raider game... it's coded in Cg (nVidia's proprietary shading language) and ATi still kicks its *** in that game.
 
I think you misunderstood me. There is no doubt that ATi's cards rock and don't need any help to make them look good. Do you not think it's even possible for Valve to favor ATi for business's sake? Don't get me wrong, the FX's performance looks pitiful in the HL2 benchmarks, but I stand by my theory that it makes good business sense to have your partner's products do better than their competitors': not by cheats, but by having the FX cards not look as good as they could.
 
Damian said:
Did you see the E3 video? Graphics aren't the only thing creating the HL2 hype.

Videos are a long way from gameplay. If not, then we'd all be engrossed in a quick match of 3DMark03! Again, I reiterate: just because it looks good doesn't mean it's going to play well. Unreal 2 looked good, but the gameplay was mediocre and uninspired at best. They couldn't even come up with an original hook that matched the hallway scene from Unreal. The giveaway is: where is the playable demo? Less than six weeks from scheduled release and not one level is available for beta play? Whassup with that?

Don't get me wrong, I HOPE I'm wrong. I've just gotten excited about one too many games only to list 'em back on Amazon a week after I've got 'em. I need a new keeper!
 
ColtIce said:
I think you misunderstood me. There is no doubt that ATi's cards rock and don't need any help to make them look good. Do you not think it's even possible for Valve to favor ATi for business's sake? Don't get me wrong, the FX's performance looks pitiful in the HL2 benchmarks, but I stand by my theory that it makes good business sense to have your partner's products do better than their competitors': not by cheats, but by having the FX cards not look as good as they could.

That would make sense IF there were ATi-specific instructions in the code; however, Valve has used the standard DX9 instruction set (defined by Microsoft) for the ENTIRE project. And nVidia is also working with Valve on the project, so it's not like nVidia isn't giving Valve money as well.

Even John Carmack has said that nVidia cards perform painfully slowly using standard shaders, so they have written an entirely different code path for nVidia cards, while ATi cards use the standard one.
 
mrgreenjeans said:
I don't know, when I start to see all this hype it makes me wonder if the game isn't going to s**k; re: The Matrix, Unreal 2, etc. Might be pretty, but if the gameplay is nowhere, what's the point? I wish they (the developers) would stay out of the business end and concentrate on the development end! Funny how some of the better games lately have come out of nowhere?

I seriously doubt HL2 is going to come out and be like The Matrix... and I really hope not.
 