
REAL HL2 benchmark results!


germanjulian

Member
Joined
Apr 28, 2002
Location
Frankfurt/London
:attn: Real benchmark results, run by a website using the benchmark tool included with CS: Source.


http://www.driverheaven.net/showthread.php?t=53712&page=1&pp=15

[attached benchmark chart: 12_9_00.JPG]


Things get worse for Nvidia once AA and AF are enabled.

my results on my system:

1024x768 resolution, all settings at high, and WATER REFLECTION = ALL (not just characters but also the world):

2xAA / 8xAF = 40 FPS
2xAA / 4xAF = 42 FPS
2xAA / no AF = 45 FPS
0xAA / no AF = 76 FPS
0xAA / 8xAF = 65 FPS
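If you want to turn those runs into relative costs, here's a quick Python sketch using only the figures from the post above (the no-AA/no-AF run is taken as the baseline):

```python
# FPS measured at 1024x768, all settings high, full water reflections
results = {
    ("2xAA", "8xAF"): 40,
    ("2xAA", "4xAF"): 42,
    ("2xAA", "0xAF"): 45,
    ("0xAA", "0xAF"): 76,
    ("0xAA", "8xAF"): 65,
}

baseline = results[("0xAA", "0xAF")]  # no AA, no AF

for (aa, af), fps in results.items():
    cost = (baseline - fps) / baseline * 100
    print(f"{aa} + {af}: {fps} FPS ({cost:.0f}% below the no-AA/no-AF run)")
```

On these numbers, 2xAA alone costs roughly 40% of the baseline framerate, while 8xAF alone costs only about 14%.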

[pwoned]
 
hahahah what a BS!!!

i run 1600x1200 all high all reflections 6aaa 4af and i had 78fps
what a mother****ing botched test those losers there did.

Just makes me sick looking at that as my runs smooth as hell and real nice :D I wonder how much ati paid them for that lame test...
 
Pitbull2k said:
hahahah what a BS!!!

i run 1600x1200 all high all reflections 6aaa 4af and i had 78fps
what a mother****ing botched test those losers there did.

Just makes me sick looking at that as my runs smooth as hell and real nice :D I wonder how much ati paid them for that lame test...

Really, that's good. Still, I don't care that much about the Nvidia/ATI thing; what I care about is that it runs perfectly on my rig!
And it cost about half what yours did... and apart from the resolution and some FPS, there is no difference. I cannot wait for HL2 now...

CS: Source will keep me happy (which I play at low res anyway, since it's an FPS)
 
I'm not a fanboy either; see, I just got my 6800GT a few days ago. I have a 9700 Pro lying here that it replaced, and before that I had a GF4200. I buy what I feel is good. I know this rig will last me a while.
 
Pitbull2k said:
hahahah what a BS!!!

i run 1600x1200 all high all reflections 6aaa 4af and i had 78fps
what a mother****ing botched test those losers there did.

Just makes me sick looking at that as my runs smooth as hell and real nice :D I wonder how much ati paid them for that lame test...
Just curious, what AA were you really running? There is no "6xAA" on the GeForce cards, last I recall; there's no 6xAF setting either...

I'd also like to see a few more diverse benchmarks, as having only one website reporting numbers gives us no real verification of any accuracy. Also, while they are testing the Source engine that will be used to power HL2, they're not testing HL2 maps and textures and characters and effects. It would be like benching Quake3 and telling everyone this is how RTCW will perform.

I wouldn't get too bent out of shape just yet, no matter your card affiliation.
 
Wow. Well Valve said ATI's cards ran it better. Judging by those benchmarks, my Radeon 9700 Pro should run it fine. Nice to know.
 
Uhh, DriverHeaven is the least credible source possible for benchmarks.. His was the *ONLY* site on the planet that had ATI scoring much better than Nvidia on Doom3.. What does that tell you? Tells me the guy is an idiot fanboy.

http://www.vr-zone.com/?i=1181

Also, if you look closely.. Note he tested the Nvidia card at 8xS, and the ATI card at 6xAA.. Umm DUH! Big difference there, and I love how he doesn't mention that. The Nvidia mode is 8xAA with Super Sampling... Duh, of course it will be much slower. As such, I call zero credibility in this test.
 
Kobra007 said:
Uhh, Driverheaven is the least credible source possible for benchmarks.. He was the *ONLY* site on the planet that had ATI scoring much better than Nvidia on Doom3.. What does that tell you? Tells me the guy is an idiot fanboy.

http://www.vr-zone.com/?i=1181
Something to consider:

VR Zone's evaluation was done on an older version of the Source engine, and was using a demo that was purposely CPU-limited. The physics and particle information of a 24-player deathmatch is not going to bottleneck your GPU... And for further proof, look at their benchmark examples: the only place where the framerates really ever changed was at 1600x1200. AA was essentially "free" the entire time, simply because it wasn't the video card that was bottlenecking that demo.

Also, if you look closely.. Note he tested the Nvidia card at 8xS, and the ATI card at 6xAA.. Umm DUH! Big difference there, and I love how he doesn't mention that. The Nvidia mode is 8xAA with Super Sampling... Duh, of course it will be much slower. As such, I call zero credibility in this test.
That specific test you pulled was the "maximal" test -- this is the absolute maximum settings that each card could do. For ATI, that's 6xAA and 16xAF and for NVIDIA that's 8xAA and 8xAF. All other tests were done at the same AA/AF levels between cards, and all other tests showed a similar disparity in performance.

This still doesn't mean the numbers are accurate, but for the specific cases you pointed out, your complaints are not valid.
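The CPU-vs-GPU bottleneck argument above is easy to sketch in Python. The millisecond costs below are made up purely for illustration; the point is just that when CPU work per frame exceeds GPU work, adding AA doesn't move the framerate:

```python
def fps(cpu_ms, gpu_ms):
    # With CPU and GPU work overlapped, the slower of the two
    # sets the pace of each frame.
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 12.0  # hypothetical CPU cost of a 24-player deathmatch demo

# AA only adds GPU time; the framerate stays flat ("free AA")
# until the GPU cost finally exceeds the CPU cost.
for label, gpu_ms in [("no AA", 5.0), ("4xAA", 8.0), ("6xAA", 11.0),
                      ("6xAA @ 1600x1200", 16.0)]:
    print(f"{label}: {fps(cpu_ms, gpu_ms):.0f} FPS")
```

Only the 1600x1200 case drops, which matches the pattern described for the VR Zone numbers: the framerates barely changed until the GPU finally became the bottleneck.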
 
Albuquerque said:
Something to consider:

VR Zone's evaluation was done on an older version of the Source engine, and was using a demo that was purposely CPU-limited. The physics and particle information of a 24-player deathmatch is not going to bottleneck your GPU... And for further proof, look at their benchmark examples: the only place where the framerates really ever changed was at 1600x1200. AA was essentially "free" the entire time, simply because it wasn't the video card that was bottlenecking that demo.


That specific test you pulled was the "maximal" test -- this is the absolute maximum settings that each card could do. For ATI, that's 6xAA and 16xAF and for NVIDIA that's 8xAA and 8xAF. All other tests were done at the same AA/AF levels between cards, and all other tests showed a similar disparity in performance.

This still doesn't mean the numbers are accurate, but for the specific cases you pointed out, your complaints are not valid.

Good, so this means that since people plan on playing the game online, they will not notice a difference in speed until they turn on 8x FSAA and 16x anisotropic?

By the way, it is good to see that they optimized the Source engine for ATI. It means that they used cheats (others call them optimizations, but I am a fanboy, and when you refer to optimizations as a fanboy you call them cheats) and Nvidia in the end will look better but sacrifice some performance.

By the way, before you say Doom3 was optimized, keep in mind Carmack has basically said he tried his hardest to make the engine run well on ATI cards but they gave him nothing to work with feature-wise, which made it virtually impossible. Not to mention the fact that there is a core flaw that made the X800s run the game very slowly.

By the way, a question I have had: what shader path is used for the two cards? From the results I am guessing they have no PS3.0 path for Nvidia but use 2.0b for ATI, which would not only explain the difference in speed but also mean that Valve tried to screw over Nvidia. With a big game like this, why would you do that?

I am still going to get Half-Life 2 because it will be more than playable at 1280x1024 with 4x FSAA and 8x AF. And personally I only need a certain level of beauty; the difference from 8x to 4x isn't noticeable to me unless I stop playing the game and stare at the edges with my face pressed to the screen.

/rant off

p.s. Now the only thing I need to figure out is how to get Half-Life 2 in past my brother. I need something to play besides Doom3 (not because the game is boring, but rather because it was short, and I can't stand playing any FPS through twice) until STALKER and FEAR come out, but for some reason he is convinced that Valve is evil and that my 50 dollars will make a difference (which it will. The CEO can now buy his kid a candy bar to go with his Lamborghini!)

/e prepares to be forced to fight his brother just to play a game, because his brother is some type of moron.

p.p.s. I really don't have an agenda for either ATI or Nvidia; I just truly believed that the NV40 was a much better part than the R420. And in almost every game besides Doom3 and Half-Life I have been wrong and they have been EQUAL. I personally see them both as completely equal now in all aspects, with one game each where they do conclusively better. But of course both are still playable at beautiful levels.
 
L>MS said:
Good, so this means that since people plan on playing the game online, they will not notice a difference in speed until they turn on 8x FSAA and 16x anisotropic?
On NVIDIA equipment, that's likely not the case. Varying levels of AA were essentially "free" at all resolutions except for 1600x1200. The lower the resolution, the more "free AA" you could apply. How about instead you actually read that VR Zone article you linked and understand what it's telling you? Why have me explain a link that you insisted proved your own point?

By the way, it is good to see that they optimized the Source engine for ATI. It means that they used cheats (others call them optimizations, but I am a fanboy, and when you refer to optimizations as a fanboy you call them cheats) and Nvidia in the end will look better but sacrifice some performance.
Or perhaps you just don't remember right? NV hardware was forced to partial precision, or even worse, DX8-level code in order to have performance even near what ATi's equipment was able to do in full precision. Remember this? Maybe this one? Or this one? Lots of places to show you where NV is being forced to run with different shaders to get around its huge deficit in performance. Maybe what you meant to say was that Valve was forced to optimize for NV because otherwise they'd kill half (or more) of their sales? Just a thought...

By the way, before you say Doom3 was optimized, keep in mind Carmack has basically said he tried his hardest to make the engine run well on ATI cards but they gave him nothing to work with feature-wise, which made it virtually impossible. Not to mention the fact that there is a core flaw that made the X800s run the game very slowly.
I know that Doom3 wasn't optimized, but I want you to show me just ONE link where Carmack said "he tried his hardest to make the engine run well on ATI cards but they gave him nothing to work with feature wise which made it virtually impossible". In fact, show me anything REMOTELY close to that. Otherwise I call a big huge BS on your fanboyism.

By the way a question that I have had, what shader path is used for the two cards. From the results I am guessing they have no ps3.0 path on Nvidia but use 2.0b for ATI which would not only explain the difference in speed, but also mean that Valve tried to screw over Nvidia, with a big game like this why would you do that?
From previous interviews and talk about this engine, it's all SM2.0. There's no "b", there's no Sm3. All the features they wanted were available in SM2.
 
In that benchmark, the 8Xs is really slow compared to any other AA setting nvidia has. They should've set them both at 4x, which is a far more fair comparison. You're also showing links about Nv's performance in HL2 of the NV3x generation. The NV4x line is a completely different architecture, isn't it? I doubt nvidia will need the same tricks to get speed out of the 6800's as they did with the FX's. I mean, the FX's were just plain half-assed.
 
Chowdy said:
In that benchmark, the 8Xs is really slow compared to any other AA setting nvidia has. They should've set them both at 4x, which is a far more fair comparison. You're also showing links about Nv's performance in HL2 of the NV3x generation. The NV4x line is a completely different architecture, isn't it? I doubt nvidia will need the same tricks to get speed out of the 6800's as they did with the FX's. I mean, the FX's were just plain half-assed.

You bring up some good points. Since Nvidia's FSAA is superior, he should cap it at 4xAA in his tests to be more accurate. It's simply stupid to put the Nvidia card at 8xS and then try to compare speeds with ATI set at 6xAA, which is known to be LOUSY AA..

This benchmark is a sham and will be proven as such in short order.

Also, HL2 doesn't look very impressive to me.. Looks like a year-old engine already, so I wouldn't be too cheeky about its performance in games. Doom3 is a much better engine; can't wait to see what all of the companies licensing the Doom3 engine will come up with. :D Of course, we already know which card will own that. Any good, unbiased tests with HL2 show Nvidia and ATI about even on the benchmarks; this guy at DriverHeaven is a retard.
 
Chowdy said:
In that benchmark, the 8Xs is really slow compared to any other AA setting nvidia has. They should've set them both at 4x, which is a far more fair comparison.
In that specific slide of the benchmark, absolute maximum settings were being applied to both cards. In the other eight slides of that same benchmark, equal amounts of AA and AF were applied per-card. Please evaluate the entire benchmark set before making up your mind by looking at a single slide.
You're also showing links about Nv's performance in HL2 of the NV3x generation. The NV4x line is a completely different architecture, isn't it? I doubt nvidia will need the same tricks to get speed out of the 6800's as they did with the FX's. I mean, the FX's were just plain half-assed.
First, let me state that I agree with you in the case of the 6800 -- I too believe that NV has likely fixed all the previous generation's shortcomings, and it will likely be proven that it can keep up with ATi's offerings once real HL2 benchmarks come out.

However, tell me this: of the many hundreds of thousands of SM2 or later cards, how many of them are 6800's? How many of them are x800's? The reality is VERY few. I'd guesstimate that 98% of all SM2 or higher parts in the public sector right now are NV3x's and R3x0's. That means GeForce 5200's, 5600's, 5700's, 5800's and 5900's as well as Radeon 9500's, 9600's, 9700's and 9800's are what the significant majority of users have in their machines. And in that grand majority category, ATI performance is far exceeding NV.

Funny enough, the same holds true in Doom 3. While most everyone here at OCForums pays attention to the uber-top end cards (and that's why I read it), you have to remember that nearly all of the public is at least one generation behind what we're drooling over. And in the last generation of cards, ATI is also beating NV in Doom 3 performance too.

Because the significant majority of users out there don't have the absolute newest hardware, that makes those "old" numbers quite valid. That doesn't mean DH made a good review that won't get thoroughly debunked in the near future, but it's certainly something to think about.
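The install-base argument above can be made concrete with a toy weighted average. Every share and FPS number here is hypothetical, purely to show why the older generation dominates the "typical user" result:

```python
def weighted_fps(segments):
    # segments: (market_share, fps) pairs; shares should sum to 1.0
    return sum(share * fps for share, fps in segments)

# Hypothetical: 98% of SM2-capable cards are last-gen, 2% are new flagships.
strong_last_gen = weighted_fps([(0.98, 45), (0.02, 90)])   # wins on old cards
strong_new_gen_only = weighted_fps([(0.98, 30), (0.02, 95)])  # wins on flagships

print(f"{strong_last_gen:.1f} vs {strong_new_gen_only:.1f}")
```

Even a big flagship win barely moves the average; what the 5900s and 9800s do is what most players actually see.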
 
Any other test I've seen has ATI and Nvidia essentially TIED in Half-Life 2 benchmarks.. That's all that matters... Since Nvidia destroys ATI in Doom3, and costs way less, I guess the winner is a no-brainer, isn't it? Regardless of the tie in HL2...

DriverHeaven guy is a fanboy tard; not the first time he's fabricated tests, from what I'm reading...

But that's OK, he decided to ban me for correcting his flaws, so I left a nice surprise for him by replacing the benchmark photos with one more indicating the type of idiot he is.. You can see it on page 8 of that thread. Just hit the old refresh key. :clap:
 
Show me some prices where NV costs WAY less than ATi. Another thing: so because NV wins in one benchmark, D3 (and I would hardly call it crushing them), they are the winners hands down? Come on, get a grip, man. BOTH cards are great cards; you can't go wrong with either. Stop being so blinded by your fanboyism that you can't see that either card will play the games just as well as the other.
 
Kobra007 said:
Any other test i've seen has ATI and Nvidia essentially TIED in Half Life2 benchmarks.. Thats all that matters... Since Nvidia destroys ATI in Doom3, and costs way less, I guess the winner is a no brainer, isn't it? Regardless of the tie in HL2...


You obviously have done no research comparing doom 3 performance since ATI's 4.9 Drivers for Doom3 were released.

And even when there were no optimized drivers, how is a 2-5% FPS difference "destruction"? Both cards are great cards; both essentially work just as well.

God, fanboys are annoying. Your preference does not constitute scientific evidence. Do some research before you make blatant comments... please.

EDIT:
Kobra007 said:
But that's OK, he decided to ban me for correcting his flaws, so I left a nice surprise for him by replacing the benchmark photos with one more indicating the type of idiot he is.. You can see it on page 8 of that thread. Just hit the old refresh key. :clap:

What the hell is that? Are you bragging about hacking a popular review website that a good deal of us enjoy using?

This is not the proper way to become accepted in this community seeing as this is your first month here...
 
But that's OK, he decided to ban me for correcting his flaws, so I left a nice surprise for him by replacing the benchmark photos with one more indicating the type of idiot he is.. You can see it on page 8 of that thread. Just hit the old refresh key.
Maybe another ban is in order. I, for one, do not hold much regard for anyone who would do something along these lines. I can only hope the mods here feel the same.
 