
Nvidia cheating with #'s on the 7800GTX

AntmanMike said:
Why would I have links dating back that far?
You don't remember the 'ATI Quality Decrease during Games' scandal?

http://www.xbitlabs.com/news/video/display/20030526040035.html

Nothing like going back over 2 years.

Both ATI and nVidia have done it and NV still does it lol. Their drivers are so heavily and specifically optimized for score in 3DM03 it is ridiculous.

You can easily see that in the skew between the 3DM03 scores and the 3DM05 / AquaMark scores.

It amazes me that the NV drivers are still defaulting to the lower "Quality" IQ setting for the high end cards as well. One of the reasons for buying these high end cards in the first place is so you can run the Highest Quality driver IQ settings and then add tons of AA and Aniso on top of that.

Viper
 
Excellent job 'compressing' a single reply into not one, but two replies!

My point is that nVidia is not the only one who 'cheats', although I hardly call it cheating; in my opinion it is a feature, although I suppose it depends on your point of view. I personally would call it 'Dynamic Acceleration'.

I am also highly confident that ATI still has their own cheats that just have not been detected. I can guarantee you that ATI's drivers have compile-time optimizations for almost all processor formats, and they most likely have shortcuts inside the code itself at runtime during benchmarking. Most likely more intelligent than the code they used to run that checked the executable name, as well. There are other methods (CRC, MD5, simply checking how the code runs...).
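Just to make concrete what that kind of detection could look like in principle, here is a purely hypothetical Python sketch of matching a benchmark by executable name or file hash. None of this is ATI's or NV's actual driver code, and the names and the hash below are placeholders:

```python
# Purely hypothetical sketch of benchmark detection by executable name or
# file hash. NOT any vendor's actual driver code; names/hash are placeholders.
import hashlib
import os
import sys

KNOWN_BENCHMARK_NAMES = {"3dmark03.exe", "3dmark05.exe", "aquamark3.exe"}
KNOWN_BENCHMARK_MD5S = {"0123456789abcdef0123456789abcdef"}  # placeholder hash


def md5_of_file(path):
    """Hash the executable on disk, one chunk at a time."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def looks_like_benchmark(exe_path):
    """Match by name first (cheap to check), then by file hash (harder to dodge)."""
    if os.path.basename(exe_path).lower() in KNOWN_BENCHMARK_NAMES:
        return True
    return md5_of_file(exe_path) in KNOWN_BENCHMARK_MD5S


if __name__ == "__main__":
    exe = sys.argv[1] if len(sys.argv) > 1 else sys.executable
    print(exe, "-> benchmark-specific path enabled:", looks_like_benchmark(exe))
```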
 
AntmanMike said:
Excellent job 'compressing' a single reply into not one, but two replies!

My point is that nVidia is not the only one who 'cheats', although I hardly call it cheating; in my opinion it is a feature, although I suppose it depends on your point of view. I personally would call it 'Dynamic Acceleration'.

I am also highly confident that ATI still has their own cheats that just have not been detected. I can guarantee you that ATI's drivers have compile-time optimizations for almost all processor formats, and they most likely have shortcuts inside the code itself at runtime during benchmarking. Most likely more intelligent than the code they used to run that checked the executable name, as well. There are other methods (CRC, MD5, simply checking how the code runs...).

That was then, not now. Does ATI optimize and improve their drivers? Yes, just as nVidia does. That is what new driver releases are all about. The difference is ATI's optimizations are general and apply to all applications and games. They are not targeted at a specific benchmark.

You can call it "Dynamic Acceleration" if it makes you happy. I know it would get top ratings from NV's Marketing Department for spinning what is at best a driver bug, a screw-up in the driver code that sets the core clock generator when the card enters HP 3D mode.

Viper
 
How do you KNOW that ATI does not sneak benchmark optimizations into their drivers? Have you ever seen the source? Have you run it through a decompiler and scanned the assembly? (I assume you are not an ATI employee)
How do you know that the drivers do not only overclock the card if it is determined to be safe? Has anyone reportedly had the card crash because of the overclocking? It can easily be called a feature instead of a cheat.

As several people have already said, all this is is ATI Fanboy Propaganda.

Edit: Nice Radeon cooling mods, by the way. They look very hefty, if just a wee bit overkill :).

Edit 2: Of course, it is possible I am wrong. I am not saying I am the all-powerful video card god, which I am not. However, if you can show me that these 'cheats' or optimizations that nVidia utilizes in any way destabilize the system, I would love to see it, in which case I will reverse my position.
 
dude. some people enjoy talking about government conspiracies but most don't because while it's obviously feasible you cannot prove anything. this thread is about an identified, real cheat. the reason we know about it is the same reason we know that ati cannot be doing it: by monitoring the clocks.

do video card makers optimize their drivers? of course! do they go too far sometimes? sure, and we have "scandals." this is something totally new though, and believe it or not it's not a figment of "fanboys'" imaginations, it's real and anyone with a 7800gtx and the new drivers can prove it. there is no opinion, no discussion. it's there. while you may like the cheat, that does not prevent it from being a cheat. it was created to make the card look that much better in benchmarks. so, enjoy it, but it's not a god damn conspiracy!

features are advertised and marketed, put on the box and talked about. the mere fact that this JUST came up proves it is a cheat.
 
AntmanMike said:
How do you KNOW that ATI does not sneak benchmark optimizations into their drivers? Have you ever seen the source? Have you run it through a decompiler and scanned the assembly? (I assume you are not an ATI employee)
How do you know that the drivers do not only overclock the card if it is determined to be safe? Has anyone reportedly had the card crash because of the overclocking? It can easily be called a feature instead of a cheat.

As several people have already said, all this is is ATI Fanboy Propaganda.

Edit: Nice Radeon cooling mods, by the way. They look very hefty, if just a wee bit overkill :).

Edit 2: Of course, it is possible I am wrong. I am not saying I am the all-powerful video card god, which I am not. However, if you can show me that these 'cheats' or optimizations that nVidia utilizes in any way destabilize the system, I would love to see it, in which case I will reverse my position.

ATI swore off the benchmark-specific optimizations after the 2003 incident. It is well known they no longer optimize their driver code for specific benchmarks. They simply do not need to lol! I doubt you would believe me if I handed you the white papers lol.

We are not talking about deliberate and automatic overclocking here. Both ATI and NV have that feature, based on temperature, built into their high end cards and drivers. That feature can be disabled in both the ATI and NV drivers too.

You can see what the true clocks are at any time, doing anything, using RivaTuner's background clock graphing function. That is how you know lol.

What we are talking about is a core clock that is being set 40MHz higher than indicated, across the board and behind the user's or reviewer's back. It does not go away when you disable all overclocking in the NV drivers. It is a constant 40MHz upward shelf in the HP 3D core clock above what the BIOS is programmed for or what you set manually. Temperature has no effect on the amount of the upward shelf.

The only time it goes away is when the card drops into LP 3D mode or 2D mode, where no hidden upward clock shelving is seen. Funny how the shelving only occurs in HP 3D mode, right where end users or reviewers would be running the card for benchmarking purposes lol.

The difference between the true advertised 430 HP 3D core clock and the hidden, 40MHz upward shelved HP 3D core clock (469.5 true) is about 400 points in 3DM05, or 5.5%. That is chunky to a reviewer or prospective buyer.
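For anyone who wants to check this themselves, the idea behind the RivaTuner method is simple enough to sketch: poll the reported core clock while a 3D load is running and watch for a constant offset from what you programmed. The snippet below is only an illustration of that method, not what we actually ran; it assumes a modern nvidia-smi on the PATH (that tool did not exist in 2005) and that your driver supports this query syntax:

```python
# Rough sketch of what RivaTuner's background clock graphing shows you: poll
# the reported core clock under load and flag any constant offset ("shelf")
# from the clock you actually programmed. Assumes a modern nvidia-smi on the
# PATH and that the query below is supported; purely illustrative.
import subprocess
import time

PROGRAMMED_MHZ = 430   # what the BIOS / control panel claims for HP 3D mode
POLL_SECONDS = 1
SAMPLES = 30


def reported_core_clock_mhz():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=clocks.current.graphics",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])


if __name__ == "__main__":
    for _ in range(SAMPLES):
        mhz = reported_core_clock_mhz()
        print("reported %d MHz (shelf vs programmed: %+d MHz)"
              % (mhz, mhz - PROGRAMMED_MHZ))
        time.sleep(POLL_SECONDS)
```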

NV has a pretty good card here in the 7800, so why they are screwing with a hidden core clock increase is beyond me. Just fix the drivers, up the BIOS clock, then sell the cards as 470/600's and be done with it. Of course that may screw up future NV marketing plans and would certainly put a ding in BFG's overclocked series of 7800's.

The BFG's are advertised as 450/650 cards. They come out of the box with a 460 BIOS HP 3D core clock and actually run at 501MHz core in HP 3D mode. The problem those cards have is that the first 05.70.02.11.22 BIOS only ran the memory at 600MHz instead of the 650MHz BFG advertised. In order to get the 650MHz memory clock you have to flash to the updated 05.70.02.11.25, which was released 5 days after the .22 BIOS to correct the low memory clock issue.

I do not begin to know why you brought up system stability. That isn't an issue and never has been in the context of this discussion.

Yes I am an ATI man, but if they shelved their clocks up behind the user's back I would blow the whistle on that too. It is deceiving to the buyer, be it in their favor or not.

I have no problem telling people that some ATI cards are generally crap when it comes to OC potential. The x800Pro VIVO AGP cards are a good example. Since the advent of the 500/500 x800XT AGP cards, the cores going on the Pro VIVOs are the bottom-of-the-barrel units that can't speed bin for either XT-PE or XT use.

I just call em as I find em lol.

Viper
 
AntmanMike said:
Does it increase the clock in any other applications?

You get a 40MHz upward shelf in the core clock above whatever is programmed in the BIOS or whatever you manually set the HP 3D core clock to, across the board, whenever the card goes into HP 3D mode. You cannot prevent it unless you manually underclock the card to compensate for the 40MHz upward shelving.

Viper
 
Ok, you guys obviously know a hell of a lot more than me, but I've read and understood everything you guys have been saying, so to sum it up:

Basically there is no way that Nvidia could have missed that. They would have known it was there, whether it was an accident in the coding (or a little something they came up with). And yes, it gives them an advantage in the benches. The real issue with this is that Nvidia didn't tell anyone about it, and that was possibly (well, definitely) a mistake. Looking at the concept (correct me if I'm wrong), I like the idea of the card getting this boost when entering 3D mode and think it's something that may be utilized in future cards.

I'm not going to lie, I'm not an ATI nor an Nvidia fanboy, and I agree (sorry, forgot to quote you and your name, but you know who you are). Like I was saying, I agree with the person who said that they go for the best-performance card at the time.

This is a little "dishonest" of Nvidia. They must have known about it. But let's not bash them because of it, whether it was a marketing ploy or them just ignoring the issue. They did a bad thing, slap on the wrists, now let's focus on the card. We are all informed about the benches now, or soon will be, and I know the marketing departments view popular forums like this about their products. Let's hope someone from Nvidia reads this and comes clean about the "x" speed + 40MHz boost (as I'll refer to it).

Simply saying "you're getting more for your money" is just a silly comment, because we all know how important the benchmarks are. Anyone who knows anything about computers always does research on their product before buying it, and benchmarks are a popular way of doing it. So yes, it's going to give them the advantage. And it's a little dirty. But trust me, because of this Nvidia is going to get enough mud slung at them.

Let's look at the card itself... VP, what are the temps and other performance aspects of the card like on the whole? I've taken a look at the Nvidia page and seen a little bit of it, but I'm sure you could inform me better if you didn't mind doing so. I'd like to learn a lot more about GFX cards because, in honesty, I know nothing compared to you guys... I'm just using my logic.
-lethil
 
Well, if it ONLY increases the clock in a benchmark, it is a cheat; however, if it increases the clock in all 3D apps, it could either be a bug in the drivers (I assume nVidia writes their drivers in assembly, which could get messy), or a performance enhancer which was improperly written.
 
AntmanMike said:
Well, if it ONLY increases the clock in a benchmark, it is a cheat; however, if it increases the clock in all 3D apps, it could either be a bug in the drivers (I assume nVidia writes their drivers in assembly, which could get messy), or a performance enhancer which was improperly written.

It is across the board whenever the card enters HP 3D mode.

Viper
 
lethil said:
Ok, you guys obviously know a hell of a lot more than me, but I've read and understood everything you guys have been saying, so to sum it up:

Basically there is no way that Nvidia could have missed that. They would have known it was there, whether it was an accident in the coding (or a little something they came up with). And yes, it gives them an advantage in the benches. The real issue with this is that Nvidia didn't tell anyone about it, and that was possibly (well, definitely) a mistake. Looking at the concept (correct me if I'm wrong), I like the idea of the card getting this boost when entering 3D mode and think it's something that may be utilized in future cards.

I'm not going to lie, I'm not an ATI nor an Nvidia fanboy, and I agree (sorry, forgot to quote you and your name, but you know who you are). Like I was saying, I agree with the person who said that they go for the best-performance card at the time.

This is a little "dishonest" of Nvidia. They must have known about it. But let's not bash them because of it, whether it was a marketing ploy or them just ignoring the issue. They did a bad thing, slap on the wrists, now let's focus on the card. We are all informed about the benches now, or soon will be, and I know the marketing departments view popular forums like this about their products. Let's hope someone from Nvidia reads this and comes clean about the "x" speed + 40MHz boost (as I'll refer to it).

Simply saying "you're getting more for your money" is just a silly comment, because we all know how important the benchmarks are. Anyone who knows anything about computers always does research on their product before buying it, and benchmarks are a popular way of doing it. So yes, it's going to give them the advantage. And it's a little dirty. But trust me, because of this Nvidia is going to get enough mud slung at them.

Let's look at the card itself... VP, what are the temps and other performance aspects of the card like on the whole? I've taken a look at the Nvidia page and seen a little bit of it, but I'm sure you could inform me better if you didn't mind doing so. I'd like to learn a lot more about GFX cards because, in honesty, I know nothing compared to you guys... I'm just using my logic.
-lethil

You would be amazed at what can get by the programmers but I agree it would have been hard to miss lol.

When clocked at the true advertised 430/600, with the drivers bumped up to High Quality where they should be, the card scores 7235 in my test rig. That is what any good 16-pipe x850 card will do stock OC'ed.

At 469/600 it scores 7610. At 501/652, which is where this BFG OC'ed series runs (that is with a 460 HP 3D core clock setting, so you have a 41MHz upward shelf here), the card ran 8186 in 3DM05 in my test rig.
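To put those score deltas side by side with the clock deltas, here is a quick back-of-the-envelope check using only the figures quoted above (note the 501 run also had the memory at 652 instead of 600, so not all of its gain is core clock):

```python
# Clock gain vs 3DMark05 score gain, using only the numbers quoted above.
scores = {430: 7235, 469: 7610, 501: 8186}   # HP 3D core MHz -> 3DM05 score
base_clock, base_score = 430, scores[430]

for clock, score in sorted(scores.items()):
    clock_gain = 100.0 * (clock - base_clock) / base_clock
    score_gain = 100.0 * (score - base_score) / base_score
    print("%d MHz: %+5.1f%% clock -> %+5.1f%% score" % (clock, clock_gain, score_gain))

# The hidden 430 -> 469 shelf alone is worth roughly a 5% score bump, which is
# exactly the kind of margin review comparisons turn on.
```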

I just shipped an x850 that ran 8129 under the same conditions, but that was one rare card, and TEC cooled. The same card is the FutureMark record holder for x850XTs with 8613.

Note I haven't even come close to maxing out this 7800's stock OC nor have I tried. The card will go on water tomorrow and we will see how it does there.

Viper
 
ShadowClock59 said:
It's not much of a bother to me; if people like oppainter can overclock their cores 200MHz over stock, and an extra 30 still gets put on, I'd say it's a pretty damn stable card.

All I care about are the numbers that come up after I run a benchmark, and I'm not talking about the clock speed.

Let's leave the dual cascade cooled suicide cards out of the picture and deal with real 24/7 cards that people use on a daily basis.

Nobody ever said the G70 didn't OC well. Just that the HP 3D core clock is running higher than it should be, without your knowledge, skewing the numbers you used in making up your mind to buy the card in the first place.

Viper
 
ViperJohn said:
Let's leave the dual cascade cooled suicide cards out of the picture and deal with real 24/7 cards that people use on a daily basis.

Nobody ever said the G70 didn't OC well. Just that the HP 3D core clock is running higher than it should be, without your knowledge, skewing the numbers you used in making up your mind to buy the card in the first place.

Viper

Just thought I'd ask one thing. I don't know how much you know about the card and about its BIOS and architecture (or marchitecture), but did you see anything that would enable it to unlock 8 more pipelines, as many people are thinking it can do, or is it for certain 24 pipes and that's it? I figured if they skew the core clock, they might've done this, like they did with the 6800nu.
 
ShadowClock59 said:
Just thought I'd ask one thing. I don't know how much you know about the card and about its BIOS and architecture (or marchitecture), but did you see anything that would enable it to unlock 8 more pipelines, as many people are thinking it can do, or is it for certain 24 pipes and that's it? I figured if they skew the core clock, they might've done this, like they did with the 6800nu.

Well, it is pure speculation, but I think it would have been nuts on NV's part not to have designed the core with 32 pipes and released the card initially with only 24 enabled. It makes it very easy to re-spin a new high end card in a few months when they do that.

Whether they actually exist, and how hard they are disabled, is anybody's guess at this point. ATI learned their lesson with the 9500/9700 cards and another with the x800Pro VIVO AGP card. The unused pipe quads are now being hard and permanently disabled on die at the FAB. It wouldn't surprise me if NV did something similar as well. Just plain old-fashioned good business lol.

Viper
 
What you said makes a lot of sense. If they did make them with 32 pipes, there has to be a way to get them to work, from whatever they are now up to 32, right?
 
Is it safe to assume that once they catch any heat for it they will release an update calling it a mistake? (Highly unlikely.)
 
mx101 said:
What you said makes a lot of sense. If they did make them with 32 pipes, there has to be a way to get them to work, from whatever they are now up to 32, right?

Not necessarily. If the pipe quads are hard disabled on die at the FAB they can never be enabled.

Both ATI and nVidia have been burned in the past when the unused pipes in the die of lower end (read cheaper) models could be enabled. That cost both of them sales on the higher end models, which cost them money at the bottom line, which ****es off investors. Like ATI, I seriously doubt NV would make the same mistake twice.

Permanently disabling the unused/unspec'ed pipe quads can be done in a heartbeat during test and speed binning at the FAB. It just makes good business sense to do so, especially since, if they do exist, it is probably for the purpose of re-spinning a new higher end model down the road from the same die. Both ATI and NV need to get a higher $ return on the core development costs invested and the costs to set up the FAB(s) to produce them. A re-spin accomplishes both.

Like always time will tell lol.

Viper
 