
Nvidia Caught Cheating AGAIN?!?!?


LoneWolf121188
The Inquirer said:
WE CALL IT optimisation these days, but what do you call it when a firm over-optimises? We used to call it something different in the days of the NV30, when Nvidia had to do something to make its chip work better and do better in benchmarks, but to our surprise it seems to have done it again. The chaps from 3DCenter, a very talented in-depth site, spotted, tested and proved that Nvidia is using lower anisotropic filtering quality than any other card available.
Those guys noticed a texture shimmering problem when you use normal driver settings. This was the case with NV40 cards too, but there you could resolve the flickering by switching to the high quality driver setting. That won't work on G70 based cards, so the guys, well known for their thorough benchmarks, went digging a little deeper into the chip.

It turns out that Nvidia is not doing anisotropic filtering the way it should, and picture quality is what suffers. You will get the shimmering effect on your textures whenever you use a Geforce 7800GTX card, but you won't see it on Radeon cards.

The guys claim that all NV40 and G70 cards suffer from the same flickering problem and that these cards have "by far worse AF quality". They add that the flickering comes from general under-sampling: the driver takes fewer texture samples than proper anisotropic filtering calls for, and the resulting aliasing shows up as flicker. It's interesting to note that the older Geforce 5800 Ultra doesn't suffer from this, just the newer cards that are 6800 or 7800 based.

Another German web site, Computerbase, went a step further. It made a custom driver by editing the .inf file so that the driver could not recognise the 7800GTX and apply its optimisations. The card was listed as unknown but worked just fine. When the guys ran benchmarks with these drivers, they noticed a massive performance drop, close to 30 percent, and traced it to anisotropic filtering. Nvidia has a lot to explain.

3DCenter's original article is here in English, while the Computerbase article, in German, is here. We will ask Nvidia what is going on, but we think there's something up. At least the guys proved it isn't a hardware bug - it's a driver problem only, but performance drops dramatically as soon as you resolve it. µ
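
To see why under-sampling reads as shimmer rather than just blur, here's a rough sketch of the effect in plain Python - nothing NVIDIA-specific, and the texture pattern, footprint size, and tap counts are all made up for illustration. With too few taps across the anisotropic footprint, the filtered value swings noticeably as the footprint slides sub-texel distances between frames:

```python
# Toy model of anisotropic filtering along one axis: average `taps`
# samples across a footprint of high-frequency texture, then measure
# how much the result jitters as the footprint slides by sub-texel
# amounts (roughly what the eye sees frame to frame as shimmer).
import math

def texture(u):
    # A busy 1D "texture": a fast sine pattern in [0, 1].
    return 0.5 + 0.5 * math.sin(180.0 * u)

def af_sample(center, footprint, taps):
    # Box-filter `taps` evenly spaced samples across the footprint.
    return sum(
        texture(center + footprint * ((i + 0.5) / taps - 0.5))
        for i in range(taps)
    ) / taps

FOOTPRINT = 8 / 64                 # 8-texel footprint on a 64-texel texture
TEXEL = 1 / 64
offsets = [TEXEL * i / 100 for i in range(100)]   # sub-texel camera motion

for taps in (8, 4, 2):             # full sampling vs. increasingly "optimised"
    vals = [af_sample(0.5 + off, FOOTPRINT, taps) for off in offsets]
    mean = sum(vals) / len(vals)
    std = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5
    print(f"{taps} taps: pixel jitter (std dev) = {std:.3f}")
```

Fewer taps, bigger jitter - the image doesn't just get blurrier, it flickers.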

http://www.theinquirer.net/?article=25807
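
For the curious, the Computerbase trick boils down to editing the driver's .inf so it no longer recognises the card. Something along these lines would produce such a file - a sketch only, since the .inf file name and the exact edit they made are my assumptions, and the modified driver then has to be force-installed:

```python
# Hypothetical version of the Computerbase .inf edit (file names assumed,
# device ID quoted from memory): comment out every line that names the
# 7800 GTX, so the driver only sees an "unknown" card and any
# 7800GTX-specific optimisations can't kick in.
CARD_ID = "DEV_0091"   # PCI device ID believed to be the GeForce 7800 GTX

with open("nv4_disp.inf", encoding="latin-1") as src:
    lines = src.readlines()

with open("nv4_disp_unknown.inf", "w", encoding="latin-1") as dst:
    for line in lines:
        # ';' starts a comment in .inf files.
        dst.write("; " + line if CARD_ID in line else line)
```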

nvidia...c'mon guys, don't do this AGAIN...

What's your take?

There's a follow-up article here.
 
NVIDIA WAS very fast to react to a story we pixellated yesterday. We wrote here about Nvidia's over-optimisation. Well, as expected, Nvidia said it's a bug and that it could not be fixed by changing to the high quality driver setting, which was the case with Geforce 6 series cards.
So nVidia already has a new beta out to fix it. That was fast. Strangely fast...

The FX series should prove that nVidia knows how to implement AF, and the new driver proves that they knew exactly what to do to fix it. They must have, to fix it so quickly. From that evidence I think it's pretty clear that this was no accident.

Pathetic. Optimizations are one thing, but at the cost of image quality...oh well. It's nVidia. I'm not surprised.
 
johan851 said:
Pathetic. Optimizations are one thing, but at the cost of image quality...oh well. It's nVidia. I'm not surprised.
Agreed, though not quite with that last bit. However, if they keep pulling stuff like this... I've always been an nvidia fan, but the Canadians up north are starting to look tempting...
 
if people didn't care, then Nvidia just made their cards faster

stop nitpicking
 
do people have nothing better to do with their lives!?!
notice how no one else noticed beforehand... OR CARED?
why does it matter, people...
(don't call me a fanboy... I have an ATI card, I just go with what's good/cheap)
 
But if they were very happy before, what difference does this "cheap AF" make?
If it works that well, I'd want it on any card I buy.
 
But if they were very happy before, what difference does this "cheap AF" make?
If it works that well, I'd want it on any card I buy.
It's the principle of it. If ATi's cards render scenes at full quality and nVidia's cards can't (or rather, the drivers won't) render at the same quality, then every benchmark comparing the two is inherently flawed and unfair to ATi.
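
To put a rough number on how much that skews a comparison: taking the ~30 percent drop Computerbase measured at face value, the optimisation inflates the 7800GTX's benchmark numbers by over 40 percent relative to what full-quality rendering would score (the 100 fps baseline below is invented for illustration):

```python
# Back-of-the-envelope math using the ~30% slowdown Computerbase saw
# when the AF optimisations were disabled. The baseline fps is made up.
drop = 0.30                       # slowdown with optimisations off
honest_fps = 100.0                # hypothetical full-quality result
optimised_fps = honest_fps / (1 - drop)

print(f"full quality: {honest_fps:.0f} fps")
print(f"optimised:    {optimised_fps:.0f} fps")
print(f"apparent advantage from cut quality: "
      f"{(optimised_fps / honest_fps - 1) * 100:.0f}%")
```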
 