
Can Nvidia redeem themselves with NV40?


DayUSeX

Member
Joined Aug 13, 2003
Well, what do you all think? I know a lot of you feel as I do, that nVidia blew it with NV35. If NV40 is just the same as NV35, or more correctly stated, the same jump as GF3 to GF4 was, then I think nVidia is in for a lot of trouble. Its DX9 implementation is still fundamentally flawed, and no amount of drivers is going to fix that. But if NV40 is an almost totally new core, there is hope. I think it will be more like the GF3 to GF4 core jump. What do you guys think? Even if NV40 turns out to be another FX situation, do you still have faith in nVidia?
 
Maybe... maybe not. Core designs take roughly two years from drawing board to retail shelves. They may have discovered the problems early enough to make the corrections without going back to scratch (but already too late in the design of the NV30), and that would save them; or maybe they didn't discover the problems until much later.
 
Yeah, I agree, Steven, but then again ATI cheated, and look at them now. Oh well, we'll just have to wait and see.
 
Steven4563 said:
Yeah, I know, but that was back in the Quake 3 days. Things have changed a lot.

No sh** ;) It took a LOT of change for ATi to get where they are today. They made a lot of changes, for the better, obviously, to get on top of the performance market over the span of a few short years. Who's to say nVidia couldn't do the same? Here's for hoping... but not expecting too much! :beer:
 
If you want to put it in an ignorant person's point of view, then nVidia wasted their time on the NV30 core and all the others using the .13 micron process. But if you want to see it another way, nVidia can do a lot with that core, even if it was just to see if they could do it.

Whereas they started from scratch with those cores, with an idea that hadn't been tried much, now they just have to make some small changes, or big changes; either way it will be easier for nVidia to make something that can easily be faster. Don't forget that nVidia wanted that whole idea of Cg, and I am sure that they based quite a bit of their architecture on it, but only time will tell.

And a note for the really ignorant: of course nVidia cheated... name a company that doesn't make their product look good on a benchmark.
 
lol, yeah, those G5 benchmarks were horrible, especially how they showed their little nVidia card smoking the Radeon 9800 XT.
 
ATI is no better than nVidia. ATI cheated on Quake 3 while nVidia did it on 3DMark03. Both companies have tainted reputations because of it. Neither is better than the other.
 
No sh** It took a LOT of change for ATi to get where they are today. They made a lot of changes, for the better, obviously, to get on top of the performance market over the span of a few short years. Who's to say nVidia couldn't do the same? Here's for hoping... but not expecting too much!
Pretty much ditto for me.

Can they? Sure, they came from WAY behind & kicked the crap out of 3Dfx - they can certainly do it again. As mentioned, I hope but don't expect too much - lol.
 
Bailey said:

Pretty much ditto for me.

Can they? Sure, they came from WAY behind & kicked the crap out of 3Dfx - they can certainly do it again. As mentioned, I hope but don't expect too much - lol.

nVidia never came from behind. They came into the industry with a kick-*** product and a goal to revolutionize the graphics card industry (which they did with hardware T&L). Since then, though, they haven't developed anything that's innovative on that scale.
 
nVidia never came from behind.
Sure they did; when they entered the market, 3Dfx was the dominant force. In fact, 3Dfx was so dominant that game developers were forced to include support for 3Dfx's proprietary Glide API in their products.

At the time of the original Voodoo, nVidia was an upstart & at least a distant 3rd or 4th behind Rendition & S3. The Riva product was "okay", but no revolutionary product. IMHO the first really good product nVidia produced was the TNT2. I mean, hell, the Rivas could barely do OGL - lol.

What I find funny is that, at the time, nVidia was known for superior image quality & 3Dfx was known for raw power. Also amusing is that D3D was the "new kid on the block" w/ little support - lol.

At that time the best-designed product by far was Rendition's (image quality & speed). nVidia kicked the stuffing out of the competitors w/ good designs, quick driver updates & a six-month product cycle - nobody else could keep up.
 
Bailey said:

Sure they did; when they entered the market, 3Dfx was the dominant force. In fact, 3Dfx was so dominant that game developers were forced to include support for 3Dfx's proprietary Glide API in their products.

They weren't forced to use Glide at all... they chose to. 3Dfx cards could run DX and OGL, which means using Glide was purely optional on the developer's part.

Bailey said:
What I find funny is that, at the time, nVidia was known for superior image quality & 3Dfx was known for raw power.

I seem to remember 3Dfx having some of the best image quality in the industry, especially when comparing Glide to OGL or DX. I still remember playing Diablo 2 on my Voodoo 3 3000, and the image quality in Glide was far superior to DX or OGL.
 
You know, as much as I hate nVidia, I hope they win next round. I really hate paying $250 for a 9800. Some hearty competition will hopefully bring prices down.

Hmmm, but will they? Probably not. *sigh* At least I got my 9800 for $200. If nVidia loses next round, better pony up $300.
 
I think that nVidia is trying to do what 3Dfx was able to do: get developers to use a proprietary API, Cg, which may actually make full use of the video card. But it was something they obviously weren't too successful at doing.

I am sure that some brand-new company will have another super sweet product and be the next nVidia: come in strong and stay on top until a long-time competitor can out-optimize them... or just be at all the meetings that initially design the next API standard.
 
Steven4563 said:
I lost faith in nVidia once they started cheating :(


ATI cheated first, and still cheats... No one seems to remember that, though, for some reason. :-D


ATI really had to do a LOT to catch up to nVidia, but nVidia lost many of its driver designers to ATI, and ATI has done one HELL of a job fixing their drivers. Thus I have purchased my first non-Voodoo, non-nVidia card EVER. Competition is the consumer's best friend.

Heck - I remember when ATI Rage 128 Pros burned up monitors because of a bad driver release that set the refresh rate to 90 Hz. hehe.

Those days are gone. XGI and Matrox are making a run at nVidia and ATI, so hopefully these guys will stir things up a bit. :)
 
They weren't forced to use Glide at all
When I say forced, I mean forced as in: if you wanted to sell any games, you'd better darn well have Glide. The original 3Dfx products did DX/D3D, but they sucked at it. They were "okay" in OGL. They were at their best running Glide (as most cards were with their native APIs, such as Renditions running RRedline). The Voodoo cards dominated the market; if you wanted to sell lots of games, you'd better have Glide, otherwise V1 owners were going to look elsewhere.
I seem to remember 3Dfx having some of the best image quality in the industry, especially when comparing Glide to OGL or DX. I still remember playing Diablo 2 on my Voodoo 3 3000, and the image quality in Glide was far superior to DX or OGL.
Here's a link to an old review @ Tom's; throughout the article you'll notice how most of the other cards have better image quality in both D3D & OGL. But then at this time folks were happy to get ANY 3D acceleration - lol.

http://www.tomshardware.com/graphic/19980121/3dbench-26.html
 
I always had an objection to using Glide - although it made some neat reflections and other things in certain games, I always HATED (hated, hated, hated!!!) the color banding in 16-bit color. I still retch when I see people playing games in 16-bit color on modern cards - and I do see it happen!!! On Ti4200-series cards, even! Holy image quality loss, Batman. Glide was good for speed, and that's about it - its limitations on color depth were far too big a sacrifice in image quality.
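For anyone curious why 16-bit color bands so badly: in the usual RGB565 layout each pixel gets only 5 bits of red, 6 of green, and 5 of blue, so a smooth 256-step gradient collapses into 32 (or 64) visible steps. Here's a minimal C sketch of the idea - the pack_rgb565 helper is just a hypothetical illustration, not code from any real driver:

```c
#include <stdio.h>
#include <stdint.h>

/* Pack an 8-bit-per-channel color into 16-bit RGB565 by dropping
   the low bits of each channel -- exactly the precision loss that
   shows up on screen as banding. (Illustrative helper only.) */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

int main(void)
{
    /* Count how many distinct red levels survive quantization. */
    int distinct = 0, prev = -1;
    for (int r = 0; r < 256; r++) {
        int r5 = r >> 3;               /* 8 bits -> 5 bits */
        if (r5 != prev) { distinct++; prev = r5; }
    }
    printf("8-bit red channel: 256 levels -> RGB565: %d levels\n", distinct);
    printf("example packed pixel: 0x%04X\n", pack_rgb565(200, 100, 50));
    return 0;
}
```

Thirty-two brightness steps per channel is why those smooth sky gradients in Glide games looked like contour maps.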
 