Can you think of a better term to describe what the nVidia FX Ultra has become? I can’t.
Can you think of a previous product like that? The only one I can think of is the PIII 1.13GHz.
They were two very similar products, indeed.
The Similarities
Both were made by companies a bit too secure and self-satisfied, resting on their laurels.
Two companies that didn’t take their competition as seriously as they should have, and got blindsided as a result.
Two companies that found themselves pushing what they had until it broke.
Two companies that then found themselves having to withdraw from the bleeding-edge wars and get their act together again.
The Differences
Contrary to popular belief, nVidia has never had the kind of market share Intel had. The overall market share wars between nVidia and ATI are more like Coke and Pepsi than Intel and AMD.
nVidia never had, and doesn’t have today, the brand recognition Intel had and has. If you asked the average computer owner what video card is in his or her computer, most probably couldn’t tell you. It’s hard to have brand loyalty when you don’t even know what you have.
AMD couldn’t have taken the market share lead from Intel back then under any circumstances; they just didn’t have the production capacity. The same can’t be said for ATI.
Finally, the CPU industry doesn’t have the equivalent of the Voodoo Hex in its history: the story of a market-leader collapsing within a few years. The rats have experience leaping off sinking ships, and could jump even faster the second time round.
Do You Learn From Defeat?
Losing really isn’t that big a deal. A loss doesn’t make you a loser; it’s refusing to acknowledge and learn from it that does. That breeds more losses, and a string of losses will kill you.
Intel was really lost for a couple of years back in the PIII 1.13GHz era. They managed to survive RDRAM, the PIII stalling at 1GHz, and Willamette without too much fuss, but then, they had a huge cushion to lean on.
It’s not like AMD is doing worse now than Intel was a few years back; I’d say Intel screwed up more.
But AMD didn’t have much of a cushion to begin with, used most of it up last year, and is less than halfway across the high wire: they still have to get SOI down and migrate to .09 micron.
The high wire act isn’t the problem with AMD; it’s the lack of a safety net below.
nVidia is somewhere between the two extremes. nVidia can survive another six months without a world-beating card, but they can’t take two years figuring out how to make one, either.
Actually, making the FX Ultra abortware is a good sign. At least they can recognize a botched product without too much help. If they really had their heads in nether regions, they just would have pumped them out, which would have been worse than having nothing at all.
That’s a start. Let’s see if nVidia learns more from this, and whether they learn enough.
If they don’t, there are plenty of 3dfx ex-employees around to show them how to jump.