The Case Against NVidia

Are you paying twice as much for a computer now as you did two years ago?

Do processors cost twice as much? Memory? Hard drives? Of course not. In fact, the price trend is decidedly downward, not up.

Except for video cards.

If Intel doubled the price of its CPUs, would you like that? AMD? Anybody else?

So why is NVidia unique?

Sure, the equipment has gotten better. So have the CPUs and memory and hard drives.

So why is NVidia unique?

To most of you, it probably doesn’t matter if NVidia charges $700 or $7,000 or $7 million initially; you’re not going to pay until it hits your price level.

However, as the price of the top-end equipment goes up and up and up, it’s going to take longer and longer and longer for it to get down to your price level.

Given some level of technological improvement, the prices of some components will go down over time, but have you considered how the same card selling for $500 can cost just $200 a year later?

Let me put it this way: price shifts like that are rarely, if ever, driven by the actual cost of materials in the computer industry, simply because we are dealing with more-or-less organized sand.

In accounting terms, computer component manufacture is usually a high fixed-cost, low variable-cost industry. In English, it’s the equipment that costs the big bucks, not the actual making of the product.

When that is the situation, you have two possible pricing strategies:

If you want to make a lot of money, you set the price high and try to amortize the cost of the equipment on the backs of the early buyers. After the equipment’s been paid for, you’ve got the ability to lower the price quite a bit, but you try to milk it for all it’s worth.

Or you can try to spread out those amortization costs among a whole lot of buyers by setting the price relatively low and making up in volume what you lose in profit-per-sale.

The first approach is pretty much what Intel does. The second is what AMD is doing now. NVidia is clearly trying to follow the Intel pattern.
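To make the two strategies concrete, here’s a rough back-of-the-envelope sketch. Every number in it is invented purely for illustration; only the shape of the math matters.

```python
# Hypothetical numbers, purely to illustrate the two pricing strategies above.
fixed_cost = 50_000_000      # fabs, masks, R&D -- paid once, no matter the volume
variable_cost = 40           # what it costs to actually build one more card

def break_even_price(units):
    """Price needed to recover the fixed costs over a given number of cards."""
    return variable_cost + fixed_cost / units

# Milk the early adopters: recover the fixed costs from relatively few buyers.
print(break_even_price(250_000))     # 240.0 -- charge a premium, amortize fast

# Make it up in volume: spread the same fixed costs over a lot more buyers.
print(break_even_price(5_000_000))   # 50.0 -- charge less, amortize slowly
```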

What are you getting for your money?

The difference between a 32MB DDR and a 64MB DDR GeForce2 GTS card looks to be about $100. That’s quite a bit for 32MB of RAM. Yes, I realize this is higher-speed RAM than even PC2100, and I’m going to poke around and find out just how much it does cost, but I suspect a fat chunk of that $100 is profit.

Let’s look at a 64MB Ultra. Same amount of RAM, just a bit faster. Same essential GPU, just a bit faster. Cost? Not just a bit more. It’s about $200 more. Pretty hard to believe the additional cost comes anywhere near $200.
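Some quick arithmetic on those figures (the dollar amounts are just the rough street-price gaps mentioned above, not exact quotes):

```python
# Back-of-the-envelope math on the approximate price gaps mentioned above.
gts_price_gap = 100                   # 32MB GTS vs. 64MB GTS
extra_ram_mb = 64 - 32                # the only real hardware difference
print(gts_price_gap / extra_ram_mb)   # 3.125 -- roughly $3 per MB of DDR
```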

The GeForce3 will follow the same road: some improvement, much higher price.

Paying a mint for tweaks

Video cards have pretty much hit a bottleneck: memory bandwidth. It’s just not going to get a whole lot faster any time soon. Going from 4ns RAM to 3.8ns RAM is no great leap.
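To put a number on it (assuming the memory bus width stays the same, so bandwidth scales directly with the clock):

```python
# How much does dropping the memory cycle time from 4ns to 3.8ns really buy?
def mem_clock_mhz(cycle_time_ns):
    return 1000.0 / cycle_time_ns        # clock is the inverse of cycle time

old, new = mem_clock_mhz(4.0), mem_clock_mhz(3.8)
print(old, new)                          # 250.0 vs. ~263.2 MHz
print((new - old) / old * 100)           # ~5.3% more raw memory bandwidth
```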

True, GPUs can play more and more tricks to make the most of the situation, just as CPUs have. But there are only so many tricks that can be played, and we have already reached the point of diminishing returns. It might not be so bad if the price remained constant, but you’re paying more and more for those diminishing returns. Yes, even if you have a strict budget, you’re paying more, but more on that in a moment.

Paying a mint for useless tweaks

You might say, “It has new features,” or “It throws out more frames.”

Features are useless if they’re not used. If NVidia has some whiz-bang feature, programmers have to program for it. They’re not ready at day one, or fifty, or even a hundred. If it takes half a year or a year for games to actually use the feature, what good does it do you until then?

So what is your brag: “I have a great feature I can’t use”? Maybe it is a great feature. If so, buy it when you can actually use it; don’t pay a lot more for it before you can.

Mine Is Six Feet Long, and Eight During Blue Moons Falling On Leap Years

Wanting more FPS makes sense up to the point where it stops improving the gaming experience. Anything beyond that is a waste. For many games, we are well beyond the point where it makes a difference. If you are playing one of those games, size doesn’t matter anymore.

Movies have somehow managed to exist, and pretty well, running at 24-30 fps for most of a century. If it really made a difference in quality, don’t you think the standard would have been raised by now?

I realize that framerates can differ dramatically in a game. I can see wanting a nice big buffer so that the game runs smoothly even at worst. But you don’t need a 200fps buffer to achieve that.

Can any of you say with a straight face, “I can see a definite difference between 120fps and 80?” If you can’t, then why bother?
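The frame-time arithmetic shows why: every step up the framerate ladder buys you less and less actual time per frame.

```python
# Time spent on each frame at a given framerate.
def frame_time_ms(fps):
    return 1000.0 / fps

print(frame_time_ms(30))    # 33.3 ms -- movie territory
print(frame_time_ms(60))    # 16.7 ms -- going 30 -> 60 saves ~16.7 ms per frame
print(frame_time_ms(80))    # 12.5 ms
print(frame_time_ms(120))   #  8.3 ms -- going 80 -> 120 saves only ~4.2 ms
```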

Current benchmarking on video cards is absurd. If you benchmark at a high resolution, the video card chokes at a certain point on fill rate or memory bandwidth, and a newer card barely moves the numbers. There is no reason to upgrade when that is the bottleneck.

Of course, we can’t have this, so instead we see benchmarking done at absurd resolutions like 512x384, which are often the only places where there is a (meaningless) improvement.
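A rough pixel-throughput comparison (ignoring overdraw and texture traffic, which only strengthen the point) shows why those low-resolution numbers are so easy to inflate:

```python
# Raw pixels a card has to push per second at a given resolution and framerate.
def pixels_per_second(width, height, fps):
    return width * height * fps

# A setting people actually play at: the card is fill-rate/bandwidth limited here.
print(pixels_per_second(1024, 768, 60))    # ~47 million pixels/s
# The benchmark special: even 200 fps demands less pixel throughput than above.
print(pixels_per_second(512, 384, 200))    # ~39 million pixels/s
```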

So what do we end up with? “Look at me! I have an improvement I can’t see at a resolution I never use!” It’s like being six feet long; more is not always better. You might get bragging rights within equally deluded circles, but try putting it to use.

Presumed Foolish

What is NVidia saying to you? First, it’s singing the old Intel tune, which is:

“We’re going to make loads of money because we know there’s a certain proportion of fools with more shekels than sense who’ll shell out plenty to (often) sustain delusions of grandeur or social status, no matter how absurd. They’re begging to be taken, and we’re taking.”

This frontal assault on wallets is easily rebuffed by showing an empty one, but it does distract most people from noticing the more important message, which is:

“We will raise the average price paid for a video card throughout our product line.”

Two hundred dollars used to get you a top-of-the-line video card not too much after introduction. Now it gets you third or fourth place. Third or fourth place didn’t cost you $200 two years ago.

You don’t like third or fourth best when you used to be able to get the best for the same money, do you? Doesn’t that make it likely that a lot of you will eventually lay out another fifty or a hundred to move up a notch or two?

That’s how this hurts those of you who would never lay out $500 or $700. Let’s face it, a big chunk of the people buying computer equipment aren’t known for patience, and NVidia expects to take advantage of that.

The Eternal Deathmatch: Buyer Vs. Seller

NVidia has the perfect right to try to charge as much as it likes for its products. We have the equal right to try to pay as little as possible for them. Neither goal by itself is particularly reasonable; it’s only when the two interact that you get a market.

It’s true that they can only sell what we buy, but if we don’t pay enough, eventually there won’t be anything around for sale.

A free market bounces between those two incompatible extremes. If one side goes too far, the other has to take action to restore the balance. If sellers ask too much, buyers need to refuse to buy. If buyers will pay too little, sellers need to stop selling at that price.

“Something for nothing” and “Nothing for something” are but two sides of the same coin. Neither is good.

Email Ed

