The biggest problem facing the 2007 box doesn’t lie with the CPU, or mobo, or memory. It doesn’t even lie with Vista, which is really saying something.
No, it lies with the next generation of video cards.
About a month ago, we first heard that the next generation of video cards was going to consume about twice the power of the current one.
After that, things went quiet for a while, punctuated only by announced delays and reloaded versions of the current video generation.
Now the rumors are back, and even the most optimistic numbers aren’t down very much. Mind you, overclock the thing, and the power monster just grows.
People use the term “FUD” as in “Fear, Uncertainty and Doubt” a lot. Usually they misuse it to describe news about one of “their” products which they don’t like.
This time, though, it’s the real thing. If these wattage estimates are true, this will be the video equivalent of Prescott, and will make these the most expensive video cards on record, even if the price doesn’t change.
Why do I speak in riddles? Well, besides the cost of the card, one is going to have to spend more to supply the power, spend more for the power, then spend even more to try to get rid of the heat caused by the power. It’s going to be like running your furnace and your air conditioner on high at the same time.
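To put a rough number on "spend more for the power," here's a back-of-the-envelope sketch. Every figure in it is an assumption for illustration: 100 extra watts per card, four hours of gaming a day, ten cents per kilowatt-hour, and an air conditioner removing that heat at a coefficient of performance of 3. It doesn't count the bigger power supply or the cooling hardware itself.

```python
# Back-of-the-envelope yearly cost of an extra 100 W of video-card draw.
# All inputs are illustrative assumptions, not measured figures.

EXTRA_WATTS = 100      # assumed extra draw per card
HOURS_PER_DAY = 4      # assumed gaming time
RATE_PER_KWH = 0.10    # assumed electricity rate, $/kWh
AC_COP = 3.0           # assumed A/C coefficient of performance

def yearly_cost(extra_watts, hours_per_day, rate, cop):
    # Energy the card itself burns over a year, in kWh.
    kwh = extra_watts / 1000 * hours_per_day * 365
    power_cost = kwh * rate
    # Energy the air conditioner burns pumping that same heat back out.
    cooling_cost = (kwh / cop) * rate
    return power_cost + cooling_cost

print(f"${yearly_cost(EXTRA_WATTS, HOURS_PER_DAY, RATE_PER_KWH, AC_COP):.2f} per year per card")
```

Double everything for SLI, and add more again if you overclock; the point is that the sticker price is only the down payment.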
And they want us to buy TWO of them? Imagine what this will do for SLI. I’m sorry, but I just don’t like molten messes on my motherboard.
Yes, of course, there are solutions. You can buy a power supply that browns out the rest of your place, turn your box into a wind tunnel, or give up on air cooling.
The reality, though, is that a lot of people would rather give up on PC gaming than go through these kinds of steps.
To add insult to injury, to run these cards at full capacity (i.e., run DX10 games), you’ll have to buy Vista along with twice the RAM you’d buy with an XP system.
I think this is going to be one insult too many for too many people. All this is going to sell a lot of . . . consoles.
Let’s not even get into “You really need a quad-core processor for $1,000, or two 90nm 125-watt dual cores for $2,000, even though it doesn’t do squat for gaming.”
Completely unknown at the moment is just what you’re going to get for your extra 100 watts. Why the sudden jump? Is this just regular evolution, or will DX10 really make video cards work extra hard for a living?
If these cards don’t offer dramatic improvements (and I don’t mean just for DX10) over the current generation of cards, these products aren’t going to cut it.
For sure, plenty of people will buy these things at first without a care in the world, and outside of the handful buying luxury boxes, they will indeed be the pioneers getting the arrows in the back. They’re going to find out what an extra one or several hundred watts of heat will do in a box.
And then people are going to have to decide, “Do I really want to do this, wait until things get more reasonable, or just buy a console?”
I’m afraid a lot of people are going to come up with the “wrong” answer.