Thermal design power for Smithfields (Intel’s first dual-core processors) is supposed to be 130 watts, and this has some folks aflutter.
Personally, I think if Intel can get two sons of Prescott running together at 3.2GHz on just 65 watts each, that would be quite an achievement given what Daddy does.
I’m more than a little skeptical they can even do that, and yes, AMD has promised to do rather better than that, so we have the paradox of a great achievement that still isn’t good enough.
Nonetheless, regardless of whether it’s AMD or Intel, any 90nm dual core processor is going to be a red-hot item once significantly overclocked.
Realistically, you’re looking at 200 watts or more to be cooled either way if you’re going to push dual-core to anywhere near the levels of current or soon-to-be single cores. Yes, Intel will likely be worse, but AMD won’t be too good, either.
That’s an awful lot to ask from a fan and heatsink, no matter how configured.
Yes, I know AMD is talking about 95-watt dual cores, and I don’t doubt they can do it, do it today, at the speeds intended (i.e., 1.8GHz, 2.0GHz). Frankly, low-speed Hammers are probably pretty overvolted for their modest task. A few have told me that they can chop off 0.2V or a bit more and still run perfectly fine at default speeds.
It’s when you push these chips past a certain point that the wattage figures skyrocket. We spoke about this a while back.
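Both effects, the slack in overvolted low-speed Hammers and the wattage explosion once you push past a certain point, fall out of the usual rule of thumb that dynamic CPU power scales with voltage squared times frequency. A rough sketch, with made-up voltages and clocks purely for illustration (not measured Hammer figures):

```python
def relative_power(volts, ghz, stock_volts, stock_ghz):
    """Dynamic power relative to stock, using the P ~ V^2 * f rule of thumb."""
    return (volts / stock_volts) ** 2 * (ghz / stock_ghz)

STOCK_V, STOCK_GHZ = 1.40, 1.8  # hypothetical stock settings

# Chopping 0.2V at stock speed cuts dynamic power by roughly a quarter:
print(f"{relative_power(1.20, 1.8, STOCK_V, STOCK_GHZ):.2f}x")  # ~0.73x

# A 44% overclock plus the extra voltage it needs nearly doubles it:
print(f"{relative_power(1.60, 2.6, STOCK_V, STOCK_GHZ):.2f}x")  # ~1.89x
```

The voltage term is squared, which is why the overclocked-and-overvolted case balloons so much faster than the clock bump alone would suggest.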
One Ameliorating Factor
One good thing about these dual cores is that while gross heat will go up, heat density won’t. The task will be to cool two 100-watt light bulbs rather than one 200-watt light bulb. It’s still not enviable, but it could be worse.
This at least gives the people trying to cool these chips an opportunity to take a few different approaches to the challenge. Trying to isolate each core’s heat flow and deal with it separately is one possible option; no doubt some very clever people are working on others.
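The light-bulb comparison can be put in rough numbers: gross watts double, but each core’s share lands on its own patch of silicon, so watts per square centimeter need not rise. The die areas below are hypothetical round figures, not actual specs for any of these chips:

```python
def heat_density(watts, area_cm2):
    """Average heat flux in W/cm^2 -- a crude uniform-die approximation."""
    return watts / area_cm2

# Hypothetical: one 200W core on ~1 cm^2 vs. two 100W cores on ~1 cm^2 each.
print(heat_density(200, 1.0))  # the single 200-watt "bulb": 200.0 W/cm^2
print(heat_density(100, 1.0))  # each of the two 100-watt "bulbs": 100.0 W/cm^2
```

Same total heat to remove, but no single hotspot gets any denser; that is what opens the door to per-core cooling approaches.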
In the past, when faced with hot CPUs, we could sit back and say, “Just wait for a process shrink, and this will all fall back in line.”
We can’t assume that any longer, at least not to the degree we’ve seen in the past. No doubt there are hundreds, maybe even thousands of researchers at places like AMD, IBM or Intel trying to bring back those good old days, but understand that all this effort and expense is being thrown at a problem that didn’t exist before, and solutions are not going to be cost-free.
We’ll just have to see, but the most likely outcome is that we’ll see a lot more struggle for a lot less improvement than we have in the past.
Canaries In The Mines
A long, long time ago, some miners used canaries as carbon monoxide detectors. Canaries have a much higher metabolic and respiratory rate than humans, so they are much more susceptible to carbon monoxide (which prevents blood from carrying oxygen), and pass out or die well before humans would.
We overclockers are now the canaries in the CPU mine.
Prescott was the first example of this. There’s nothing terribly wrong with a low-speed Prescott run at default speed by an average Joe. It’s when you try to crank it up that all hell breaks loose. We knew long before Intel announced it that Prescott was a real problem.
The same will happen with dual cores. What may be perfectly acceptable to the average Joe with the average box could become hell on earth in our hands.
And like canaries, we are disposable. Nobody at AMD or IBM or Intel is going to get too upset if overclocking their processors is no longer an easy task.
Dual cores are likely to make heat a much bigger bottleneck, at least for air overclockers.