You don’t often see an article on CPU cooling in a regular newsmagazine like The Economist, but the current issue does just that (sorry, the article is only available online to subscribers).
The reason is that heat is going to put a big crimp in future advances if the beast doesn’t get handled.
As illustrated in this Intel graph (page 8), things just can’t keep going the way they’ve been going:
You have two possibilities:
Increase The Cooling
There’s a team at Hewlett-Packard called the Cool Team which researches how to keep future computers and data centers cool.
At the PC level, they’ve been looking at trying to cool components putting out heat at 200 watts per square centimeter (a square centimeter is 100 sq. mm), or roughly double what Athlons are putting out at the moment.
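To put that flux figure in perspective, here’s the arithmetic as a quick sketch. The die area is my own hypothetical number for illustration, not anything from HP or AMD:

```python
# Illustrative arithmetic only: converting the 200 W/cm^2 figure and
# estimating total heat for a hypothetical die size.

W_PER_CM2 = 200        # the flux level HP is designing against
MM2_PER_CM2 = 100      # 1 cm^2 = 100 sq. mm

w_per_mm2 = W_PER_CM2 / MM2_PER_CM2      # 2 watts per square millimeter

die_area_mm2 = 120     # hypothetical die, very roughly Athlon-class
total_watts = w_per_mm2 * die_area_mm2

print(w_per_mm2)       # 2.0
print(total_watts)     # 240.0
```

In other words, at those densities even a modest-sized die would be shedding a couple of hundred watts, which is why plain copper starts to look inadequate.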
They’ve found that copper heatsinks can’t cut it at those levels; they just can’t carry heat away fast enough.
They’ve looked at heat pipes, and they don’t quite cut it, either.
Heat pipes are devices that act like miniature water-cooling systems: the working fluid absorbs heat and boils, the vapor heads up to the cooler part of the pipe, condenses back into liquid, and returns to do it all over again.
The problem with heat pipes is that the liquid tends to form bubbles and thus doesn’t cool evenly.
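The reason the boil-and-condense cycle is attractive at all is that evaporation soaks up an enormous amount of heat per gram of fluid. A back-of-the-envelope sketch, using the textbook latent heat of vaporization of water (the 100 W chip is just an example figure):

```python
# Why evaporative cycles move so much heat: each gram of water that
# boils absorbs roughly 2,260 joules (latent heat of vaporization).

LATENT_HEAT_J_PER_G = 2260   # water, at atmospheric pressure

def water_flow_for(watts):
    """Grams of water per second that must evaporate to carry `watts` away."""
    return watts / LATENT_HEAT_J_PER_G

print(round(water_flow_for(100), 3))   # about 0.044 g/s for a 100 W chip
```

A twentieth of a gram per second is a trickle, which is why such a thin pipe can do the job at all; the catch, as noted above, is keeping that trickle flowing evenly instead of bubbling.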
What the HP folks are doing at the moment is using inkjet heads combined with a heat-monitoring system to squirt cooling fluid right onto the chip, squirting more where it is needed most (presumably in a sealed environment). In short, robo-peeing on the CPU. 🙂
Other folks are exploring different methods; in fact, a conference on the subject just wrapped up. (We plan to get a copy of the proceedings when they become available.)
Of course, in the long run, these are just temporary expedients. The real answer lies in . . . .
Decreasing the Heat
The Intel presentation linked above brings up a number of issues you might not think about when you think about CPU heat.
A new, unwelcome threat on top of all the “normal” ones caused by cramming more and more transistors into less and less space is that today’s CPUs now have leaky transistors. They don’t leak water or silicon. They leak power.
As the insulating layer between the gate that controls electrical current and the electrical channel gets down to a handful of layers of silicon atoms, it’s hard to completely turn a transistor off. This wastes power.
Using current technology, leakage becomes a major concern at 130nm, and will become prohibitive within two to three generations.
Using new materials to replace silicon as an insulator is one possibility, and Intel thinks its strained silicon (actually a silicon/germanium combo) will reduce leakage as a fringe benefit (as would SOI for AMD), but the approach being more actively pursued by all parties is “vertical transistors”. By physically isolating transistors above the silicon wafer, leakage can be minimized.
If you look at this fairly recent Intel presentation, it looks like Intel feels that 100 watts is going to be the limit for desktop processors, and they’re going to have to design within that limitation.
The primary means by which Intel plans to achieve this is to increase processing efficiency through multiprocessing. The first step will be virtual multiprocessing (i.e. hyperthreading), followed by multiple processor cores on a single chip. We’ll probably see that around 2010 or so.
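Why does multiprocessing help within a fixed power budget? A toy model (my illustration, not Intel’s numbers) shows the idea: dynamic power scales roughly as frequency cubed when you lower voltage along with frequency (P ~ C·V²·f, with V roughly proportional to f), so two slower cores can beat one fast one on work-per-watt:

```python
# Toy model of the cores-vs-clock tradeoff under a power cap.
# Assumes P ~ f^3 (voltage scaled with frequency) and perfectly
# parallel software, which real code rarely is.

def relative_power(freq_ratio, cores=1):
    """Power relative to one core at full speed."""
    return cores * freq_ratio ** 3

def relative_throughput(freq_ratio, cores=1):
    """Throughput relative to one core at full speed (ideal scaling)."""
    return cores * freq_ratio

# One core at full speed vs. two cores each at 80% speed:
print(relative_power(1.0, 1), relative_throughput(1.0, 1))   # 1.0 1.0
print(relative_power(0.8, 2), relative_throughput(0.8, 2))   # ~1.02 1.6
```

Under those (idealized) assumptions, the dual-core setup delivers 60% more throughput for essentially the same power, which is the whole appeal of trading clock speed for parallelism once you hit a heat ceiling.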
I’d like to say what AMD is planning, but they don’t talk about this very much. I can tell you they’re doing SOI with Hammer, and they’re also pursuing vertical transistors.
No matter the company, though, heat will shortly be driving CPU design rather than CPU design driving up the heat.