What Is Intel Up To?

The Inquirer says it has heard that Intel is putting the word out that it has solved its leakage problem at the 45nm level.

“Solved” is a very big word that can cover a lot of ground, so some skepticism is in order until we see specifics next week.

Just to give some perspective on the whole leakage issue (from “Feeling the heat,” Economist, March 13, 2003):

[Chart from The Economist: power leakage as a share of total CPU power, by Intel processor generation]

You can see it wasn’t an issue at all until Intel reached the PIII line (read 180nm). Wasted wattage got up to 15% on higher end PIIIs (and probably was worse at 1GHz or above), then decreased just a bit with the initial (longer-pipelined) Willamettes.

The numbers then exploded with 130nm and Northwood. The graph doesn’t cover 90nm Prescotts, but even though Intel lengthened the pipeline again and tweaked wherever possible, the percentage of power lost is probably even higher (at least at higher speeds).

It’s probably safe to say current Prescotts leak about half their power, maybe more at the top levels. Given that the thermal guideline for a regular 3.8GHz Prescott is 115W, that means 55-60 watts of wasted power.
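If you want to play with that arithmetic yourself, here’s a minimal back-of-the-envelope sketch in Python. The 115W figure is the thermal guideline cited above; the leakage fractions are this article’s rough estimates, not measured numbers:

```python
# Back-of-the-envelope leakage math, using the rough estimates above.
# These are ballpark figures for illustration, not measured values.

def wasted_watts(tdp_watts: float, leakage_fraction: float) -> float:
    """Watts lost to leakage for a given thermal guideline and leakage share."""
    return tdp_watts * leakage_fraction

prescott_tdp = 115.0  # thermal guideline for a 3.8GHz Prescott, per the text

for fraction in (0.48, 0.50, 0.52):
    print(f"{fraction:.0%} leakage -> {wasted_watts(prescott_tdp, fraction):.0f}W wasted")
# Roughly 55-60W, which is where the figure above comes from.
```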

How Much And How Is It Done?

What we need to look for at the IDF is how specific Intel’s claims of power reduction are. It’s one thing to say the problem is “solved,” it’s quite another to say, “we will reduce power leakage at 45nm by 90%.”

(It’s important to note that the real improvement isn’t supposed to happen until 45nm, which at best doesn’t start until 2007. That’s two generations from now. We still have 65nm to go through, but that generation looks more like holding the fort than defeating the hated heat.)

Perhaps more importantly, even if Intel has figured out how to keep every last stray electron from being wasted, we’re only looking at about a 50% heat reduction. That’s certainly nothing to pooh-pooh, but remember that a process shrink used to reduce the amount of power needed to run at a given speed by about 40%. Also remember that, say, a 5-6GHz, 50-watt processor may sound really good until you realize you’re supposed to have two or four of them by 2008.
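To make that multicore point concrete, here’s a trivial sketch using the hypothetical 50-watt figure above. It’s pure arithmetic on made-up numbers, an illustration rather than a projection:

```python
# Rough illustration of the multicore point: even a "cool" 50W core
# adds up quickly once you put two or four of them in a box.

core_watts = 50  # the hypothetical 5-6GHz, 50-watt processor above

for cores in (1, 2, 4):
    print(f"{cores} core(s): {cores * core_watts}W of CPU power")
# 1 core:   50W
# 2 cores: 100W
# 4 cores: 200W -- right back past today's 115W Prescott territory.
```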

To have a long-term solution, reducing/eliminating wasted heat is necessary, but not sufficient. Otherwise, you’re going to eventually have a bunch of CPUs that resemble lit match tips in size and heat.

This could mean architectural changes of various types (we’ll talk about one piece of speculation tomorrow). It could mean changes in chemistry. It could mean all the above.

Better Doesn’t Necessarily Mean Faster

I think there’s an assumption that if Intel can get the power way down, overclockers will then gleefully be able to crank it way up again and repeat the historical benefits.

Well, we can all hope for it, but we’d better not assume it’s going to happen. Different architectures and/or chemistries could impose limitations of their own.

SOI is a good example of this. An SOI CPU simply doesn’t act like a regular chip. It runs very, very cool at low-to-medium speeds, much better than regular silicon, but push it past a certain point and power requirements start jumping dramatically, and it stops working pretty suddenly after that.

The point is not to say that SOI is bad, but that it has a different overclocking personality than plain vanilla silicon. This is also likely to be the case with any dramatically different CPUs.

Just to make up an example, what would Intel do if it found two new ways to make processors? Method A gives you very low-powered, cheap, but not terribly fast CPUs. Method B gives you very fast, reasonably low-powered, but very expensive CPUs. What does Intel do if it has to choose between one and the other? What does AMD do in response? The answer depends a lot on what the two really think about multicore and mobile devices, doesn’t it?

There are a lot of ways Intel could go with any big improvements, and much of how Intel decides will depend on the exact nature of those improvements. Tomorrow, we’ll talk about some of the more extreme possibilities.

No matter what Intel has in mind, this IDF promises to be the most important one in quite some time, and it might end up telling us a lot about where computing is likely to go over the next five years.

Ed
