64 Cores?


XBit Labs tells us about this report which says that 64-core machines will start showing up in servers in 2009, and desktops in 2012.

Let’s leave aside the server side of this prediction; such a beast is both conceivable, and there’s good reason for it to exist.

Let’s also leave aside the technical feasibility of building a 64-core processor by 2012: having a dual-socket system with two 32-core processors using 22nm process technology is roughly the equivalent of a dual-socket 65nm Barcelona system today.

No doubt a select few for whom no level of processing power is enough would be happy to have and pay a pretty penny for such systems.

But what about the rest of us?

Let’s not obsess about the specific number 64; 32 or 16 or even 8 will serve just as well.

Just what is the average Joe going to do with 8 or 16 or however many processors? What could he do with so many?

Whatever that is, it has to be something he or she isn’t doing now.

Take maintenance out of the hands of the average user? Not a bad idea, and probably the most real-life useful task imaginable, but really, how many dedicated CPUs would you need for that?

Making computers instant-on/off would impress many, but you don’t need four extra CPUs to do that, you need to change the boot process to an imaging system.

Beyond that, it’s pretty hard to see how extra processors will help the browsing/email/light computing crew.

A Solution Looking For A Problem

Multiplexes of processors are a solution looking for a problem. Normally, that cliché implies the lack of a problem. That’s not quite the case here. Some people definitely have a problem, and there are lots of different potential problems, but it’s important to understand what those problems have in common.

If a task takes a lot of computing, takes serious real time with just a CPU or two, and isn’t inherently sequential, then, and only then, do you have a problem that can be solved with extra CPUs.
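That "not inherently sequential" condition is essentially Amdahl's law, which the post doesn't name; here's a minimal sketch in Python (my own illustration) of how the sequential fraction of a task caps what extra CPUs can buy you:

```python
def amdahl_speedup(parallel_fraction, cpus):
    # Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the
    # fraction of the work that can run in parallel and n is the CPU count.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cpus)

# A task that is half sequential barely benefits from 64 cores:
# amdahl_speedup(0.5, 64)  -> about 1.97x
# A nearly perfectly parallel task is a different story:
# amdahl_speedup(0.99, 64) -> about 39.3x
```

The point is the same as in the prose: pile on all the cores you like, and the sequential part of the job still sets the floor on how long it takes.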

That being said, there are three types of real problems: problems that involve huge quantities of data, problems that require huge computational manipulation of the data, and problems that require both.

Let’s give some examples of these types of tasks:

Writing a letter or preparing a single tax return with a computer is a good example of a task not helped by a lot of CPUs. The amount of data is minimal, as is the amount of computation required. The task is inherently sequential, and the real factor in how long the task takes is the speed of human input.

Adding a thousand numbers together is another task ill-suited to multiplexes. Yes, you can split up the task, and sixteen CPUs will do that a lot “faster” than one, but when the difference is .01 seconds versus .001 seconds, there is no difference in human time.

On the other hand, adding ten billion numbers together, or adding ten million sets of a thousand numbers each is a task very well suited for multiplexes. The computation is trivial per addition, but the sheer quantity of data that needs to be manipulated means a serious real-time difference between one and many CPUs.
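The split-the-pile idea above can be sketched with Python's standard multiprocessing module; the chunking scheme and worker count here are my own illustration, not anything from the report:

```python
from multiprocessing import Pool

def chunk_sum(chunk):
    # Each worker process sums its own slice of the data.
    return sum(chunk)

def parallel_sum(numbers, workers=4):
    # Split the list into roughly equal chunks, one per worker,
    # then add up the partial sums.
    size = (len(numbers) + workers - 1) // workers
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    with Pool(workers) as pool:
        return sum(pool.map(chunk_sum, chunks))

if __name__ == "__main__":
    data = list(range(1000))
    # Same answer as sum(data); the payoff only shows up
    # when the pile of numbers is enormous.
    print(parallel_sum(data))
```

For a thousand numbers this is all overhead, exactly as the post argues; for ten billion, the same few lines are where the extra cores earn their keep.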

An artificial intelligence program for chess starts with little data, but generates exponentially growing amounts of data, which must then be compared against each other. Multiplexes are called for here.

Rendering involves both huge quantities of initial data, and elaborate manipulation of such data. Again, multiplexes are significantly better.

As I think you’ll see, most tasks performed by the typical computer person do not require lots of CPUs. But that’s not true of all tasks by all computer people.

So what do you do?

The CPU companies have already figured this one out: modularity. Build basic CPU and GPU “blocks” which can be glued together as needed.

By 2012, the average “processor” the average person uses will consist of either a CPU and a GPU, or (more commonly) two or three CPUs and a GPU. Those with greater needs will get more CPUs (and more of everything else).

Yes, there will probably be 64-core desktops, and more desktops with at least umpteen cores. The catch is there aren’t going to be too many “desktops” around in 2012.

Oh, plenty of people will still do computing at a desk, but the big box attached to the screen will be replaced by a little box, or just built into the monitor or its base.

Those who need the power will get it (and, not so incidentally, pay a lot for it) and those who don’t, won’t.

Ed
