The other day, AMD confirmed that it will start making quad-core processors in 2007.
From all indications, Intel is heading in the same direction on roughly the same schedule.
Excuse me, but isn’t this, like, wasteful overkill for the desktop?
Dual cores I can see. Software that takes advantage of two CPUs is hardly where it should be, but it will get there.
But four? Just what does the average person do (or could they do) that would be much better done with four processors than with two?
Even if you can find (or make) something for all those CPUs to do, will operating systems be able to efficiently divvy up the work?
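To make that point concrete, here’s a minimal sketch (mine, not from the post, and hypothetical) using Python’s standard multiprocessing module. A CPU-bound job only benefits from four cores if someone explicitly carves it into independent pieces for the OS to schedule; code that nobody parallelized just sits on one core.

```python
# Illustrative sketch: extra cores only help when the work is split
# into independent chunks that the OS can schedule in parallel.
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division; deliberately CPU-bound."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # One job, carved into four chunks: one per hypothetical core.
    chunks = [(i * 25_000, (i + 1) * 25_000) for i in range(4)]
    with Pool(processes=4) as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)  # -> 9592 (primes below 100,000)
```

The carving step is the rub: neither the OS nor the hardware does it for you, which is why the question of what four cores buy Grandma is a software question, not a silicon one.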
If operating systems can’t do that well, or require clunky manual intervention to do so, then why bother? Why should Grandma end up with (and pay for) two essentially useless CPUs? What does this exercise become, besides a way to prop up CPU prices?
At an absolute minimum, whether it’s Windows Vista or Linux or (by 2007) Mac OS X, the task-management functions of an OS will become much more important than they are today. Yes, additional CPUs can handle the maintenance work usually ignored/neglected by the typical user, and average people would end up buying more reliable computers as a result.
But if OSs aren’t designed/programmed to handle tasks like that, “innovations” like quad-core will be just a geek’s playground, and a really tough sell to anyone else.
So when you hear about quad-core, don’t be too interested in what AMD or Intel are going to do. Be much more interested in what Microsoft or Linus’ Army of Penguins or Apple are going to do to handle it.