Dual Cores: A Hot Topic

We’ll be hearing a lot about dual cores over the next week or so.

AMD just showed off a demo; Intel will merely talk about it next week.

Conceptually, dual cores are not that big a deal. All you’re doing is putting the equivalent of the motherboard wiring for a dual-processor system onto a chip.

Practically, it’s trickier than that and requires some good engineering, but it’s no big innovative leap, provided you’re working with CPUs that are good to begin with.

That’s the situation AMD appears to be in. They aren’t doing anything particularly new or interesting, but that’s a good thing, because what they already have is good enough.

Intel is another story. Unlike AMD, it can go down one of two paths, and it hasn’t decisively indicated which one it will take yet.

The question for Intel will be: Will their initial dual-core chips look more like Prescott, or like Dothan?

They probably won’t be exactly like Prescott, simply because melting through the motherboard is not a desirable feature, no matter how crazed the PR person is.

A rather bolder step would be to go with dual-core Dothans, but it’s unlikely a dual-core Dothan would beat any likely dual-core Hammer in anything other than power conservation. Given that, it’s doubtful that the propeller-heads at Intel will be able to bring themselves to shed the habits of a lifetime, at least not quite yet.

So what Intel will probably end up doing is trying to kludge enough Dothan technology onto a Prescott foundation to lower the thermostat enough to get it out the door.

Somehow, I have a bad feeling about that.

The Real Issue For Us: Heat

Unless Intel bites the bullet and decides to give the world essentially dual-core Dothans, the biggest problem overclockers are going to face, whether AMD or Intel, is powering and cooling the thing.

There are two reasons for this.

The less obvious one (and a relatively new issue for overclockers, single- or dual-core) is power gradients. By that I mean power consumption jumping up more rapidly with every speed increase than it did in the past.

In other words, instead of power consumption going up like this:

[Graph: “Old” curve, power rising gently with clock speed]

It goes up like this:

[Graph: “New” curve, power rising steeply with clock speed]

Given that overclockers generally push chips beyond the maximum rating and add fuel to the fire by adding voltage, a steepened power gradient is bad news indeed.
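
To put rough numbers on that, here’s a minimal sketch using the standard dynamic-power approximation (power is roughly proportional to C × V² × f). The constant and the clock/voltage figures below are made-up illustrations, not measurements from any real chip:

    # Minimal sketch of why adding voltage steepens the power curve.
    # Uses the standard dynamic-power approximation: P ~ C * V^2 * f.
    # The constant and the clock/voltage figures are illustrative assumptions.

    def dynamic_power(freq_ghz, vcore, c=25.0):
        """Rough dynamic power in watts: P ~ C * V^2 * f."""
        return c * vcore ** 2 * freq_ghz

    stock = dynamic_power(2.0, 1.40)   # assumed stock: 2.0 GHz at 1.40 V
    oced = dynamic_power(2.8, 1.55)    # assumed overclock: 2.8 GHz at 1.55 V

    print(f"stock ~{stock:.0f} W, overclocked ~{oced:.0f} W, "
          f"{oced / stock:.2f}x the power for a {2.8 / 2.0:.1f}x clock")

That works out to roughly 1.7x the power for a 1.4x clock increase. And that’s dynamic power alone; leakage, which got dramatically worse at 90nm, also climbs with voltage, so the real-world curve is steeper still.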

That’s really the problem with Prescott when it comes to hands-on overclocking. It’s not so much that Prescott is extraordinarily hot at any given speed as that it gets a lot hotter a lot faster than previous generations of CPUs.

From (admittedly scanty) advance indicators, 90nm Hammers suffer from the same phenomenon (if not as severely). It would not be surprising if a 90nm 2GHz Hammer is almost cold compared to the same processor running at 2.8GHz or better.

For single-core 90nm Hammers, this isn’t likely to be too much of a problem (outside of those with wimpy coolers). At the least, the problem won’t be as bad as it is with Prescott.

However . . . .

There’s Two of Them!

You have to presume any dual-core Opteron running at a decent speed is going to chew up at least 150 watts, and probably closer to 200 watts.

Unless Intel gets Dothan-bold, it will be as bad or worse with any Intel dual-core.

If you look ahead to overclocking dual-cores, you’re probably looking at needing to cool something in the neighborhood of 250-300 watts.
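
As a back-of-envelope check, the arithmetic is simple enough to sketch (both figures below are assumptions for illustration, not measurements):

    # Back-of-envelope cooling load for an overclocked dual-core.
    # Both inputs are illustrative assumptions, not measured figures.
    per_core_watts = 100    # assumed per-core draw at stock speed
    oc_multiplier = 1.3     # assumed extra draw from higher clock and voltage
    cores = 2

    total_watts = per_core_watts * oc_multiplier * cores
    print(f"~{total_watts:.0f} W to power and cool")   # ~260 W

Nudge either assumption up a little and you land at the 300-watt end of the range.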

That isn’t going to be too easy, especially when heatsink technology is already being pushed rather hard by today’s requirements.

True, cooling two 100-125-watt cores is a bit easier than cooling one 200-250-watt processor, but only a bit.

True, the intensity of the heat won’t be any worse than with single-core CPUs, but the total quantity of heat to be dissipated will double.

Of course, dual-processor systems manage to survive today. However, with dual cores, these two light bulbs are going to be only about an inch (roughly 2.5 cm for you metricsexuals out there :)) away from each other, and sharing a single platform.

Can such chips be cooled? Of course they can be cooled, that’s not the issue. The issue is the price one has to pay to cool them.

At the least, you’re going to need a pretty hefty power supply to warm up the things, and a lot of (noisy) wind to cool them back down again.

Yes, you can use water. Yes, you can use freeze units. Yes, you can spend a lot of money or time/effort to do so. No, most current overclockers won’t do that.

Bravado aside, a large majority of overclockers still use the old fan-and-heatsink, and if that won’t cut it anymore, they’ll just find something else to do.

Killing Quietly With Kinetics?

There are reasons why people do things, and there are incidental fringe benefits you happen to get from a course of action.

There’s no doubt the CPU companies have far bigger and better reasons to go dual-core than to squeeze out overclocking. You’re a tin-foil hat kind of guy if you think otherwise.

Nonetheless, from the perspective of AMD and Intel, while they would gladly trade a few more overclockers for cooler, non-leaky processors, when you have lemons, you make lemonade.

Of course, these space heaters won’t kill overclocking. They will just greatly reduce it from where it has been up to now by raising the price of entry.

Or . . . . people will move on and start playing new games with different toys. They’ll find something else they can play with at a reasonable price. In the next couple of years, it could be single-core processors, it could be Dothans, it could be Via chips; it will most likely be something we don’t even know about yet.

The tinkering will go on, somewhere, somehow. Just the nature of it will change.

Ed
