To Be Or Not To Be

Intel doesn’t think it will have to go to a 64-bit desktop until 2008 or 2009, despite AMD heading down that road with Hammer and Apple probably following with an IBM chip which looks a lot like MacHammer.

Two Approaches

When Intel talks about 64-bit, it’s not the same thing AMD is talking about.

When Intel talks about 64-bit, they’re talking about something like Itanium, something built from the ground up to run at 64-bit.

When AMD talks about 64-bit with Hammer, they’re essentially talking about a (comparatively) simple kludge: x86-64. It’s a very clever kludge that gets a lot done with little effort, a great bang-for-the-buck kludge, but it’s still a kludge.

In the long-term, Intel thinks the Itanium approach is better, and I think they’re right, but as the dates show, we’re talking long.

What Intel is saying is that the desktop is not going to really need Itanium-like processors for a long, long time. They have some good reasons for that, and that’s what this article is about: the desktop.

4GB, Not For You Or Me Soon

Is your machine stuffed with 4GB, and is your life a living hell because of the memory addressing limitations of your CPU (and OS)?

Outside of a few people running big servers and maybe a serious scientific type here and there, no. Most of those reading this probably have 256-512MB, and very few top 1GB.

So for the average or even not-so-average computer user, 4GB is a far-off boundary, not a barrier, and will remain so for a long time, especially since the growth in the size of memory modules is due to slow down (the article linked above describes that in some detail).
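
To make the 4GB figure concrete, here’s a minimal C sketch (my illustration, not anything from Intel or AMD) of where the number comes from: a 32-bit pointer can only name 2^32 distinct byte addresses.

#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* A 32-bit pointer can distinguish 2^32 byte addresses, no more. */
    uint64_t addressable = (uint64_t)1 << 32;
    printf("32-bit address space: %llu bytes (%llu GB)\n",
           (unsigned long long)addressable,
           (unsigned long long)(addressable >> 30));
    return 0;
}

That prints 4294967296 bytes (4 GB); widening the pointer to 64 bits is what pushes the ceiling out of sight.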


2 + 2 Does Not Equal 64

In the near future, and for years to come, millions of morons will say something like, “64 is double 32, so 64-bit is twice as fast as 32-bit.”

Wrong. Here’s why, so you can be the caller and not the callee. 🙂

Even AMD, which if anything is biased on this issue, states that 64-bit operations should yield an “average” (see below) performance improvement of about 20%.
20%. Not 100%.

Why?

Put very simply, you need 64 bits of real data to be processed before 64-bit operations do you any real good.

To show you what this means, let’s multiply 2 X 2 in binary:

In 32-bit binary, here’s what 2 X 2 would look like:

00000000000000000000000000000010
00000000000000000000000000000010
________________________________

00000000000000000000000000000100

Lots of zeroes, aren’t there?

Let’s do it in 64-bit binary:

0000000000000000000000000000000000000000000000000000000000000010
0000000000000000000000000000000000000000000000000000000000000010
________________________________________________________________

0000000000000000000000000000000000000000000000000000000000000100

Any improvement, besides a lot more zeroes? You did twice the work, and got the same result. You’ve “wasted” the extra work.
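
Here’s the same thing in C, a quick sketch you can compile yourself: the operands fit comfortably in either width, so the 64-bit multiply buys you nothing.

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t a32 = 2, b32 = 2;   /* 32-bit operands */
    uint64_t a64 = 2, b64 = 2;   /* same values, 64-bit operands */

    /* Both multiplies produce 4; the extra 32 bits are all zeroes. */
    printf("32-bit: %u\n",   a32 * b32);
    printf("64-bit: %llu\n", (unsigned long long)(a64 * b64));
    return 0;
}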

The point is that many of a computer’s operations do not and cannot benefit from extending the number of bits a CPU can take at one swallow. For something like this, it’s like having two people yank your jaws wide open to accommodate a peanut. You still get the peanut down, but you didn’t do it any faster or “better.”

The benefit from 64-bit comes when you have a large proportion of data/instructions that is “really” more than 32 bits long, and which can be handled in one shot rather than two, like a huge chunk of meat that would normally take two swallows to get down.

The more a program “needs” 64 bits to do something more efficiently than 32 bits can, the more of a real performance improvement you’ll get from it. It doesn’t happen “naturally” or “automatically.”
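
To see the “one shot rather than two” case in code, here’s a sketch (again mine, not AMD’s) of what a 32-bit CPU effectively has to do to add numbers wider than 32 bits: split them, add the low halves, then add the high halves plus the carry. A 64-bit CPU does the whole thing in one instruction.

#include <stdio.h>
#include <stdint.h>

/* Add two 64-bit values using only 32-bit arithmetic, the way a
   32-bit CPU must: low halves first, then high halves plus carry. */
static uint64_t add64_on_32(uint64_t x, uint64_t y) {
    uint32_t xl = (uint32_t)x, xh = (uint32_t)(x >> 32);
    uint32_t yl = (uint32_t)y, yh = (uint32_t)(y >> 32);

    uint32_t lo    = xl + yl;
    uint32_t carry = (lo < xl);      /* wrapped around? then carry 1 */
    uint32_t hi    = xh + yh + carry;

    return ((uint64_t)hi << 32) | lo;
}

int main(void) {
    uint64_t x = 0x00000001FFFFFFFFULL;  /* genuinely wider than 32 bits */
    uint64_t y = 1;

    printf("two 32-bit swallows: %llx\n",
           (unsigned long long)add64_on_32(x, y));
    printf("one 64-bit swallow:  %llx\n",
           (unsigned long long)(x + y));  /* same answer, one operation */
    return 0;
}

Both print 200000000 (hex). Work like this, done often enough, is where gains like AMD’s quoted 20% can come from.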

Let’s take 2 X 2. You don’t need 32 bits any more than you need 64 bits to do that; you only “need” 3 bits. Of course, if the answer to the next question is eight or more on your 3-bit computer, you have a bit of a problem.

Intel is basically saying that the desktop isn’t hurting for 64-bit any time soon, and for most things, that’s quite true.

This doesn’t mean there aren’t activities out there that could use 64-bit processing. They’re just not Joe Sixpack desktop activities; they tend to be on the serious scientific side. Huge databases (and I don’t mean Joe Jr’s baseball card collection) also benefit a lot from 64-bit processing. For more, but not too many more, details on this, go here.

Intel doesn’t deny that some things could use 64-bit processing now. In their view, that’s what Itanium is for. Serious 64-bit is just not a Joe Sixpack activity.

Could gaming benefit from 64-bit? In time, games probably can and will. But it will take a lot of time and effort to program a game from the ground up to use 64 bits effectively and do things you can’t do well, or at all, in 32-bit.

And it’s hard to see good reason for Intel to move mountains to bring Itanium Light to the desktop particularly quickly just for a few games.


Erratic Results

For that reason, I think you’re going to see very erratic benchmarks from Hammer when it uses 64-bit apps. You’re going to have “64-bit” games and applications where there’s really no point in it being 64-bit because the end result will be just a lot more zeroes. Those apps will show little if any gain.

The same will be true for initial “64-bit” games and apps that are just ported over from 32-bit, with little of the optimization work done that could take advantage of those sixty-four bits.

Finally, there will be apps that heavily and seriously use 64-bits, and the gains will be huge. Problem is, they’re not likely to be ones that you would use even at gunpoint.

There’s a real danger synthetic benchmarks could be seriously abused: just write your tests and make sure you use “real” 64-bit code exclusively.

Benchmarks that use averages of a number of programs and don’t tell you what the individual scores are will be suspect, too. If Program #1 shows an 80% improvement, and Programs #2, #3, #4 and #5 show 0%, the average “improvement” of 16% isn’t indicative of anything. (The initial Opteron benchmarks should also be checked to see how common that “average” improvement of 20% ends up being.)
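
Here’s the arithmetic behind that example, spelled out in a few lines of C (the five numbers are the hypothetical ones from the paragraph above):

#include <stdio.h>

int main(void) {
    /* Hypothetical per-program speedups: one big win, four no-shows. */
    double gains[] = { 80.0, 0.0, 0.0, 0.0, 0.0 };
    int n = sizeof gains / sizeof gains[0];

    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += gains[i];

    /* (80 + 0 + 0 + 0 + 0) / 5 = 16% "average" improvement,
       even though four of the five programs gained nothing. */
    printf("average improvement: %.0f%%\n", sum / n);
    return 0;
}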

This Doesn’t Make Hammer Bad. It Doesn’t Make It Good

On the desktop, AMD is and always has been faced with a less-than-optimal situation:

Unless it can convince huge numbers of people that they must have 64-bit no matter what, Hammer will sink or swim based on how it does against the PIV in 32-bit.

Whether x86-64 software will ever go mainstream on the desktop is very questionable. Successful niches, yes, but will tons of people buy a 64-bit OS and then 64-bit applications? Not impossible, but I wouldn’t hold my breath.

If they do, and Hammer starts taking market share away from Intel, all Intel does is put (or just activate) its own x86-64 kludge in Prescott or Tejas.

So the most AMD can expect is to get while the getting is good for a while and force Intel to copy them. That’s hardly bad, and worth doing; it’s just not world conquest.


A Marketing Weapon In The Hands Of?

As a practical matter for the general population, 64-bit is more a marketing weapon than anything else.

It’s probably a better marketing weapon than most, but a weapon is only as good as the soldier.

AMD and Apple are the two 64-bit soldiers of fortune.

In marketing wars, AMD is the regular Iraqi army.

Apple is more like the French army. They have great style and class, they often get praised for their valor, and they do everything except win.

It’s not inconceivable this dynamic duo could convince the world that they have to have 64-bit, just very unlikely. If the real marketing magician doesn’t plan on trying this trick for at least five years, that ought to tell you something.

Again, that doesn’t mean AMD can’t do well with Hammer. It doesn’t have to depend on 64-bit to win. It had better not depend on 64-bit to win.

If it can look the PIV in the eye in 32-bit, and 64-bit is just icing on the cake, it could do very well.

If 64-bit ends up being the meal rather than the icing, though, then AMD will have big problems on the desktop.
