AMD and Intel Earnings

Every once in a while, you hear rumors (which in this case constitute contagious wishful thinking) that
Apple is going to go x86.

In the last couple of years, these rumors have mostly involved Apple
using AMD processors, and the wishful thinking is becoming increasingly desperate because even Maclunatics are beginning to realize
that 3GHz is more than 1GHz, no matter what Uncle Stevie says.

A few days ago, the Mac world was a bit stunned to hear press reports that Steven Jobs did not dismiss
the notion of Apple going to x86, perhaps as early as 2003.

Unfortunately, for both Apple and AMD, this was about as manufactured a story as you can get.

Reuters said the following:

“Some analysts have also urged Apple to move to microchips from Intel Corp. from those made by Motorola Inc. and International Business Machines Corp. to cut costs.”

“Asked about that possibility, Jobs said that first the company had to finish the transition to the OS X operating system, expected around the end of this year.

“‘Then we’ll have options, and we like to have options,’ he said.”

This turned into:

“Apple CEO Steve Jobs said this week that his company would consider moving to Intel chips, but that he would wait until at least 2003 because the transition to Mac OS X was more important.”

There’s only one small problem with all this. Here is what the man apparently said in total:

“The roadmap on the PowerPC actually looks pretty good and
there are some advantages to it. As an example, the PowerPC has
something in it called AltiVec, we call the Velocity Engine — it’s a
vector engine — it dramatically accelerates media, much better than, as
an example, the Intel processors or the AMD processors… so we actually
eke out a fair amount of performance from these things when all is said
and done. And the roadmap looks pretty good. Now, as you point out, once
our transition to Mac OS 10 is complete, which I expect will be around
the end of this year or sometime early next year and we get the top 20%
of our installed base running 10, and I think the next 20 will come very
rapidly after that. Then we’ll have options, then we’ll have options and
we like to have options. But right now, between Motorola and IBM, the
roadmap looks pretty decent.”

Sounds quite a bit different, doesn't it? Sounds more like Michael Dell talking about looking
at AMD every once in a while, doesn't it? Actually, it almost makes Michael Dell sound half-AMDroid in comparison.

The Reality for Apple

The PowerPC is the only thing standing between Apple as it is, and Apple turning into a software company.

There are basically two big differences between an Apple machine and an x86 machine.

  • It uses a PowerPC CPU/mobo rather than an x86 CPU/mobo.
  • It uses MacOS/MacOS X rather than Windows.

Besides that, there's no real difference. The vast majority of components in an Apple computer are off-the-shelf PC components. That was Mr. Jobs' major contribution to Appledom when he came back (and in all fairness, it probably saved the company).

Apple will tend to include relatively new types of components earlier in the production curve than PC OEMs, but it's rare that you can't get the same thing for your PC.

If Apple goes to an x86 processor, there will no longer be any reason to pay Apple anywhere from a couple hundred to a thousand dollars extra for the privilege of running MacOS-whatever.

For sure, Apple would try to stop that by trademarking and copyrighting the BIOS up the kazoo, just like IBM did almost twenty years ago.

Then somebody would reverse-engineer the BIOS, like Phoenix did almost twenty years ago.

Then Apple would sue them, like IBM sued Phoenix almost twenty years ago.

Finally, Apple would lose, like IBM did almost twenty years ago, and within six months of the decision, the whole Taiwanese mobo gang would have mobos perfectly capable of running Windows and MacOS X-86. For about $10 extra.

Then Dell would offer them, and Apple hardware would be wiped off the face of the earth within a year, along with most of Apple.

Except the OS guys. Watching MacOS X compete against Windows might be real interesting. There are certainly enough people ticked off at MS, and if Apple can't make Unix cuddly enough, no one can.

Personally, I think MacOS X-86 would have grave problems competing, simply because they'd have to come up with eight zillion drivers for all makes and models of old and new equipment, and I don't think they could do it quickly enough.

The biggest problem, though, is Steven Jobs. He and Apple didn't, and still don't, accept the notion that computing should be an activity for everybody. They still think it should be an elite activity, with a price tag to match. Well, at least their kind of computing.

Giving up the PowerPC means giving up any claim, no matter how delusional, to being different (and, at least in the Mac mind, better). They'd stop being a big fish in a small (and shrinking) pond, and get chucked out in the ocean with the sharks.

So, no, Apple will never go to an off-the-shelf x86 system. Should they ever lose Motorola and IBM as suppliers, they'll pay Intel or AMD to make their CPUs "different" from the regular ones. Different as in incompatible.

The Reality for AMD

We think of AMD as little David confronting Big Goliath Intel, but in the CPU business, AMD is an even bigger Goliath compared to Apple than Intel is to AMD.

First quarter, AMD made eight million processors, or ten times the number of CPUs needed by Apple.

Second quarter, AMD made only six million processors. That's still seven-and-a-half times as many as Apple's total requirement.

Let me put it this way. For AMD to have gotten back the unit sales they lost last quarter, they would need all the sales from two-and-a-half Apples.
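
To make the arithmetic behind that explicit, here's a quick back-of-the-envelope sketch in Python, using the round figures above. The only derived number is Apple's roughly 800,000 CPUs a quarter, which is just the "ten times" figure turned around.

# Rough back-of-the-envelope check of the unit numbers above.
apple_quarterly_cpus = 8_000_000 / 10   # "ten times the number of CPUs needed by Apple" -> ~800,000
q1_amd_units = 8_000_000                # first-quarter AMD processor output
q2_amd_units = 6_000_000                # second-quarter AMD processor output

print(q2_amd_units / apple_quarterly_cpus)                   # 7.5 -> "seven-and-a-half times" Apple's requirement
print((q1_amd_units - q2_amd_units) / apple_quarterly_cpus)  # 2.5 -> "two-and-a-half Apples" of lost unit sales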

Put another way, Apple and XBox are about the same-sized markets.

Extra business is extra business, but if Apple decided to use Hammers soon, AMD would be doing Apple (really the Mac users) a big favor, not the other way around.

Ed
