To a limited degree, the Business Week article is good because it at least notes AMD’s economic weaknesses, something propeller-heads often ignore. However, it also shows how stupid beancounters can be. They get pretty much everything else wrong, or see a tree here and there, but not the forest.
For instance, they get the hots for 300mm wafers. Larger wafers are a good thing to have, but when you consider that they will save Intel maybe $10 a CPU, that’s hardly a crushing economic blow, and it wouldn’t be unless CPU prices were a lot lower than they are now.
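To put that $10 in perspective, here is a back-of-the-envelope sketch. The $10 saving is the figure cited above; the selling prices are illustrative assumptions, not numbers from the article:

```python
# Back-of-the-envelope: how much does a $10-per-CPU wafer saving matter?
# The $10 figure is from the text; the selling prices are assumptions.
saving_per_cpu = 10.0

for asp in (150, 75, 40):  # assumed average selling prices, high to low
    share = saving_per_cpu / asp
    print(f"At ${asp} per CPU, a $10 saving is {share:.1%} of revenue")
```

The point falls out directly: at today’s prices the saving is small change, and only if average selling prices collapsed toward the cost floor would $10 a chip start to look like a weapon.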
The article is rife with inaccuracies, irrelevancies, overemphasis and omissions, with a few close-but-no-cigar observations. Nonetheless, in general, they’re right for the wrong reasons.
Over the course of time, we’ve pointed out that AMD is faced with a short-term and long-term financial crisis. The short-term crisis was getting Hammer up and going while trying to live off aging Athlon sales.
While there are still some doubts about AMD being able to do this, an upsurge in CPU demand and a lack of pricing pressure from Intel have at least relieved much of the immediate pressure on AMD and bought it some more time.
However, the long-term financial problem still remains.
Fab costs keep escalating, and require more and more CPUs to be made per fab to pay for those costs. Manufacture itself is getting trickier and trickier as the architectures get smaller and smaller and plain old silicon can no longer hack it.
It is economies of scale that will eventually stop financially weak companies like AMD from operating like a little Intel. Indeed, it is hard to see how even the big Intel can keep up business as usual for more than a few more generations if current trends continue.
The key phrase is “operating like a little Intel.” AMD doesn’t have to follow that route.
This article outlines three alternative routes they could take.
SOI: Slave Of IBM
AMD simply can’t afford a new 65nm fab plant. It couldn’t afford one even when times were good, much less now.
So it looks like they’ll depend on IBM to make their chips for them by the time 65nm rolls around.
At first glance, that doesn’t seem to be too big a deal. After all, companies like nVidia and ATI use foundries all the time to get their chips, and they seem to do OK.
The CPU industry is a high fixed cost, low variable cost industry. If AMD becomes an IBM customer, it won’t have to pay five billion or so for a new fab, but IBM is not the fairy godmother. It has to pay for the fab, and it will, by charging its customers a hefty sum for using it. IBM will be the company pocketing any profits from economies of scale, not AMD.
Intel, on the other hand, will retain lower costs/increased profits from any economies of scale.
IBM doesn’t make many CPUs compared to Intel. That’s why it’s looking for customers, to keep its one new fab busy. Since its production is a bit iffier than Intel’s, it will likely charge its customers more rather than less initially to recoup its fab costs.
On the other hand, Intel has a big enough market so that it doesn’t have to worry about spare capacity in new fabs yet; it can just build fewer new fabs.
Both these factors mean that Intel should be able to produce CPUs at much less cost to itself than AMD, and thus have the flexibility to undercut AMD if it so chooses.
It’s doubtful Intel would ever try to undercut AMD, but an IBM foundry deal would likely put limits on AMD’s ability to undercut Intel in the future.
One can exaggerate the benefit of lower production costs. It’s very important in many, probably most, industries, but the CPU industry is not like most industries. If you can get $150 for your product, it’s not a life-or-death matter whether it costs you $10 or $20 to make it. The difference affects the level of your profits, not the likelihood of your future existence.
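The arithmetic behind that claim is worth spelling out, using the article’s own illustrative numbers ($150 selling price, $10 vs. $20 unit cost):

```python
# Sketch of the margin argument: in a high-margin business, a doubled
# production cost shifts profits only modestly. Figures are the
# article's illustrative numbers, not real AMD or Intel costs.
price = 150.0

margins = {}
for cost in (10.0, 20.0):
    margins[cost] = price - cost
    print(f"Unit cost ${cost:.0f}: gross margin ${margins[cost]:.0f} "
          f"({margins[cost] / price:.0%} of the selling price)")

drop = (margins[10.0] - margins[20.0]) / margins[10.0]
print(f"Doubling unit cost trims gross margin by only {drop:.0%}")
```

Doubling the cost from $10 to $20 cuts the gross margin from $140 to $130, a hit of roughly 7%: painful to the bottom line, but nowhere near existential.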
What’s probably more threatening in the long run is how independent AMD can really be if they become that dependent on IBM.
There’s one big difference between AMD and ATI or nVidia (or Apple, for that matter). ATI and nVidia don’t sell video chips. They sell video cards. Apple doesn’t sell CPUs, they sell computers. For them, a GPU/CPU is just a component in their product, and they add unique value to their products in other ways. For nVidia and ATI, since both get their video chips the same way, there’s no competitive disadvantage. Apple doesn’t really compete in the computer market based on price, so their CPU cost doesn’t matter a whole lot.
For AMD, though, the chip is the product. Take fab making away from AMD, and all you have left is a design and a sales team.
Nor is IBM a disinterested chip foundry like a TSMC or UMC. It is also a CPU competitor. It would be very convenient for IBM if Hammers evolved into a PowerPC clone, or if IBM took the best parts of Hammer technology and decided to give Intel a run for its money in the transition to 64-bit.
It also buys its desktop CPUs from Intel.
There seems to be this general feeling that IBM would be AMD’s fairy godmother. To me, it looks more like sticking your head inside the mouth of a lion and trying to live off cleaning his teeth.
OK, it’s a lion that isn’t particularly hungry and seems downright friendly. For the moment. Until, maybe, Intel makes an offer too good for IBM to pass up, for instance, and the price IBM has to pay is AMD.
Being a pawn in IBM’s grasp isn’t an unalloyed benefit. IBM can play this all sorts of ways with all sorts of consequences, from fantastic to fatal. The lion might fight off the tiger that’s out to kill you. Then again, it might bite your head off and offer your carcass to the tiger, too.
The A Way
The “A” in “The A Way” stands for “Alpha.” AMD could decide to forget about competing against Intel across the board, and just make a relative handful of high-performing CPUs at high prices to compete at the high end.
Since they’d no longer be in the mass production business, a few extra dozen dollars in production costs would hardly matter.
The problem with that approach is that AMD would have to effectively lay off almost everyone in the company since they would no longer be needed, and pin all their hopes on surviving in a market they’ve only recently entered.
It should also be noted that Alpha hardly fared well taking this approach, either.
The V Way
The “V” in “The V Way” stands for “Via.” The Via approach is, “To hell with CPU power, make computers small, quiet and cheap.”
Via’s main problem is that its products take the “to hell with CPU power” part too literally; they are rather underpowered.
But what if AMD took the same approach? They certainly have better CPU designs than Via, and a 90nm or 65nm SOI version of an Athlon XP or mini-Hammer running at just 1.5GHz could give Via a run for its money in the small, quiet and cheap category.
AMD could execute the Via strategy better than Via, and an AMD-Via alliance/merger would be even better. Leave Intel the big boxes, and see how popular they remain with Joe Sixpack and Suit when a real, cheaper alternative emerges.
Fab making would continue to be a worry, but not chasing ever-higher speeds would make it easier for a less dominant foundry like TSMC to handle, and since the emphasis would be on tiny, good-enough CPUs, costs could be kept reasonable.
A risk? Of course, it’s a risk. AMD would be staking the company on a much different vision of what the PC will be in the future than Intel’s.
But AMD can’t beat Intel playing its own game. Look at what happened with Athlon. For a year, AMD had a decided advantage over Intel. What happened? AMD went from 18% to 23% marketshare. People just won’t leave Intel for a me-too product, and sorry, but to Joe Sixpack, Hammer will be a me-too.
Even if the demand had been there back in the Athlon days, AMD couldn’t have possibly supplied it; it just didn’t have the capacity to do so, and it never will so long as they try to match Intel in making more and more complex and faster chips.
It’s time to start playing a different game.
AMD should stop banging its head against a wall trying to out-Intel Intel with no money and instead offer tiny chips for tiny systems good enough for 80-90% of people with computers; chips tiny enough that you or somebody else can eventually make three or four of them in the same die space as one of Intel’s. Stop running the rat race. Give Joe Sixpack something different and cheap. Embrace miniaturization, and aim for the day of the cellphone/computer.
It may not work, but what AMD is doing now isn’t working, either.