Larrabee: Intel’s Ideological Chip

If you’re going to get Larrabee, you need to understand that it is an ideological chip.

What do I mean by “ideological” when it comes to a graphics chip?

The question for Intel is, “How can we most leverage our advantages in the chip industry and turn our competitors’ ways of doing business into disadvantages?”

The short answer is Larrabee (and future derivatives). The longer explanation is "We design a computing world based on a swarm of x86 processors that we can build more cheaply than our competitors can, because we have better chip-making technology than they do. We're going to try this out on the graphics industry first."

Did you notice I didn’t say anything about performance?  That’s because Intel doesn’t need to win in order to succeed.  Larrabee doesn’t have to be very good, at least not initially, but it does have to be cheap.

Before you start yelling, consider the latest market share numbers for graphics:

Intel:   47.3%
nVidia:  31.4%
AMD:     18.1%

Not even Paul Otellini would claim that Intel’s current graphics are world-beaters, even if you limit the field to just integrated graphics.  And yes, Intel now has a bigger lead than usual, but they do normally come in first in this contest, even though they don’t make discrete graphics cards.  

How can this be?  The answer is:

1) Only a relatively small proportion of people want advanced graphics, and
2) Intel can wrap so-so but good-enough, cheap graphics in a relatively better chipset package and sell that successfully.

Larrabee no more needs to beat the highest-end nVidia or AMD GPUs in the hottest games when it comes out than the X3100 needs to beat nV/AMD notebook offerings today.  What it needs to do is be (profitably) a good deal cheaper than the nV/AMD offerings for any given level of performance.

To put it another way, if Intel can provide the same graphics power with Larrabees at half the chip cost of even lower-end nVidia/AMD GPUs, they may not get great reviews, but they’ll have a real winner on their hands.  Even if the first generation or two or three of these Larrabees aren’t so good, so long as they’re decent enough to be competitive at the low to medium end, they can gut nVidia/AMD’s lower-end product lines by making them unprofitable.  Believe it or not, nV/AMD would quickly go broke if all they could sell was $500 video cards.  The graphics industry works on economy of scale, too (though not as much as the CPU industry).  High-priced items get the attention and make the profits, but the far larger lower-end sales pay the company’s bills.  If all nV/AMD could sell were high-end cards, that $500 video card would become a $1,500 card pretty fast, and the companies would price themselves out of existence.

It’s not easy to see how nV/AMD could get the cost of their GPUs down quickly if Intel’s approach ends up being much cheaper.  Having someone else make the GPUs saves nV/AMD on capital costs, but it also means the cost of each GPU is considerably higher.  This will probably be aggravated by the fact that the fab yields on a huge GPU will likely be lower than on a much smaller Larrabee cluster.  Neither company can afford to build its own fabs (and couldn’t do so for years even with the money), and copying the idea behind Larrabee would take a lot of time, too.

But Larrabee isn’t just about a video card.  It’s much, much more than that.  What Intel eventually wants is total flexibility in product design, with the single Larrabee-type core as the basic building block.  Put one or a dozen or eventually a hundred in a graphics card, put them inside a regular CPU, or even make them a substitute for a regular CPU.  If an nV/AMD GPU design isn’t quite competitive enough, it’s back to the drawing board.  With Larrabee, if X clusters don’t quite cut it in a product, make it X+1 clusters.

You may say, “What about Atom?  What about Intel’s ‘regular’ processors?”  The answer is that Intel is covering all the angles, and each angle has the potential to enhance the others.  Future Larrabees could be based on different Intel CPUs, and future Intel CPUs could incorporate Larrabee technology.

We could go on and on about the potential advantages Larrabee could give Intel, then go on and on about the potential disadvantages Intel would face, but I think an analogy would be better.  The discrete video card industry is a lot like organic food stores.  They’ve led a fairly sheltered life and are fairly small.  They compete against each other, but neither has the interest nor the wherewithal to seriously try to push the other to the wall via pricing.

Intel is like Walmart.  They are not small, and they have the interest and wherewithal to push competitors to the wall.  Larrabee is like Walmart saying to the organic food stores, “We’re jumping into your market and taking over.”  Yet Walmart doesn’t always succeed, and neither does Intel.

Both Intel and Walmart are what they are because they’ve invested vast amounts of money building up huge infrastructures to fully exploit economies of scale.  Intel obviously knows how to make CPUs cheaply and well; Walmart can sell merchandise, even food, cheaply and well.  But selling organic food or advanced graphics is a new business that requires a new infrastructure, and that infrastructure will not be an exact copy of what they already do.  Walmart has worldwide commercial food chains down, but networks of relatively local organic farmers are a different story.  Intel has CPUs down, but getting first dozens, then hundreds of tinyCPUs to do GPU work well will require bringing huge new hardware and software infrastructures into being.  Both would have to rely on independent outsiders who may not be able, or may not even want, to supply them; in Intel’s case, experienced programmers may not want to learn new x86 graphics tricks.  Finally, a significant chunk of customers dislike or even fear these big companies simply because they are big companies.

Then again, while you can easily see why the two wouldn’t be sure-fire winners in their new businesses, you probably wouldn’t bet on them being sure-fire losers, if only because both have the wherewithal to keep at it and learn from their mistakes.  Larrabee is not a one-shot deal, but a long-term threat.  Even failure could be a success.  Larrabee is really an excuse to practice designing and building many-CPU systems and get paid for it.  If Larrabee fails as a video card, but Intel learns enough to make one of its children the dominant tinyCPU in cellphones some years from now, would you call that a failure?

Larrabee looks good on paper; a bit too good, as a matter of fact.  It seems suspiciously convenient for Intel to say that the future lies solely in its new philosophy and its fabs.  The potential flexibility of the Larrabee philosophy is breathtaking until you begin to consider what and how much Intel needs to do to make it actually work.  A “show me” attitude is definitely in order here; this could be a Merced/Itanium in the making.

But if they succeed, the first, or second or third time, God help Intel’s competitors.   

Ed

