Not Just Gamers . . .

No one (well, no one sane on the subject) will ever accuse me of being an AMD fanboy.

Nonetheless, some of the utterings from financial types just leave me shaking my head.

First, AMD released its earnings, and given the price war, the company's performance was impressive.

Nonetheless, their stock got hammered because their gross margin (i.e., profit before the indirect expenses of running the company) dropped from 56% to 51%. Earth to Wall Street: that kind of thing happens during a price war. Intel's gross margin has dropped more and is now lower than AMD's, yet Intel's stock price has actually gone up a bit.

Now we have some analyst who says AMD's going to have to dump the graphics card unit.

Why does he think that?

Well, first (yes, I'm exaggerating a bit, but not much), he thinks the future belongs to integrated graphics. That may well be, but "future" covers a lot of ground, and one could argue that integrated graphics is the wave of the past and present, too. Nonetheless, there is a video card market for people who want something better than that.

Second, he thinks nobody who buys Intel will ever buy an ATI video card.

Apparently, he believes that because Intel and nVidia are unlikely to support Crossfire technology (as they certainly would have had AMD not bought ATI), and since nobody buys fewer than two video cards at a time these days, that pretty much shuts the door in AMD/ATI's face.

It gets better. He says at one point that OEMs "want to be bound to a single GPU supplier." I see, there are so many GPU manufacturers to choose from! Who wants to be bound to ATI when you can become a slave of nVidia instead? Do you think even Intel would want that? No wonder they're mumbling about making serious video cards.

The brilliance continues.

He mentions Intel building its own cards. Those who recall the Intel 740 might say that's easier said than done, and video cards are vastly more complicated these days. Intel's experience with integrated graphics wasn't sufficient then, and certainly wouldn't be now without some serious reinforcement from cutting-edge people and, more importantly, some time. But gee, by the time Intel gets its act together, separate video cards will be going extinct, so why bother investing big time in dinosaur futures?

He says AMD is "unlikely to integrate a high-end, billion-transistor GPU." No, not immediately, but until they can, that's what the video card department is for, and the lessons learned there transfer over to any CGPU project you might have.

He suggests that AMD/ATI could sell GPU technology licenses to create a revenue stream. Uhh, sell to whom? Who would want to buy? nVidia? I don't think that's a good idea, even if nVidia would like it.

Intel? Hmmm, we don’t want to make advanced video cards, so we’ll let Intel do it on the cheap instead.

To top it all off, he points out that getting rid of the video card unit would only cost ATI about 65% of its revenues. A trifle.

Clearly, this guy has a major problem with ATI video cards, and he’s not going to let anything like contrary facts or consistency or financial sense stop him. You’d think a rogue Radeon wiped out his family (or at least his gaming clan) once, and he wants revenge.

No, that's not the reason. It's worse, much worse than that. The end of the article finally reveals why video must go: the gross profit margins on video cards aren't big enough, so the unit has to go to get the percentages up a few points!

Gasp!!! Who needs Halloween when you have horrors like this staring you in the face!

What this all boils down to is that someone wants to make his favorite number go up soon, no matter what the real-life consequences are.

It’s not just gamers who get overly obsessed with a single benchmark.

Ed

