What About nVidia? . . .

After looking through a number of forums, we found that the question most often asked about the AMD/ATI merger is, “What happens to nVidia?”

We’ll try to answer some of the questions falling into that general category in this article. If you’d like to hear at least some of the answers from the horse’s mouth, you might want to read this interview with Dirk Meyer.

1) Does that mean I won’t be able to use an nVidia card in an AMD/ATI board anymore? No, it doesn’t mean that at all, any more than you can’t use an nVidia card in a current ATI motherboard today. That’s what standards are all about: if you have a PCI-E video slot, any PCI-E video card is going to work in it.

Perhaps in a few years, it may turn out that “ATI” video cards work better in AMD-ATI systems than in nVidia-based systems, but that may never happen, and even if it does, it probably won’t happen until 2008 or later, when the two companies are really coordinating with each other on projects. We’ll talk about the deep future a bit later.

2) Will nVidia be allowed to continue making AMD boards? Yes. Given that nVidia has about 90% of the AMD mobo market, I don’t think they have any choice in the matter. 🙂

However, in the course of time, it probably will be fair to say that the AMD mobo business could become . . . well, biased against nVidia. What will most likely happen is that corporate users will want all-AMD platforms, but that’s no different than corporate buyers wanting all-Intel CPU/mobo solutions.

It is possible that AMD might share some extra “secrets” with the ATI guys down the hall that it won’t share with nVidia, but again, look at Intel. They haven’t been exactly friendly over the years to chipset competitors, but competitors have survived.

The area where nVidia will probably be hurt most is motherboards with integrated graphics, but again, that’s largely a corporate issue.

3) But ATI motherboards suck! One of them killed my grandmother!! OK, no one quite said the second part, and we’re not saying the first part is necessarily true, either.

But even if they did/do suck, being part of the merged company will give the mobo makers excellent incentive to improve matters. 🙂

4) Won’t nVidia get frozen out when graphics start being incorporated into CPUs? First, that isn’t going to happen tomorrow. It probably won’t happen anywhere for at least two years.

I think a quote from the abovementioned interview ought to settle a lot of jangled nerves:

“[Dirk Meyer] said he expected the graphics processor (GPU) would remain separate in some markets, notably those that require the fastest 3D rendering, such as high end gaming, “for as long as I can see” (our emphasis). But he said in other markets, where you want the lowest cost or the lowest power, it could become a single chip. In particular, he said he could see the combined company creating a platform for the needs of emerging markets through a better job of integrated graphics, consistent with AMD’s 50×15 vision.”

I think from that and other information you can derive the following:

a) You won’t see graphics-in-CPU for anything more than very basic video use on the low end in the next few years. You’re probably more likely to find a chip like this in your phone than in your PC.

b) On the higher end, you may well see some separate graphics co-processors in play, but a separate chip is a separate chip. nVidia can make socketed chips, too, and Torrenza is supposed to be an open standard.

c) Talk of high-end video incorporated into the CPU is going to be a post-2010 phenomenon, and may not happen at all, if for no other reason than GPUs are awfully big, bigger than most CPUs. Realistically, you might want to try it in a few niches using 45nm process technology, but you’ll probably need 32nm and a hell of a lot of redesign to do it comfortably. If it does happen, by the time it does, the PC is going to be so radically different from what it is today that today’s concerns will be irrelevant, if not silly.

The most likely path for GPU-in-CPU is for companies to start at the bottom and work their way up. Socketed GPUs are the most likely interim step, and probably not before 45nm.

5) Will Intel buy nVidia? Could they? Yes. Will they? Probably not. In the long term, the only reason why Intel would buy nVidia would be to get better video, and Intel culture doesn’t like “not made here.” Then again, Intel’s been more than a bit shaken up lately, so what was inconceivable just a few years ago is no longer inconceivable today.

In the short term, buying nVidia would rather mess up AMD’s motherboard picture, but that’s like shooting someone after you’ve thrown him off the roof. I’m sure Intel would like to, but spending six billion dollars and chucking a good chunk of the purchased company’s sales just to be extra spiteful isn’t going to happen, especially when Intel is going through financially rough times.

Conclusions

Nothing terribly bad is going to happen to nVidia in the short term. In the longer term, nVidia’s best bet is to keep doing what they’ve been doing, raising video standards so that they can’t be ignored. Whether it’s a video card, a socketed chip, or some licensed circuit design, nVidia can continue to survive and thrive.

Ed

