
Apple may drop Nvidia as GPU manufacturer

How much money does nvidia really make off chipsets though? Seems like they make most of their (substantial amount of) money off GPUs.
 
There's an article coming down the pike on overclockers.com about an upcoming Apple release in the next month or so. Mostly speculation right now, but the signs point to updated hardware from Apple. Stay tuned to the frontpage for details.
 
The more concrete, if less juicy, reason is at the end of that article: NV won't be able to make chipsets for Nehalem and its derivatives. However, if the 'spat of heated words' means Apple rejects NV for discrete graphics too, that would be bad for NV, even if only in image. Macs are still a small percentage of total computer sales, so the chipset loss hurts NV but is not surprising. A discrete graphics loss would be more painful and less expected, especially since pricey, 'supercomputer'-marketed Macs are exactly the place NV could sell the GT300.
 
Certainly, but calling a chipset a GPU because it has a GPU in it seems inaccurate to me. That makes my G31 board's chipset a GPU. It does have a (lousy) GPU in it, certainly.
 
Nvidia has confirmed they are no longer developing NEW chipsets, but the general consensus is that Nvidia has one more chipset revision that is ready to go (likely what the next batch of MacBooks/MBPs will use). After that, Apple will indeed need to look elsewhere for chipsets and specifically for integrated GPUs. From what I gather, Snow Leopard relies on GPU processing for a fair deal of its programs, and Intel's current GPU offerings aren't up to snuff (a rough sketch of the kind of GPU check involved is below)...

PC Perspective has lots of good info on this stuff...

:cool:
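For context on the Snow Leopard point above: SL's GPU processing goes through OpenCL, so an application has to find a usable GPU compute device at runtime. A minimal, hypothetical host-side check in C might look like this (assumes an OpenCL runtime is present; build with -framework OpenCL on OS X, -lOpenCL elsewhere):

/* Hypothetical sketch: ask the OpenCL runtime for a GPU device,
 * the capability Snow Leopard's GPU acceleration depends on. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/cl.h>
#else
#include <CL/cl.h>
#endif

int main(void)
{
    cl_platform_id platform;
    cl_device_id device;
    char name[256];

    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) {
        printf("No OpenCL platform found\n");
        return 1;
    }
    /* Ask specifically for a GPU; this is the part that comes up empty
     * on a machine whose graphics has no OpenCL driver support. */
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL)
            != CL_SUCCESS) {
        printf("No OpenCL-capable GPU available\n");
        return 1;
    }
    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
    printf("OpenCL GPU: %s\n", name);
    return 0;
}

Intel's integrated graphics of that era don't turn up anything in a query like this, which is roughly the "not up to snuff" complaint.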
 
Nvidia is kind of stuck: if they can't legally make a chipset for anything Intel that has an IMC (read: everything except 775), and 775 is dying (which it is), that leaves them making chipsets for AMD, their direct competitor. That certainly isn't likely.
Hence, they're leaving the chipset business. It makes good sense to me.
I think even the next generation of Atom chips has an IMC.
 
I heard a rumor that Nvidia might somehow use the open-architecture PCIe lanes off the new Intel CPUs as an interface to get around using Intel's protected/proprietary DMI and/or QPI buses. Since the 16 lanes of PCIe coming off the CPU are an open standard, this IS a viable option from what I gather, but its practical application might be limited (and Intel *might* find a way to legally stop this, but with PCIe being an open standard, I don't really see how). More good stuff from PC Perspective :) A rough sketch of how such a device would still look to software is below.

:cool:
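Part of why the PCIe route is viable: a device hung off the CPU's 16 lanes still enumerates as an ordinary PCIe endpoint, with no DMI or QPI licensing involved. A rough, hypothetical sketch in C (assumes a Linux box with sysfs mounted at /sys; a Mac would go through IOKit instead) that walks the PCI devices and prints each one's negotiated link width:

/* Hypothetical sketch: list PCI devices via Linux sysfs and print the
 * negotiated PCIe link width for each. A GPU on the CPU's own lanes
 * shows up here exactly like one behind a chipset. */
#include <stdio.h>
#include <string.h>
#include <dirent.h>

int main(void)
{
    const char *base = "/sys/bus/pci/devices";
    DIR *dir = opendir(base);
    struct dirent *entry;

    if (!dir) {
        perror("opendir");
        return 1;
    }
    while ((entry = readdir(dir)) != NULL) {
        char path[512], width[16] = "n/a";
        FILE *f;

        if (entry->d_name[0] == '.')
            continue;
        snprintf(path, sizeof(path), "%s/%s/current_link_width",
                 base, entry->d_name);
        f = fopen(path, "r");   /* attribute only exists for PCIe devices */
        if (f) {
            if (fgets(width, sizeof(width), f))
                width[strcspn(width, "\n")] = '\0';
            fclose(f);
        }
        printf("%s: x%s\n", entry->d_name, width);
    }
    closedir(dir);
    return 0;
}

An x16 link in that listing is typically the slot wired straight to the CPU's lanes, but nothing about the device's programming model changes either way.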
 
From inside the article:

Those claiming to be inside the discussions have told SemiAccurate, the new project of a previous Inquirer editor with sources inside NVIDIA, that Apple may not agree to another such deal for 3-4 years as a result of the heated words. It wouldn't result in an immediate exit, as the recentness of implementing NVIDIA chipsets into nearly all Macs means some models will keep their existing designs for a long time, but could already result in some comparatively near-term updates shedding the NVIDIA platform.

More Charlie crap. All I need to read.
 
You guys are funny about that guy, heh.

Once Apple is done with 775 (I give it a year, tops), they're done with Nvidia, regardless of what Charlie said.
 
Certainly, but calling a chipset a GPU because it has a GPU in it seems inaccurate to me. That makes my G31 board's chipset a GPU. It does have a (lousy) GPU in it, certainly.

You're not reading between the lines though. Sure, we know NV won't have a Nehalem-derivative chipset, but if Apple is tiffed enough they might drop or ignore NV *discrete* solutions as well, both mobile and desktop, and also NV embedded solutions like Tegra. That would be an additional loss for NV on top of the expected loss of mobile chipsets.

I heard a rumor that Nvidia might somehow use the open-architecture PCIe lanes off the new Intel CPUs as an interface to get around using Intel's protected/proprietary DMI and/or QPI buses. Since the 16 lanes of PCIe coming off the CPU are an open standard, this IS a viable option from what I gather, but its practical application might be limited (and Intel *might* find a way to legally stop this, but with PCIe being an open standard, I don't really see how). More good stuff from PC Perspective :)

:cool:

So you're saying they could make a, mmm... coprocessor I guess you'd call it, hung off the PCIe lanes? I say coprocessor because it only makes sense to add it for GPGPU apps; Intel integrated graphics are good enough now for HD acceleration. That could be interesting, but it's not terribly innovative when you think about what it really is. Maybe they'd be able to override the built-in graphics and use it to output to a display, but when you think about it that's practically the same as a discrete GPU. It would have to be a significant advantage to add that back into laptops that could otherwise go without, same as discrete graphics now. On the desktop it might have some use if they make it powerful enough, but then you might as well talk about a regular video card.
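To make the coprocessor idea concrete: the workload shape is ship data across the PCIe link, run a kernel on the device, and copy the result back, which is why it only pays off when the compute saved outweighs the bus traffic. A hypothetical, minimal OpenCL vector add in C (error handling trimmed, an OpenCL-capable GPU assumed, not any actual NV product):

/* Hypothetical sketch of a PCIe-attached GPGPU offload: copy inputs to
 * the device, run a kernel, read the result back over the bus. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/cl.h>
#else
#include <CL/cl.h>
#endif

static const char *src =
    "__kernel void add(__global const float *a, __global const float *b,"
    "                  __global float *c) {"
    "    int i = get_global_id(0);"
    "    c[i] = a[i] + b[i];"
    "}";

int main(void)
{
    enum { N = 1024 };
    float a[N], b[N], c[N];
    size_t global = N;
    cl_platform_id plat;
    cl_device_id dev;
    cl_int err;

    for (int i = 0; i < N; i++) { a[i] = i; b[i] = 2.0f * i; }

    clGetPlatformIDs(1, &plat, NULL);
    err = clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    if (err != CL_SUCCESS) { printf("no GPU\n"); return 1; }

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "add", &err);

    /* Each of these buffer copies crosses the PCIe link */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof(a), a, &err);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof(b), b, &err);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof(c), NULL, &err);

    clSetKernelArg(k, 0, sizeof(da), &da);
    clSetKernelArg(k, 1, sizeof(db), &db);
    clSetKernelArg(k, 2, sizeof(dc), &dc);
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof(c), c, 0, NULL, NULL);

    printf("c[10] = %.1f (expect 30.0)\n", c[10]);
    return 0;
}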

You guys are funny about that guy, heh.

Once Apple is done with 775 (I give it a year, tops), they're done with Nvidia, regardless of what Charlie said.

Yes, this is one case where Charlie is 'right' but it's a pretty obvious statement and not unique information.
 
I think what will end up happening with the next-gen "Nehalem" MacBooks is that Apple is going to use discrete graphics. It does cost more, but it will let them keep GPU acceleration and it solves the whole Intel problem.
 
Yes, this is one case where Charlie is 'right' but it's a pretty obvious statement and not unique information.

I have no doubt that he is correct here and he is probably correct in his other articles about nVidia, however, his absurdly negative spin on all things nVidia makes it hard to discern the truth. I have no doubt that this issue is about Apple wanting to move on to better processors (i5, i7 and i9) and nVidia's chipset limitations. What I seriously doubt is that this is because of some fighting between the two companies. Coming from Charlie, I can safely ignore the negative, which doesn't leave me with much more than what is obvious.
 
Yes, Charlie's problem is not content; it's that he sensationalizes everything.

As for the influence of bad blood, I wouldn't put that past Apple; they've done it before. And while Apple's 'it just works' marketing is a farce anyway (Apple stuff breaks all the time), they surely don't like it when the breakage comes from another company's product. The cost of losing the integrated chipset business isn't good for NV, but it was known. If the bad blood affects desktop, discrete mobile, and embedded design wins too, that would make NV's 'bumpgate' mistake, and their arrogance in handling it, much more damaging.
 