
Nvidia Prepares For 40nm GPUs


Shiggity, Member (joined Dec 16, 2007, Chicago, IL)
VR-Zone has learned that Nvidia is moving to the 40nm process much more quickly than expected. Nvidia is usually conservative about moving to a finer process technology, but it seems that this time round they are determined to get there ahead of AMD. The move is risky, but Nvidia has probably done serious simulation and prototyping to estimate the yield and stability of TSMC's 40G process. The first GPUs on the 40nm process will be the mobile parts codenamed N10x, the successor to the current 65/55nm NB9x. Nvidia is expected to apply a die shrink to its desktop parts as well, to lower cost and the thermal envelope, so there might be a chance of seeing a GX2 solution for GT200.

http://www.vr-zone.com/articles/Nvidia_Prepares_For_40nm_GPUs/6020.html
 
Well zippity-do-da. I know the 55nm refreshes are just around the corner, but I'm curious when 40nm will come out. My bet would be not until Q1 '09.
 
It sounds like they are just skipping 55nm and going straight to 40nm.

A 2x GPU card with 40nm GT200 cores sounds pretty sweet :D
 

That would also put them seriously behind ATI in terms of the foot-race. Remember, it's going to be a while before 40nm chips in notebook form hit the market, and that's their first step.

By the time that happens, ATI will have revised RV700 cores on the scene, with good yields and lower prices. And by the time 40nm desktop chips come out?

Nvidia needs to cut prices, take the loss, and move on to the next generation. It was clear they had kind of an overcompensating idea with G200. After seeing the 4800 series in action, hopefully they learn that "bigger isn't always better".
 

I agree with you, but the time frame is what they didn't mention, and it's obviously the most important factor. To me, the article made it sound like these would be coming relatively soon.

I'm sure they are rushing to get their new mobile parts out after that horrible incident where all their mobile GPUs were failing and they had to eat a ton of losses. They probably decided to axe those and start on the new 40nm versions after that.

Nvidia is still in decent shape; the G92 cards are still pretty decent and the GT200s are still competitive. They have some time before they HAVE to get new cards out.
 
Expect a 40nm-based G200-type chip to come out in late Q1 2009 at the earliest, which would line it up with the RV870 chip family, which will also be on 40nm.
 

Where'd you see that information on the RV870?

A new core less than a year after RV700, jeeeeeeeeeeeez if true. GPUs on a finer manufacturing process than CPUs, will that be a first?
 
ATi seems to be trying to get back onto the 6-month refresh cycle they were on before getting bought by AMD and, to put it lightly, struggling with the R600 chip.

Although this is getting off topic, the main goal of the RV870 is to reduce the power envelope more so than to add performance. They are shooting for twice the performance per watt, but expecting only a 20% performance increase compared to the RV770.

However, ATi planted fake RV770 info pre-launch that made it look worse than it actually was, to catch nVidia, and everyone else, by surprise. So don't take this as solid fact, much like the article that started this thread, which basically assumes the GT200 will eventually get the 40nm treatment as well.
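For what it's worth, those rumored RV870 numbers imply a sizable power drop. A quick back-of-the-envelope check (treating the 2x perf/W and +20% performance figures strictly as rumor):

```python
# Rumored RV870 targets relative to RV770 (rumor, not confirmed).
perf_per_watt_gain = 2.0   # 2x performance per watt
perf_gain = 1.2            # +20% raw performance

# power = performance / (performance per watt)
relative_power = perf_gain / perf_per_watt_gain
print(relative_power)  # 0.6, i.e. ~40% less power than RV770
```

So if both figures held, the chip would draw roughly 60% of RV770's power, which fits the "reduce the power envelope" goal.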
 
No reason why they can't :p GPUs use too much power and produce the majority of the heat in a PC; they should be the component that manufacturers concentrate on making smaller and more efficient.
 

Screw that, give me bigger and faster! :D

I needs real time ray tracing soooooooooner.
 
I'd say assuming the GT200 will get the 40nm treatment is a good bet. It makes the cards better than the first batches, and it lowers the variable cost per unit on nVidia's end.
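Rough numbers on why a shrink lowers variable cost: more dies per wafer. A sketch assuming the widely reported ~576 mm² GT200 die at 65nm, ideal (40/65)² area scaling, and a common dies-per-wafer approximation (real yields and design changes would shift the picture):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common approximation: wafer area / die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

area_65 = 576                        # GT200 at 65nm, widely reported (mm^2)
area_40 = area_65 * (40 / 65) ** 2   # ideal shrink, ~218 mm^2

print(dies_per_wafer(area_65), dies_per_wafer(area_40))  # roughly 3x more dies
```

Wafer cost is more or less fixed per run, so roughly tripling the die count is a big cut in cost per chip even before yield effects.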
 
No reason why they can't :p GPUs use too much power and produce the majority of the heat in a PC; they should be the component that manufacturers concentrate on making smaller and more efficient.

That's not even a top-5 goal for GPU manufacturers, because they already do that.

Power-conscious users (i.e. workstation, basic home use) don't buy performance graphics cards. Even workstation cards have low heat dissipation and power consumption.

Most enthusiasts who use high-end gaming cards aren't concerned with power consumption or heat dissipation. If it can get the frames, that's what matters.

It's like saying a sports car needs good gas mileage. Sure, better mileage would be a selling point, but everything else comes first: horsepower, handling, leather seats. Hell, even the cupholders would come before gas mileage for a sports-car driver.

The only reason GPU manufacturers go for die shrinks is better performance with the same architecture. Smaller and cooler means being able to run the same tech at faster speeds in bigger configurations.
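That last point can be sketched with the textbook CMOS dynamic-power relation, P ≈ C·V²·f (ignoring leakage, which actually gets worse at smaller nodes; the shrink numbers below are made up purely for illustration):

```python
def dynamic_power(c, v, f):
    """Textbook CMOS dynamic power: P = C * V^2 * f (leakage ignored)."""
    return c * v ** 2 * f

# Hypothetical shrink: switched capacitance drops ~35%, core voltage 1.2V -> 1.1V.
p_before = dynamic_power(1.00, 1.2, 1.0)
p_after = dynamic_power(0.65, 1.1, 1.0)   # same clock on the smaller node

# Clock headroom at the old power budget (P scales linearly with f)
print(round(p_before / p_after, 2))  # 1.83, i.e. ~83% more clock at equal power
```

That headroom is exactly the "same tech at faster speeds in bigger configurations" trade: spend it on clocks, or on packing two dies onto one GX2-style card.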
 
The G80 was 90nm. That was rather fast, I'd say.
I doubt the 200s will get 40nm because they're just about to get a 55nm revision, and it's rare for a single generation to have more than one die shrink; if anything, GT200 Go (mobile) parts might get 40nm, if Nvidia ever makes any.
 