
Nvidia Considers GDDR5 in Upcoming 40nm GT214

Fudzilla.....'nuff said. LOL

Seriously though, it wouldn't surprise me if GDDR3 doesn't scale high enough for the next gen of graphics cards. Only time will tell.
 
Um, I actually wrote this article myself. Let me know what you think, as I'm the new guy on their team (I was looking for work, they offered, and I figured they could use a credibility boost, so I took the job). ;)


Anyways, Fudo seems to have removed all of the source links. I had originally included a hyperlink to the LinkedIn page as well as the forum post where I got the information. I'll speak to him about it later.



FYI, xbitlabs has a similar article going:

http://xbitlabs.com/news/memory/dis...Support_GDDR5_with_Code_Named_GT214_Chip.html
 
Interesting... Hopefully FrAUD can shake off some of the horrible history it's had... cheers to you for stepping up and taking that chance! The article was well written, though, given the limited information you had to go on. :)

Anyway, anything is possible. What's very peculiar to me is that the last stuff he worked on was mainstream cards. If the mainstream cards have GDDR5 on them, I can't imagine the high-end stuff NOT having it.

Here is to hoping.
 
I am still wondering why Nvidia has GDDR3 on their cards. GDDR5 on a 256-bit bus has higher bandwidth than GDDR3 on a 448-bit bus.
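The arithmetic behind that comparison is straightforward: peak bandwidth is the effective per-pin data rate times the bus width, divided by 8 bits per byte. A quick sketch, using rough HD 4870-class and GTX 260-class numbers purely as assumed example figures:

```python
def bandwidth_gb_s(effective_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps) x bus width, / 8 bits per byte."""
    return effective_rate_gbps * bus_width_bits / 8

# GDDR5 on a 256-bit bus (assumed ~3.6 Gbps effective, HD 4870-class)
gddr5 = bandwidth_gb_s(3.6, 256)  # 115.2 GB/s
# GDDR3 on a 448-bit bus (assumed ~2.0 Gbps effective, GTX 260-class)
gddr3 = bandwidth_gb_s(2.0, 448)  # 112.0 GB/s
print(gddr5, gddr3)
```

So even with nearly twice the bus width, GDDR3's lower per-pin rate roughly cancels the advantage, which is the point the post is making.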
 
ATI was incredibly lucky to get GDDR5 (while GT200 and RV770 were being designed, the GDDR5 spec wasn't even finalized). GDDR5 and RV770 literally went into production at the same time; if GDDR5 had been delayed, ATI would have been delayed right along with it.

Now is the right time for Nvidia to go to GDDR5.

Unfortunately, GDDR memory is looking like a thing of the past. Rambus just came out with XDR2 graphics memory (rumored to be headed for Intel's Larrabee GPU, and probably all GPUs after 2010). It doesn't look like there will be anything beyond GDDR5, unless this new RAM is crazy expensive to make and GDDR stays the 'cheap' stuff.

Look at this craziness - http://www.rambus.com/us/products/xdr2/xdr2_vs_gddr5.html

Sorry for the off-topicness and Larrabee fanboiness =D. But Nvidia and ATI will get it too ;)
 
GDDR7 is on the horizon already... and GDDR5 at over 20Gb/s will be in production by mid 2009.

RAMBUS = crazy expensive licensing.

While the whole narrow-bus, high-clockspeed approach is neat, they always forget to mention the absurd latencies involved (there's a reason the PS3 doesn't use its XDR for graphics).
 