
G80 specs

GDDR3? Aren't the new X1950 XTXs already using GDDR4? So the new generation of nVidia cards is still using GDDR3 while the current ATI cards are using GDDR4? Uh, ok...
 
Sean_Best said:
GDDR3? Aren't the new X1950 XTXs already using GDDR4? So the new generation of nVidia cards is still using GDDR3 while the current ATI cards are using GDDR4? Uh, ok...
The nVidia cards also use 80,000 more watts per card
 
soulfly1448 said:
768MB, that sounds kinda strange. Why not 512MB or 1GB?

Brute Force said:
Another thing: if you look closely, the specs say the following:

-384-bit memory interface (256-bit+128-bit)
-768MB memory size (512MB+256MB)

The reason for that, I have heard, is the new physics processing they are putting on the card; you can find out more about it here:

So that would mean that perhaps the physics area of the GPGPU will get a dedicated 256MB of GDDR3 and the 128-bit interface, as well as the 575MHz core.
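Just to lay the rumoured split out plainly (going only by the figures quoted above; the idea that the smaller partition is physics-dedicated is speculation, not anything confirmed), a quick sketch:

```python
# Rough sketch of the rumoured memory split, using only the figures quoted above.
main_partition = {"bus_bits": 256, "size_mb": 512}    # main rendering partition
extra_partition = {"bus_bits": 128, "size_mb": 256}   # rumoured physics-dedicated partition

total_bus = main_partition["bus_bits"] + extra_partition["bus_bits"]   # 384-bit interface
total_mem = main_partition["size_mb"] + extra_partition["size_mb"]     # 768MB total

print(f"{total_bus}-bit interface, {total_mem}MB of memory")  # 384-bit interface, 768MB of memory
```

That would at least explain why the totals come out to the odd-looking 384-bit and 768MB rather than a round 512MB or 1GB.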
 
NICE

Hopefully by this coming spring I'll have some extra $ to build another computer to go with the one I have now, and I may have to pick up this 8800 GTX :)
 
Sean_Best said:
GDDR3? Aren't the new X1950 XTXs already using GDDR4? So the new generation of nVidia cards is still using GDDR3 while the current ATI cards are using GDDR4? Uh, ok...

The G80 is compatible with GDDR4; however, as GDDR4 is in very short supply and still carries a price premium relative to its performance, it would be a very bad decision to add a further small performance improvement to an already very fast card for a price bump on an already expensive card, not to mention the supply issues.
 
And not to mention that ATI is currently the only graphics company that even has access to GDDR4. :p (ATI was one of its developers)

After reading up on the G80 a lot... I'm somewhat sure I'll pass on it. Microsoft and ATI have been working together more than nVidia has, and ATI's R600 is expected to be much more efficient in terms of how well it runs the Direct3D API.
Damn that's big... it'll be a pain to overclock without a very good liquid cooling rig.
OMG! That thing would consume 60% of the power my computer is sucking from the wall.

I'm still liking the whole monstrous amount of power that it has.
 
Brute Force said:
The G80 is compatible with GDDR4; however, as GDDR4 is in very short supply and still carries a price premium relative to its performance, it would be a very bad decision to add a further small performance improvement to an already very fast card for a price bump on an already expensive card, not to mention the supply issues.

The X1950 seems to get about a 5% performance increase with GDDR4; I guess you could say that's a small improvement. Of course, this initial GDDR4 is only a small speed bump over maxed-out GDDR3. GDDR4 is also supposed to show power use improvements over GDDR3, although it's pretty obvious that power use wasn't a consideration ;)

I'll go with the conclusion that GDDR4 might be a tad early to implement at the current time, but NV had better have put that capability into the chip for the spring refresh. By that time it will be competing against R600 and be closer to being a practical consideration timing-wise. They seem to have gone with the wide memory bus approach, which is a valid way to add memory bandwidth; it's just complex to implement.
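To put a rough number on the wide-bus point: peak memory bandwidth is roughly the bus width in bytes times the effective data rate, so going from 256-bit to 384-bit at the same memory clock is a straight 50% bandwidth increase. The clock below is just a placeholder, not the actual G80 or R600 memory clock:

```python
def peak_bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times effective transfers per second."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# 2000MHz effective is a placeholder GDDR3-class data rate, used only to show the scaling.
print(peak_bandwidth_gb_s(256, 2000))  # 64.0 GB/s on a 256-bit bus
print(peak_bandwidth_gb_s(384, 2000))  # 96.0 GB/s on a 384-bit bus at the same clock
```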
 
That one part has me confused. 400W for the GTS, OK, no problem; 450W for the GTX, OK, still no problem; but 800W for SLI? That just doesn't add up, unless they think that adding one extra card is going to eat up another 350W, which is insane. The figures don't add up properly at all.
 
One card uses 400W, so if you run SLI you need two cards; both of the cards are the same, so 400W x 2 is 800W. SLI won't make the cards use less power.
 
N1NJA said:
One card uses 400W, so if you run SLI you need two cards; both of the cards are the same, so 400W x 2 is 800W. SLI won't make the cards use less power.

If one card really used 400W, what would the rest of the system be running on then? There surely wouldn't be any power left.

They must have set aside at least 200W+ for just the system alone, not the GPU. Which is to say, you shouldn't need as big a PSU as they claim.
 
Maybe you do, since most of the high-wattage PSUs are going to be multi-rail, and especially with the GTX the GPU rails will be shared with other components.
 
I'm curious where they got the name 8800 from. Why isn't it like previous releases, where they left room for more numbers for revisions of the design, i.e. 8600, 8700, 8900, etc.? Are they just guessing again?

Isn't the hybrid air/water-cooled design less efficient than an all-watercooled block?
 
Maybe I will buy that 1000W PSU on sale in the Cyberdeals section... yeah, right.

EDIT - Both ATI and nVidia use the same naming system: the first number is the generation, the second is the performance tier (2-3 is low end, 5-7 is midrange, and 8-9 is high end), and when they add 50 it usually means a slight update or sometimes just a newer BIOS.
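As a rough illustration of that scheme (the tier cutoffs and the trailing-50 rule are just the informal ones described above, nothing official):

```python
def decode_model(model: int) -> str:
    """Rough decode of an ATI/nVidia model number per the informal scheme above."""
    generation = model // 1000        # first digit: generation (8 for the 8800)
    tier_digit = (model // 100) % 10  # second digit: performance tier
    refresh = (model % 100) == 50     # trailing 50: slight update or newer BIOS
    if tier_digit <= 3:
        tier = "low end"
    elif tier_digit <= 7:
        tier = "midrange"
    else:
        tier = "high end"
    return f"generation {generation}, {tier}" + (", minor refresh" if refresh else "")

print(decode_model(8800))  # generation 8, high end
print(decode_model(7950))  # generation 7, high end, minor refresh
```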
 
MadMan007 said:
Maybe you do, since most of the high-wattage PSUs are going to be multi-rail, and especially with the GTX the GPU rails will be shared with other components.

I guess with the multi-rail thing it just seems weird that, say, the GTX uses ~225W max (75W per connection). It just seems weird that there's such a big boost in the recommendation, but I guess there really isn't a 700W PSU on the market anyway.

Power figures below are just a guess at the max power of the cards.

The PCI-E slot pumps out 75W
each 6-pin connector pumps out another 75W

So you get figures of 150W for the GTS and 225W for the GTX for max consumption, so to speak. I don't know exactly what the 6-pin connectors are rated for, but I'm sure someone here knows; then at least we can get a feel for the power draw of these cards.
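The standard limits back that guess up: a PCI-E x16 slot is specified for up to 75W and each 6-pin PCIe connector for another 75W, so the ceiling just depends on how many connectors the card carries (one on the GTS and two on the GTX is an assumption based on the figures above):

```python
SLOT_W = 75      # PCI-E x16 slot power limit
SIX_PIN_W = 75   # 6-pin PCIe connector power limit

def max_board_power(six_pin_connectors: int) -> int:
    """Upper bound on board power: slot plus auxiliary 6-pin connectors."""
    return SLOT_W + six_pin_connectors * SIX_PIN_W

print(max_board_power(1))  # 150W - the GTS figure guessed above
print(max_board_power(2))  # 225W - the GTX figure guessed above
```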
 
And an 18A rail, which many multi-rail PSUs have, is 216W. I'm sure the GTX doesn't use 100% of the power it's capable of being fed but that certainly isn't much margin for error.

I'll tell you what's going to shoot NV in the foot with regard to this: when formerly SLI-approved power supplies are suddenly not '8xxx SLI' capable and less knowledgeable consumers get burned because of it. Is there a new version of 'SLI approved'?
 
z0n3 said:
I already have a Prescott space heater! I don't need my room to be hotter than Death Valley. PASS!

I was actually gonna say something similar to that; pretty nice in the winter though. Imagine the price on this card though.
 