
G80 specs

This card is insane!! I think that I will wait for now, and see how these things turn out. I am also curious as to what ATI has to offer in terms of next generation video cards.
 
Brute Force said:
Also, my sources say that the G80 will have around 700 million transistors... though I find that very hard to believe, so it should be taken with a grain of salt.

I am assuming a 90nm fab?
 
dreamtfk said:
I am assuming a 90nm fab?

Could be the 80nm process they've been talking about next for GPUs for what, the last few months now? Unless they haven't perfected the 80nm process yet.
 
deathman20 said:
Could be the 80nm process they've been talking about next for GPUs for what, the last few months now? Unless they haven't perfected the 80nm process yet.

Ah, that's right, I remember reading that as well; I think you are correct. Wow, 700 million transistors on 80nm... this thing is gonna need some serious cooling. I hope it's not another vacuum cleaner like the FXs were :rolleyes:
 
deathman20 said:
That one part has me confused. 400W for the GT, OK, no problem; 450W for the GTX, OK, still no problem; but 800W for SLI? That just doesn't add up, unless they think that adding 1 extra card is going to eat up another 350W, which is insane. The figures don't add up properly at all.


Like I said, it is for OEM system builders, so that they won't put in some el-cheapo 600W generic PSU whose rails are all over the place, not to mention poor power efficiency.

-------

dreamtfk -

So far all sources say it is still TSMC's 90nm. The next revision will be on either TSMC's 80nm or 60nm (not a mistype) process. Hence also the current limit on core clock speed and the extremely high price.
 
Well, winter's coming, so I'll be buying one. It gets kinda cold in the winter and space heaters can burn your house down, so I'll just let the G80 keep my house a toasty 95 deg when it's 20 deg out lol.
 
How much power/wattage can the card draw from the 6-pin PCI-E plugs? As much as the power supply can give it... or is it limited to 75W per connector? I've seen a review over at Legit Reviews with a 7950GX2 using (at load) 284W... The 7950GX2 and the 8800 come with the same PSU recommendation... so are we looking at a max draw of 284 watts?
 
NightWolf_8800 said:
How much power/wattage can the card draw from the 6-pin PCI-E plugs? As much as the power supply can give it... or is it limited to 75W per connector? I've seen a review over at Legit Reviews with a 7950GX2 using (at load) 284W... The 7950GX2 and the 8800 come with the same PSU recommendation... so are we looking at a max draw of 284 watts?

Supposedly the 7950GX2 only uses less power than an X1900XTX, but I'd say it definitely uses less than the X1950XTX too.

from H said:
Power is an important component to note about the GeForce 7950 GX2. In the past, putting two GPUs on a single PCB has resulted in the need for a lot of external power. The GeForce 7950 GX2 is a power- and heat-friendly video card, and this results from NVIDIA redesigning the layout so that it operates on two PCBs, with new power circuitry and a reduction in clock speeds for each GPU. NVIDIA claims that the GeForce 7950 GX2 will demand less power than a single ATI Radeon X1900 XTX, and it will be much less offensive to the ears as well.

The minimum power supply requirement specified by NVIDIA is 400 watts with 27A on the 12V rail. The peak power draw reported by NVIDIA is 143 watts for this video card. That is less power than two GeForce 7900 GTX cards in SLI but more than a single 7900 GTX.
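
(Quick sanity check on those figures: 27A on the 12V rail works out to 27 x 12 = 324W available from that rail alone, so a 143W peak card leaves plenty of headroom even with the rest of the system on top.)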

Here's a great table; they all look quite good on power consumption as well, right within the ballpark.

http://www.atomicmpc.com.au/forums.asp?s=2&c=7&t=9354

As noted, I was off on the connector ratings: per the PCI-E spec each 6-pin connector is good for up to 75W, and the PCI-E slot itself supplies up to another 75W. So right there, knowing the slot maxes out at 75W and each 6-pin connector at 75W, you can come to the conclusion that the cards will fall at:

150W or less for the 8800GTS (one 6-pin connector)
225W or less for the 8800GTX (two 6-pin connectors)
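
If you want to sanity-check those ceilings yourself, here's a rough back-of-the-envelope script (a sketch only: the 75W slot and 75W-per-6-pin figures are the PCI-E spec ratings, and the connector count per card is my assumption):

[code]
# Rough spec-limited ceiling on board power draw.
# Assumed connector counts: 8800GTS = one 6-pin, 8800GTX = two.

SLOT_W = 75      # max from the PCI-E x16 slot, per spec
SIX_PIN_W = 75   # max per 6-pin auxiliary connector, per spec

def max_board_power(six_pin_count):
    """Upper bound (watts) the card can draw while in spec."""
    return SLOT_W + six_pin_count * SIX_PIN_W

for name, pins in [("8800GTS", 1), ("8800GTX", 2)]:
    print(f"{name}: up to {max_board_power(pins)} W")

# prints:
# 8800GTS: up to 150 W
# 8800GTX: up to 225 W
[/code]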
 
That is absolute insanity. This card physically will not even fit into the case I have right now, not to mention it would explode my power supply as well. I'm very close to selling my desktop and going with a 15-17" laptop to stop the madness.
 
This card is insane. If it is offered in quad SLI, I think that in order to afford it and the system to go with it I am gonna have to take out a small loan from the bank... I would say 15 large should do it...
 
nd4spdbh2 said:
AHAHAHAHHA... well, when's the supposed release date... I am coming up on 2 of my 3 months of a Step-Up from EVGA... I am ALMOST tempted to step up from my 7900GT KO 512MB to a 7900GTO... but it would sorta be a side step.


The new GPU will be out next month.

The high-end cards will be out, and I will have the speeds and specs soon.

I was told I can't give out details, but it's cool from what I have seen...
 
What a piece of crap :bang head:. Video card makers need to get back to the real world in the heat and power department. This is a new low for video card technology. Unbelievable.
 
An X1950 XTX in a very high-end rig consumes a total of 295 watts (the entire rig's power draw under load, not the individual card's). For comparison, an X1800 XL in the same rig takes up 229 watts and an X1300 XT takes up 193 watts.
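
(Subtracting the rig totals gives a rough card-to-card delta: the X1950 XTX pulls about 295 - 229 = 66 watts more than the X1800 XL in the same system.)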

I am going out on a limb on this one, but I am confident in it. The two 6-pin PCI-E power connectors are there not so much to supply the card with 150 watts as to ensure a stable, clean supply of power in case the PCI-E slot's power fluctuates, which it does. So what I am saying is: the card definitely needs at least one PCI-E 6-pin's worth of power, but it does not require the full amount of the other 6-pin or of the PCI-E slot's power; that capacity is there in reserve so there are no problems.
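
To put some made-up numbers on that reserve argument (a minimal sketch: the 160W draw is purely hypothetical, and the 75W figures are the PCI-E spec ratings):

[code]
# Sketch of the reserve-power idea: the card needs more than one
# 6-pin alone can deliver, but less than everything combined, so the
# extra rated capacity is headroom against slot fluctuations.
# card_draw_w is a hypothetical figure, not a measured G80 number.

SLOT_W = 75
SIX_PIN_W = 75
card_draw_w = 160  # hypothetical sustained draw

rated_total = SLOT_W + 2 * SIX_PIN_W   # 225 W of rated delivery
reserve = rated_total - card_draw_w    # margin if the slot's feed sags

print(f"needs more than one 6-pin alone? {card_draw_w > SIX_PIN_W}")
print(f"reserve at {card_draw_w} W draw: {reserve} W")
[/code]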

Also, if you think this is bad, just wait for the ATI R600, which rumor has it is an even bigger hog... But there is hope: both Nvidia and ATI have claimed that either their refresh or their next series will start a trend of reducing power consumption.
 