
G80 specs


Brute Force

Member
Joined
Aug 12, 2006
Location
Hadera, Israel
The new NVIDIA graphics architecture will be fully compatible with Microsoft’s upcoming DirectX 10 API with support for shader model 4.0, and represents the company's 8th generation GPU in the GeForce family.

NVIDIA has code-named G80 based products as the GeForce 8800 series. While the 7900 and 7800 series launched with GT and GTX suffixes, G80 will do away with the GT suffix. Instead, NVIDIA has revived the GTS suffix for its second fastest graphics product—a suffix that hasn’t been used since the GeForce 2 days.

NVIDIA’s GeForce 8800GTX will be the flagship product. The core will be factory clocked at 575 MHz. All GeForce 8800GTX cards will be equipped with 768MB of GDDR3 memory clocked at 900 MHz. The GeForce 8800GTX will also have a 384-bit memory interface and deliver 86GB/second of memory bandwidth. GeForce 8800GTX graphics cards are equipped with 128 unified shaders [stream processors] clocked at 1350 MHz. The theoretical texture fill-rate is around 38.4 billion pixels per second.
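As a sanity check on the quoted bandwidth figure, memory bandwidth follows directly from bus width and effective memory clock. A minimal sketch, assuming GDDR3's double data rate (effective rate = 2x the clock):

```python
# Sanity check on the quoted 8800GTX memory bandwidth figure.
# GDDR3 is double data rate, so the effective transfer rate is 2x the clock.
bus_width_bits = 384
mem_clock_mhz = 900
effective_mts = mem_clock_mhz * 2                      # 1800 MT/s
bytes_per_transfer = bus_width_bits / 8                # 48 bytes across the bus
bandwidth_gbps = bytes_per_transfer * effective_mts / 1000
print(f"{bandwidth_gbps:.1f} GB/s")                    # 86.4 GB/s
```

That works out to roughly 86.4 GB/s, which lines up with the ~86GB/second figure above.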

Slotted right below the GeForce 8800GTX is the slightly cut-down GeForce 8800GTS. These graphics cards will have a G80 GPU clocked at a slower 500 MHz. The memory configuration for GeForce 8800GTS cards differs slightly from the GeForce 8800GTX. GeForce 8800GTS cards will be equipped with 640MB of GDDR3 graphics memory clocked at 900 MHz. The memory interface is reduced to 320-bit and overall memory bandwidth is 64GB/second. GeForce 8800GTS graphics cards also have fewer unified shaders: 96 [stream processors] clocked at 1200 MHz.

Additionally, GeForce 8800GTX and 8800GTS products are HDCP compliant with support for dual dual-link DVI, VIVO and HDTV outputs. All cards will have dual-slot coolers too. Expect GeForce 8800GTX and 8800GTS products to launch in the second week of November 2006. This will be a hard launch, as most manufacturers should have boards ready now.

The high dynamic-range (HDR) engine found in GeForce 7950 and Radeon series graphics cards is technically 64-bit rendering. The new HDR approach comes from OpenEXR, a file format developed by Industrial Light and Magic (the LucasFilm guys). In a nutshell, we will have 128-bit floating point HDR as soon as applications adopt code to use it. OpenEXR's features include:


-Higher dynamic range and color precision than existing 8- and 10-bit image file formats.

-Support for 16-bit floating-point, 32-bit floating-point, and 32-bit integer pixels. The 16-bit floating-point format, called "half", is compatible with the half data type in NVIDIA's Cg graphics language and is supported natively on their new GeForce FX and Quadro FX 3D graphics solutions.

-Multiple lossless image compression algorithms. Some of the included codecs can achieve 2:1 lossless compression ratios on images with film grain.

-Extensibility. New compression codecs and image types can easily be added by extending the C++ classes included in the OpenEXR software distribution.

-New image attributes (strings, vectors, integers, etc.) can be added to OpenEXR image headers without affecting backward compatibility with existing OpenEXR applications.
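The "half" type mentioned above is a 16-bit IEEE-style float (1 sign bit, 5 exponent bits, 10 mantissa bits). As a quick illustration of its reduced precision, Python's struct module can encode one directly via the 'e' format code; the 0.1 round-trip below is just an illustrative value, not anything from the OpenEXR docs:

```python
import struct

# OpenEXR's "half" is a 16-bit float: 1 sign bit, 5 exponent bits, 10 mantissa bits.
# Python's struct module supports the format via the 'e' code, so we can
# round-trip a value and observe the precision loss versus a regular float.
value = 0.1
half_bytes = struct.pack('<e', value)          # encode as little-endian half
roundtrip = struct.unpack('<e', half_bytes)[0]
print(len(half_bytes))                         # 2 (bytes)
print(roundtrip)                               # 0.0999755859375 -- nearest half to 0.1
```

The round-trip shows why half is a storage/rendering format rather than a computation format: you keep the wide dynamic range but give up mantissa precision.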



NVIDIA already has 16X AA available for SLI applications. The GeForce 8800 will be the first card to feature 16X AA on a single GPU. Previous generations of GeForce cards have only been able to support 8X antialiasing in single-card configurations.

The first card, the GeForce 8800GTX, is the full blown G80 experience, measuring a little less than 11 inches in length. The GeForce 8800GTS is a cut down version of the first, and only 9 inches in length.

The marketing material included with the card says NVIDIA requires at least a 450W power supply for a single GeForce 8800GTX, and 400W for the 8800GTS. Top-tier vendors in Taiwan have already confirmed that GeForce 8800 cards in SLI mode will likely carry a power supply "recommendation" of 800W. NVIDIA's GeForce 7950GX2, currently the company's top-performing video card, carries a recommendation of 400W to run the card in single-card mode. The GTX has two 6-pin power connectors; the GTS has only one.

:beer:
 
I have the solution to high oil prices!! Invest in the alternative energy called "G80", and all your house-heating woes will be solved!! For large mansions and businesses, "G80 SLI"!

small print: "G80" may cause house fires if turned on
 
natewildes said:
I have the solution to high oil prices!! Invest in the alternative energy called "G80", and all your house-heating woes will be solved!! For large mansions and businesses, "G80 SLI"!

small print: "G80" may cause house fires if turned on



AHAHAHAHHA... well, when's the supposed release date? I am coming up on 2 of my 3 months of a step-up from EVGA... I am ALMOST tempted to step up from my 7900GT KO 512MB to a 7900GTO... but it would sorta be a side step.
 
I already have a Prescott space heater! I don't need my room to be hotter than Death Valley. PASS!
 
nd4spdbh2 said:
AHAHAHAHHA... well, when's the supposed release date? I am coming up on 2 of my 3 months of a step-up from EVGA... I am ALMOST tempted to step up from my 7900GT KO 512MB to a 7900GTO... but it would sorta be a side step.

Second week of November.
 
Celada said:
Meh, I'll believe it when I see it

Fan version
g80_card_bg.jpg



Watercool/fan hybrid version:
g80_card1_bg.jpg


g80_angle_bg.jpg


g80_angle1_bg.jpg




Satisfied? :confused:
 
Also, my sources say that the G80 will have around 700 million transistors... though I find that very hard to believe, so it should be taken with a grain of salt.
 
Everyone always forgets the pic of the back of it that shows the new RAM config. Basically, any cooler you have now that works on an NVIDIA card without a separate RAM cooler will not work on this card. Waterblocks and Zalman coolers with independent RAM sinks should work, but it might be hard to fit big RAM sinks on it with certain water-cooling setups.

 
Some other notes:

-Unified Shader Architecture;
-Support FP16 HDR+MSAA;
-Support GDDR4 memories;
-Close to 700M transistors (G71 - 278M / G70 - 302M);
-New AA mode: VCAA;
-Core clock scalable up to 1.5GHz;
-Shader performance: 2x Pixel/12x Vertex over G71;
-8 TCPs & 128 stream processors;
-Much more efficient than traditional architecture;
-384-bit memory interface (256-bit+128-bit);
-768MB memory size (512MB+256MB)
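A rough shader-throughput number can be derived from the figures in that list, assuming each stream processor retires one multiply-add (counted as 2 floating-point operations) per shader clock; that per-clock assumption is mine, not something stated above:

```python
# Rough shader throughput for the listed 8800GTX configuration.
# Assumption (not from the spec list): each stream processor issues one
# multiply-add per shader clock, counted as 2 floating-point operations.
stream_processors = 128
shader_clock_mhz = 1350
flops_per_sp_per_clock = 2
gflops = stream_processors * shader_clock_mhz * flops_per_sp_per_clock / 1000
print(f"{gflops:.1f} GFLOPS")    # 345.6 GFLOPS
```

Around 345 GFLOPS of programmable shading, if that assumption holds, which would put it well clear of anything in the GeForce 7 line.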


GeForce 8800GTX: 7 TCPs chip, 384-bit memory interface, hybrid water/fan cooler, water cooling for overclocking. $649;

GeForce 8800GT: 6 TCPs chip, 320-bit memory interface, fan cooler. US$449-499;

Also, The Inquirer was wrong again: they reported that the G80 has 32 pixel and 16 vertex and geometry shader processors.
 
hitbyaprkedcar7 said:
omg that is ridiculously LONG

I want a picture with a ruler next to it.

It is a little less than 11 inches.

One more thing: the G80 might have around 700 million transistors, but for background, the G71 has 278M and the G70 has 302M transistors respectively.
 
Another thing, if you look closely the specs say the following:

-384-bit memory interface (256-bit+128-bit)
-768MB memory size (512MB+256MB)

The reason for that, I have heard, is the new physics hardware they are putting on the card; you can find out more about it here:

With the release of the G80, NVIDIA will also release a new engine dubbed Quantum physics engine. Quantum Effects Technology is similar (at least in spirit) to NVIDIA's PureVideo Technology -- a dedicated layer on the GPU for physics calculations. A few documents alluding to this new engine appeared on public FTP mirrors late last week.

Quantum utilizes some of the shaders from NVIDIA's G80 processor specifically for physics calculations. Physics calculations on GPUs are nothing new; ATI touts similar technology for its Stream Computing initiative and for the Triple Play physics.

NVIDIA and Havok partnered up this year claiming that SLI systems would get massive performance gains by utilizing additional GeForce GPUs as physics processors. Quantum may be the fruits of that partnership, though NVIDIA documentation clearly states that Quantum will work just fine without SLI.

NVIDIA's documentation claims Quantum will specifically compete with AGEIA's PhysX, yet does not mention who is providing the middleware. Given that there are only two acts in town right now, it would be safe to say Havok has a hand in the Quantum engine.


So that would mean that perhaps the physics area of the GPU will get a dedicated 256MB of GDDR3 on a 128-bit interface, as well as the 575MHz core.
 
Brute Force said:
Some other notes:

-Unified Shader Architecture;
-Support FP16 HDR+MSAA;
-Support GDDR4 memories;
-Close to 700M transistors (G71 - 278M / G70 - 302M);
-New AA mode: VCAA;
-Core clock scalable up to 1.5GHz;
-Shader performance: 2x Pixel/12x Vertex over G71;
-8 TCPs & 128 stream processors;
-Much more efficient than traditional architecture;
-384-bit memory interface (256-bit+128-bit);
-768MB memory size (512MB+256MB)


GeForce 8800GTX: 7 TCPs chip, 384-bit memory interface, hybrid water/fan cooler, water cooling for overclocking. $649;

GeForce 8800GT: 6 TCPs chip, 320-bit memory interface, fan cooler. US$449-499;

Also, The Inquirer was wrong again: they reported that the G80 has 32 pixel and 16 vertex and geometry shader processors.

They were right with the 768MB and other stuff... they may not get everything 100% right, but they did get some stuff right. I am guessing they go to as many sources as possible, then try to figure out from that what it might be.
 
Bloody hell. :eek: :drool: :drool: Well, I think we've found the first card that'll run Crysis at high settings on its own. Interesting that nVidia went with the unified shader architecture; I read an interview which stated that nVidia were still debating whether its advantages were worth it. Guess they are. The prices are extremely high, though; at max I have about $400 to spend on a card... :bang head :santa:
 