
GeForce 9800 GX2 Pics and Specs!

Why would anyone compare an 8800GTX Ultra to a 9800GX2?

Non-SLI vs. SLI... um, wrong!

Maybe 9800GTX vs. 8800GTX, and then see how much faster it is in terms of a percentage.

The 9800GX2 is in a league of its own, since there is no 8800GX2 to compare it to.

EDIT: I also agree the 9800GX2 is two 8800GTS 512s (G92s) sandwiched together.
Then again, maybe that's what it will be, the 8800GX2, and what leaked out was wrong...

The true new leader would be the 9800GTX G100.

They are comparing the 8800 Ultra and the 9800GX2 because nVidia is. The 9800GTX is the 8800
Ultra's replacement.

Also, you can't always go by the major model number (8800, 9800, etc.) as far as GPU generation.
8900 was the logical choice for the die-shrunk G92 cards, but NV called them 8800 instead, just like
the 80nm GPU cards.

8900 would have been the logical choice for the G92 GX2 as well, since it is essentially just two
G92 GTS cards glued together.

Unfortunately, marketing departments have a tendency not to do model numbering logically.
They are more concerned with the bait to put on the box that will catch the most fish as they
swim down the aisles of their local computer lake than with being logical. A 9800GX2 sounds/looks
like a 1000-to-900-point bigger meal to a hungry Userfish than an 8800GX2 or 8900GX2 would
at the same price!

Viper
 

It doesn't matter to me either way what the numbering is. We still know it will be older technology (compared to the D9E) in the 9-series part...

Nonetheless, having SLI in a single-card solution would be nice to see again. I'll be looking for benchmarks when it debuts...
 

The D9E is not a new generation. It is still using the current G92 65nm GPUs. They might be
refreshed and might carry a G100 designation, but they are still G92 65nm parts at heart.

Viper
 
I just wish that Nvidia would work on the SLI drivers. Crossfire has been maturing really well, but SLI seems to pretty much be at a standstill. Nvidia's solution, instead of making good drivers for SLI, is just tacking on another card. I wonder how the HD3870X2 will compare to this?
 

I have always found SLI to be more mature and compatible than CF across the board. It has
been a while since I did anything with CF, but not all that long ago it was crap when it came to games.

Viper
 
Yeah, it's changed lately. Crossfire has matured really well; it's pretty close to 100% scaling per card in most tests that I've seen. I'm afraid this 9800 GX2 is probably going to go the way of the 7950GX2s. Nvidia's counterattack against the maturing Crossfire support is pretty much "toss in another card and screw drivers."

Also, I wonder how this will fare against the HD3870X2 that's due out soon? But rather than two cards being tied together, it's two GPU cores on one card. It looks neat; can't wait to see benches on it.
 

Ditto. It makes more sense to have the two GPUs on the same PCB, so the cores can share the vRAM. With two separate PCBs you end up wasting a lot of high-speed GDDR3 (or GDDR4).

Maybe ATI can save some money on vRAM and release a very powerful card at a good price. I surely hope so, because ATI taking the crown back will only mean good things for graphics card technology, and DAAMIT needs all the money they can get right now to stay afloat with Intel's domination of the CPU market.
 
Pretty much just as many as SLI; a lot more if you just cheat and change the exe's name to something like oblivion.
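
The exe-rename trick mentioned above boils down to copying or renaming the game's executable so the driver matches it against a title with a known-good multi-GPU profile. A minimal sketch, assuming profile matching by filename; all filenames here are made-up stand-ins, not real install paths:

```shell
# Sketch of the exe-rename profile trick: the driver picks a multi-GPU
# profile by matching the executable's filename, so renaming a game with
# a broken profile to one that scales well forces the working profile.
# All filenames are illustrative stand-ins.
touch SomeNewGame.exe              # stand-in for the real game executable
cp SomeNewGame.exe Oblivion.exe    # Oblivion's profile is the known-good one here
ls Oblivion.exe                    # the driver would now see "Oblivion.exe"
```

The downside, of course, is that the forced profile's rendering mode (AFR, split-frame, etc.) may not actually suit the renamed game, so this is a gamble rather than a fix.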

Yeah, there's never been any sort of accountability as far as that is concerned from either company. ATi as of late is in a pretty rough position: if they intend to keep their current plan of making the R700 modular, their multi-GPU drivers need to be *flawless*. So far I haven't seen that on their behalf.
 