
GeForce 9800 GX2 Pics and Specs!

IMO not even two of these new 9800GX2s in "quad SLI" would be able to run Crysis at greater than 30fps with all Very High settings in DX10 at a measly res of 1680x1050, that is, unless drivers suddenly become REALLY good and provide almost linear scaling. But as Viper pointed out, the 4th card may only provide a 5% increase over tri-SLI... and tri-SLI was pretty much a joke.
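For what it's worth, the diminishing-returns claim above can be sketched as simple arithmetic. All the numbers here are assumptions for illustration (the hypothetical 12fps single-GPU baseline and the per-card scaling factors), not benchmarks:

```python
# Back-of-the-envelope SLI scaling check. Start from a hypothetical
# single-GPU frame rate and apply a diminishing contribution for each
# extra GPU, versus perfect linear scaling.
def projected_fps(single_gpu_fps, scaling_per_extra_gpu):
    """Multiply in each additional GPU's (diminishing) contribution."""
    fps = single_gpu_fps
    for factor in scaling_per_extra_gpu:
        fps *= 1 + factor
    return fps

# Assumed: ~12 fps on one GPU at Very High, 1680x1050.
# Assumed: 2nd GPU adds 70%, 3rd adds 25%, 4th adds only 5%
# (the tri-SLI/quad-SLI drop-off described above).
quad = projected_fps(12, [0.70, 0.25, 0.05])
linear = 12 * 4
print(f"quad SLI (diminishing): {quad:.1f} fps")  # ~26.8 fps, still under 30
print(f"perfect linear scaling: {linear} fps")
```

Under those assumed factors, four GPUs still land below the 30fps mark, which is the point being made: without near-linear driver scaling, the extra cards don't get you there.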
 
Hmm, the other post concerning the 9-series included something about a 9800GTS in Feb/March. Hopefully the performance/price will be worth it to step up :)
 
I remember Viper posting in another thread about this card, saying it's basically two 8800GTs running at lower clocks.
 
With only a 30% increase over the 8800Us though? Who cares if there is a die shrink.


nVidia is just trying to figure out what they're gonna fill the gap with before they actually need to release a true 9800 card.

I'm scratching my head at where this came from, rain... esp. since I quoted someone saying that two 8800GTs would be faster... I asked how it could be faster if the 9800GX2 is in fact nothing more than two 8800GTs in SLI. I don't even consider OCing my GPU anymore, so OCing doesn't matter...

Nooo... Overclocking doesn't matter... TO YOU. I don't think they really had you in mind. They don't make $800 video cards for people that have no intention of overclocking. Sure there's the occasional kid with rich parents or some guy making closer to seven than six figures who doesn't know what to do with his money.

But they make $800 video cards for people that want to see 50,000 in 3D06. The idea being that once five people do it the entire kingdom of overclockers... specifically ViperJohn, everybody in the top 5 on this site, and every other overclocker's site in the world will buy it.

If I remember right the 8800U had less than a 30% increase over the GTX... without overclocking. Yet if you jack up the voltage and put it under water...

And I wasn't even talking about going THAT far.

As for "who cares about a die shrink"... Maybe you didn't read what I wrote.

Two 8800GTs have a larger die and, especially with stock cooling, generate much more heat. IF (and I bolded, italicized, and underlined that word just in case you miss it) the die shrink leads to significantly lower temperatures, it could lead to a dramatic increase in core speed. A 9800GX2 might be clocked roughly the same as an 8800GTX SLI... but if it can overclock to 1000MHz... and run at that speed around 45C...

Then all of a sudden it matters now doesn't it?
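To put rough numbers on that speculation: the stock 8800GT core clock is the well-known 600MHz reference value, while the 1000MHz figure is the hoped-for overclock from the post above, not a measurement:

```python
# Rough illustration of the claimed overclocking headroom.
# 600 MHz is the stock 8800GT core clock; 1000 MHz is the
# speculated overclock on a shrunk die (an assumption, not a result).
stock_8800gt_core = 600    # MHz
speculated_core = 1000     # MHz
gain = (speculated_core - stock_8800gt_core) / stock_8800gt_core
print(f"core clock gain over stock: {gain:.0%}")  # prints "67%"
```

A two-thirds clock bump is the kind of jump that would make the die shrink matter, which is the argument being made.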
 

+1
 
If the cards can be OC'd to 1000MHz (without burning down my house and inflicting crazy scars on myself), it would make my day, and many days following after that. We'll see in a month or two.
 

This is the part where I say you didn't get what I was talking about... I really don't care about G92, yet you seem to indicate I'm some kind of G92 fanboy... ok... you take my post in a completely different direction than what it was about... that's ok... leave the little jabs about someone not being able to read in your head. Thanks...
 

Sorry to hurt your feelings. Want a Kleenex? :)

But I wasn't talking about you. In fact... I was talking about... Ah! The GeForce 9800GX2. Somehow you must've translated "die shrink" as "fanboy". Easy enough mistake...

I guess the part where you said you didn't care about overclocking was you... But I simply stated that whether or not you care about overclocking (what the hell... where am I again?) has nothing to do with anything.

Then I went on to talk about... Hmm... Oh yeah! That graphics card.

Speaking of which... Somebody brought up an interesting point: Crysis.

Well... I've got an insane theory: Crysis is full of ****. Even if you had a video card that was exactly twice as powerful as an 8800 Ultra, it would STILL run like crap (in terms of price/performance). There's also a good chance this was done on purpose... because it's the only thing that makes sense. ANYBODY can design a graphics engine more powerful than what today's consumer and prosumer video cards can handle. Nobody does it because that's just stupid (as reflected in the Crysis sales figures). But what if they had an ulterior motive? What if they were getting a kickback?

Sounds crazy... but it does cover all the bases. Then it wouldn't really matter how many copies of the game were sold. The real question would be how many people could they scare into upgrading?

What if the 9800GX2 (since this whole thread is mostly speculation anyway) could run Crysis with everything on Very High at exactly 60fps steady and that was the only real advantage over an 8800 Ultra?

That alone would sell it... and EVGA or somebody might be kind enough to throw in a copy with the card.

(BTW... I don't really "believe" this because I haven't seen any corporate memos to prove it. But it fits.)
 
This is perhaps the only time I've ever found myself agreeing with you, rainless. ;) I would not be surprised at all if we learn twenty years from now that your theory is exactly what is happening. And actually, it doesn't sound crazy at all. nVidia and AMD are CORPORATIONS. I don't mean corporations in the "bad" sense either. It is the corporation's job to make money. It seems most of the people complaining about companies like Microsoft and Intel fail to realize that. It isn't nVidia's job to just barely turn in a profit for the owner; it is nVidia's job to make as much money as it (reasonably) can. Why should nVidia spend more money on R&D when it will only prevent them from selling more cards in the long run? Look at this from a reasonable angle. Which would you rather have: a profit of about $1 million every year with technology nobody even needs, or upwards of $5 billion a year with technology that advances right alongside the software?
 

/nod

Take a look at the Nasdaq and Intel's, nVidia's, and AMD's stocks. Even Intel lost money on its shares over the past few months, even with all of their success in new chips.

Don't expect amazing products when the tech market is this bad. These companies are trying to conserve money, not spend more.
 
A die shrink is a die shrink, there is no mistaking that... you in fact called me a fanboy of G92... your own words and not misunderstandings about it, rain...
Stop thinking of the base numbers (which is what GT heads seem to be obsessed with) and think of the overclock:

Smaller Die
Cooler Temps
Larger Overclock

Not to mention the god-awful "worse than an 8800GTS 320" cooling those GTs have.

Which is where I got it from... if you didn't mean me, then why even say something like that in the first place? Only to underline that I am in fact going on about how G92 is god or something? You just came off the wrong way, that's all...

Everything about Crysis is what I already figured...
------------------------------------------------------------------------------

NV would be saving money anyway by using G92 on a smaller process for the 9800GX2. After all, why spend the money making a new core work when you don't have to?
 

And they talk about MY distorted sense of self-importance. :)

Dude... I wasn't talking about you AT ALL. I was talking about all those basket cases that were posting... hmmm... (I guess it's time to get political) "elsewhere" about how the only difference between the 8800GT and the GTS was the cooler, while ignoring that what really counted was the max overclock (actually the gain in synthetic or real-world performance) you got between the two cards.

If I recall correctly that little debate had absolutely nothing to do with you, and GT heads (unless you want to imply that you're more than one person) still doesn't sound much like "Evilizer" to me. And according to your sig: You don't even HAVE an 8800GT.

I didn't "come off" wrong. You GOT IT wrong.

But can we just move on?
 
This sucks for me. I bought an 8600GTS in August for two reasons: one being that the 9xxx series was right around the corner and I decided to save some money (of course I kinda ended up spending that money on cooling for the second reason, which was that some people managed to get a 1GHz core on it).
 

Reason number 1 was kind of poor: There's ALWAYS going to be something "just around the corner". But if you wait for it there'll surely be something ELSE just around the corner. So you'll end up waiting for that. Reason number 2 was the same reason *I* almost bought an 8600GTS ;)

In the end there was an awesome deal on my 8800GTS 640 so I bought that. You should always buy the very best thing you can afford. Screw "saving money" because it only winds up costing you money in the long run.
 

what kind of cooling was needed to overclock the core to 1ghz?
 

I saw someone do it with just a large heatsink and a 120mm fan. Doesn't matter much: I'm sure my 8800GTS could handily defeat the 8600GTS... even at 1000MHz. But it is REALLY cool to see in a screenshot.

I only overclock when I'm benchmarking though, so it wouldn't be worth the trouble or the case space.
 