
GeForce 9900 GTX & GTS Slated For July Launch

These two points taken together imply to me that a 9900 GTX is the same bang for the buck as a current 9800 GX2.

The 9800 GX2 is 128 GB/s for $600.

Am I missing something? Shouldn't these be better or cheaper than that?

The GX2 is really only 64 GB/s. Bandwidth doesn't double when you go SLI, which is how this card works.
 
That's lame. I guess EVGA is basically lying on their product material then.

Memory
1024 MB, 512-bit DDR3
2000 MHz (effective)
128 GB/s memory bandwidth

Shouldn't they have to take that **** down if it's really only half that? How do they get away with saying **** like that? I guess it's kinda like the car mfgs putting an MPG figure on the sticker that you will never see. Why did I think more of VGA manufacturers than auto makers? Doh! :beer:
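The spec-sheet figure and the "really only 64 GB/s" claim are actually consistent once you see how the number is computed: peak bandwidth is just bus width times effective data rate. A quick sketch (the helper function name is my own):

```python
def bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (effective data rate)."""
    return bus_bits / 8 * effective_mhz / 1000

# EVGA's 9800 GX2 listing sums both GPUs: two 256-bit buses @ 2000 MHz effective.
per_gpu = bandwidth_gb_s(256, 2000)   # what one GPU actually sees
combined = bandwidth_gb_s(512, 2000)  # the "512-bit / 128 GB/s" marketing figure
print(per_gpu, combined)  # 64.0 128.0
```

So the spec sheet isn't fabricated, it just adds the two GPUs' buses together, even though neither GPU can touch the other's memory.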
 
It's kinda true. It gets 64 GB/s per GPU, for a combined 128 GB/s. It's also kinda a 1 GB card: 512 MB per GPU. I don't care how they market the thing; it's the fastest card money can buy, and even Crysis doesn't slow it down at the resolution I play at :p
 
Bandwidth is more important than latency for video cards, IIRC, not that poor latency is a good thing. GDDR5 speeds are so ridiculous it will be neat to see cards using it: it should lower total cost, since the PCBs and the GPU memory controller get much simpler, and increase performance.

These are going to be the $500-600 flagships, so my interest is purely academic; NV's naming schemes have been screwed up for every card in the last year. Some competition would finally be good.

This is the idea gfx card manufacturers need to get into their heads, and it looks like NVIDIA has, at least for the time being: why not have the best of both worlds, i.e. good tight timings with high bandwidth? The way to achieve that is a wider bus. GDDR5 at 3500 on a 256-bit bus is the same bandwidth as GDDR3 at 1750 on a 512-bit bus, and all else being equal the GDDR3 variant will have slightly better performance, because data doesn't have to pause as long to essentially get onto or off the RAM.
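The equivalence claimed above checks out numerically; the two configurations really do land on the same peak bandwidth (this just applies the standard bus-width x data-rate formula):

```python
def bandwidth_gb_s(bus_bits, mt_per_s):
    """Peak bandwidth in GB/s = (bus width in bytes) x (effective transfers/s)."""
    return bus_bits / 8 * mt_per_s / 1000

gddr5_narrow = bandwidth_gb_s(256, 3500)  # GDDR5 3500 on a 256-bit bus
gddr3_wide = bandwidth_gb_s(512, 1750)    # GDDR3 1750 on a 512-bit bus
print(gddr5_narrow, gddr3_wide)  # 112.0 112.0
```

Identical throughput on paper; the latency argument is about how real transfers behave, which the peak figure doesn't capture.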
 
Man, I can't wait to upgrade from my 8800 GTX to a 9900 GTX :)

This 512-bit memory bus sounds so awesome.

I can't wait to see Crysis benchmarks :)
 
Yeah, maybe one will be able to play Crysis at a normal 22-inch LCD res (1680x1050) @ ultra high with playable fps... maybe some AA, but I think I'm getting ahead of myself :beer:
 
My computer can play with everything on high, no AA, at 1680x1050 and get an average of 30 fps.

On a few maps it dips, but it's still playable. I find Crysis one of those games that is playable at low fps.

I know a few games that are living hell if they dip to 45 or below.
 
Glad to see NVIDIA going for the higher interface bandwidth since they are sticking with GDDR3. But even when ATI had a 512-bit interface (the HD 2900 XT), what true benefits did it hand out? Comparing that against a 256-bit interface, there seems to be little variation, at least with ATI's cards; hopefully NVIDIA isn't going to charge us for something that shows little improvement.
 
A 512-bit bus right now just adds a lot more cost to a card, so it isn't needed. ATI's wider bus, if I recall, was to help with AA and such, taking a far smaller hit?
 
NVIDIA is smart to make the expensive move to a bigger memory bus now, I suppose. Later they can start putting in faster memory if needed. But I'm not sure when we're going to need such insanely high memory bandwidth; ATI is most likely sticking to the 256-bit bus to save money and waiting to go wider when it's really needed.

I'm not sure how high the bandwidth ceiling of a 256-bit bus is, though.
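The ceiling on a 256-bit bus is set entirely by how fast the memory runs. A rough sketch below; the data rates are illustrative GDDR3/GDDR5-era figures I've picked for the example, not any specific card's spec:

```python
# Peak bandwidth of a 256-bit bus at a few effective memory data rates.
BUS_BITS = 256
for rate_mt_s in (1600, 2000, 2400, 3500, 4000):
    gb_s = BUS_BITS / 8 * rate_mt_s / 1000
    print(f"{rate_mt_s} MT/s -> {gb_s:.1f} GB/s")
```

So a 256-bit bus paired with fast enough GDDR5 can match or beat a 512-bit bus on slower GDDR3, which is presumably ATI's bet.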
 
I don't see why people say 256-bit is enough now. The benchmarks clearly show high-res AA will choke these "new" refresh cards. 512-bit is the way to go.
 
AA is moot if the best video cards still can't run Crysis at full detail / no AA / 1600x1200 (and higher resolutions) and get over 60 fps.

Without having to go to something ridiculous like 3- or 4-way SLI or quadfire, that is.

AA is just awful in principle: it kills GPUs and doesn't really make the image look THAT much better. AA is a tool that will drive very costly wide memory buses, aka NVIDIA.

Yeah, someone will probably say "omg yes AA does make it look that much better", but sorry, not at the cost of framerate, which is the main issue.

However, a bigger memory bus will be good for monitor scaling. Pretty soon 22" widescreen LCDs are going to be the standard and most people will be at 1920x1200, with how fast LCDs are coming down in price.
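For a sense of how much extra work that resolution jump means, the raw pixel counts are easy to compare (each rendered pixel costs shading and memory traffic, so this is a rough lower bound on the added load):

```python
# Pixel counts for the two resolutions mentioned above.
res_1680 = 1680 * 1050  # common 22" widescreen
res_1920 = 1920 * 1200  # the predicted new standard
increase = res_1920 / res_1680 - 1
print(res_1680, res_1920, f"{increase:.0%}")  # 1764000 2304000 31%
```

Roughly 31% more pixels per frame before any AA is applied, which is why bandwidth scales in importance with monitor size.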
 
The only time I use AA is when I can afford to lose the FPS, i.e. playing BF2 @ 1280x1024 @ 100 Hz at all max settings. At those settings with my rig, FPS would be way higher than 100 (I run vsync), so I turn AA on to make everything look pretty.

AA is a great invention in my book; it gets rid of the jaggies I personally dislike with a passion.
 
AA? How about plain high video memory usage: try to play Company of Heroes on a 9800 GTX with everything set as high as it goes except AA; the 8800 GTX will have better fps. AA is awesome though; try playing Call of Duty with 3x supersampling and it looks very real, other than the super low FPS.
 
Seriously, who cares about Crysis? (Obviously some people do, and I think far too many at that.) Could it be poorly coded or badly optimized? That would explain why it is the only game to run so badly on almost any setup.

Just because something won't run Crysis doesn't make it a bad product.
 
I run at least 2xAA in all my games, and my GTS does it just awesome in anything other than Crysis at my native res of 1680x1050... but who cares about Crysis? I played it; been there, done that. Now if a game came out that was worth playing more than once, had some kind of replayability, and, not to mention, a good multiplayer, then that would be a different story. Crysis is one of the smoothest-running games at low fps; it runs well even at 45 fps, but it sucks, so who cares.
 
Most likely these NV cards will scream and sell for a higher price. NV won't make a midrange version until a die shrink lets them afford the bigger die, and later will come GDDR5 RAM and higher clocks...
 