
8800GT 3DMark score

Yep, nice. Hopefully this card will be a value one.

Now, isn't Nvidia increasing the stream processors on the GTS because of the performance level of the GT?
 
Hmm, well, if this is the case then it looks like I'm going to be getting one of these rather than an 8800GTS, as long as the price difference is greater than 50-75 dollars. If not, then I might as well just get the new version of the 8800GTS with the performance increase it'll be getting.
 
That score is with an E6600 at 3.6 though; if a review site ran it with a stock X6800 or an E6850, basically a CPU around 3.0GHz, it would get around 10k. Most of the benches we see from reputable sites go along those lines, don't they?

Still, 12k with an E6600 at 3.6 is pretty sweet, especially if the card is at stock clocks... not many people can break 14k on air with a GTX and a dual-core. I think I was just shy of that with mine overclocked as far as everything would go. If this card grabs 12k at stock, then whoever said see ya later to the GTS and GTX is probably right.

We'll have to see how games are though. I can tell you one thing: benches don't mean *****. I have had two GTS 320s and two GTX 768 cards, and now I am playing with a 2900XT 512.
With the rig in my sig at 3.3 and the card at stock I got 19953 in '05, and in Vista 64 to boot. That's almost a joke; if I OC'd to 3.8 and left the GTX 768 at stock I hit around 19k, though we all know which card is better... This card isn't bad; I wouldn't say it has worse IQ than the 8800s, but it's somehow different. Also, it plays BioShock way better than a GTS 320 will. It seems to really shine in this game: with all in-game settings maxed at 1650x1050 with 4xAA and 8xAF, I get the same frames or better than the GTS 320 with no AA and AF and the other settings the same, maybe even a little higher average FPS.
 
Fact is, the 8800GT will perform like the current 8800GTX but be cheaper than the GTS... :cry:

BTW, is this with PCI-E 2.0 or still the old spec?
 
At first I thought someone left off an "S" or an "X".

I figured it would do a bit better. Also it's 512MB? 256bit...

END OF THE 8800GTS'S DAMMIT. (A2 core anyway)

Hazaro's OVERCLOCKED A2 CORE (Rig in sig) said:
3DMark Score 11593 3DMarks
SM 2.0 Score 5228 Marks
SM 3.0 Score 5019 Marks
CPU Score 3015 Marks

8800GT rig (Stock) said:
3DMark Score 12072 3DMarks
SM 2.0 Score 5431 Marks
SM 3.0 Score 5232 Marks
CPU Score 3148 Marks

Oh and they look way more streamlined...
 

Attachment: 02-GF8800GT.jpg
Yeah, but those scores once again are with a 3.6 E6600, and the card is stock 600/1800 with a 1500MHz shader, so if you brought that CPU down to yours, Hazaro, it would probably score around 10k.
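For reference, 3DMark06 folds the graphics and CPU results into the overall number with a weighted harmonic mean, so the total really does swing with the CPU score. The sketch below uses a weighting that reproduces both totals posted above; the 2650 CPU score I plug in for a stock ~3.0GHz C2D is just an assumption, not a measurement.

# Rough sketch of how 3DMark06 combines its sub-scores (Python).
# The weighting below reproduces the 11593 and 12072 totals posted above.

def total_06(sm2, sm3, cpu):
    gs = (sm2 + sm3) / 2.0               # graphics score: average of SM2.0 and SM3.0
    return 5.0 / (1.7 / gs + 0.3 / cpu)  # CPU-weighted harmonic mean

print(total_06(5431, 5232, 3148))  # ~12072 -- the posted 8800GT run on the 3.6GHz E6600
print(total_06(5431, 5232, 2650))  # ~11570 -- same graphics scores, assumed ~3.0GHz CPU score

The SM2.0/SM3.0 sub-scores themselves also dip on a slower CPU, which is presumably where the ~10k estimates come from.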
 
Also, it plays BioShock way better than a GTS 320 will. It seems to really shine in this game: with all in-game settings maxed at 1650x1050 with 4xAA and 8xAF, I get the same frames or better than the GTS 320 with no AA and AF and the other settings the same, maybe even a little higher average FPS.

http://www.tweakguides.com/Bioshock_5.html said:
Antialiasing Support: By default BioShock does not support any Antialiasing (AA) to smooth out jagged lines. The reason the game doesn't support AA natively is because it uses a form of rendering called Deferred Lighting, which is supposed to be incompatible with Antialiasing in DX9. It may be possible to allow AA under DX10 according to the developers of the Unreal Engine 3, but this will likely require a patch, and if this occurs I will update this section accordingly. In any case, if these 'jaggies' bother you, you can attempt to force AA to be applied in BioShock, though note that you will experience a decrease in performance, and you may experience graphical glitches. The procedure is not completely straightforward, so read the following to determine how to do it for your system:

# Antialiasing can only be forced in BioShock through the graphics card control panel in both Windows XP and Windows Vista, but only in DX9 mode, not in DX10, and also only on GeForce 8 or ATI X1X00 series cards or newer. Note that in Vista, "proper" DX9 mode will have to be forced via the -DX9 switch (see DirectX 10 Detail Surfaces further below for the reason).

# For Nvidia users, make sure to use the 163.44 Forceware or newer which are specifically designed for BioShock compatibility. In Windows XP you can force AA through the Forceware Control Panel as normal, but in Windows Vista you will need to go to the \Program Files\2K Games\BioShock\Builds\Release directory and rename the BioShock.exe to R6Vegas_Game.exe. Then right-click on your BioShock launch icon, select Properties and in the Target box add -dx9 one space after the end of the line and click OK. It should look something like this:

"E:\Program Files\2K Games\BioShock\Builds\Release\R6Vegas_Game.exe" -dx9

# For ATI users, you can only force AA in XP or Vista if you rename your Bioshock.exe file to Oblivion.exe, make sure Catalyst AI is activated in your Catalyst Control Center, and then right-click on your BioShock launch icon, select Properties and in the Target box add -dx9 one space after the end of the line and click OK, similar to the example above.
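To spell out what that rename-and-launch step amounts to for the Nvidia case, here's a minimal Python sketch; the install path is an assumption (match it to your own), and it copies rather than renames so the original launcher keeps working:

# Copy BioShock.exe to R6Vegas_Game.exe and launch it with -dx9,
# per the quoted guide. Install path below is assumed -- edit to suit.
import shutil
import subprocess

install_dir = r"C:\Program Files\2K Games\BioShock\Builds\Release"
original_exe = install_dir + r"\BioShock.exe"
renamed_exe = install_dir + r"\R6Vegas_Game.exe"

shutil.copy2(original_exe, renamed_exe)  # leave the original in place
subprocess.run([renamed_exe, "-dx9"])    # same effect as the shortcut Target example above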

Are you running in DX9 mode? I'm just curious if there is a way to run the game in DX10 mode w/ AA yet. I installed VISTA (dual boot w/ XP-32) to run this game w/ the DX10 eye-candy, but I had to give up AA support.
 
Nope, Vista 64. I swear I can see the difference with AA and AF enabled in my drivers. Maybe I am wrong; maybe it just looked better without AA on the 2900XT than it did on the 8800GTS I had, but I swear the edges are smooth as hell.
 
That's what I thought too, but when I went from, say, 16xQ AA to 4x AA (forced in the nVidia CP) I noticed no gain in FPS, so I dug deeper and discovered this. You can set the AF in the .ini file or force it, but I think the game defaults to 4x AF, and I don't like to have conflicting settings: for me it's either disabled in the .ini file and forced via the CP, or set via the .ini file. I currently have it set to 8x AF via the .ini file.
 
I haven't tried that, but according to the write-up you have to force DX9 mode for that to work. DX10 or AA, but not both, according to the article and anything I can find online.
 
Hmm, I might upgrade to one of these this spring. They do indeed look very promising, but I've got a bad feeling that the 256-bit bus might hurt performance once you turn on the eye candy.

dan
 
With no GFX card right now, I have been drooling over this, and the new ATI launch in mid-November needs to hurry up and get here already.
 
If that score on XS is right, a stock-clocked 8800GT pulling 12k in '06... that's the end of the 8800GTS and GTX.

Yeah, but that's how you know it's not true. They would not release a "cheaper" version of the same card that scores higher. Some driver hacking going on there.

256-bit memory controller? Doesn't the GTS have 384 or something?
 
Yeah, but that's how you know it's not true. They would not release a "cheaper" version of the same card that scores higher. Some driver hacking going on there.

256-bit memory controller? Doesn't the GTS have 384 or something?

The GTS has a 320-bit memory bus... the GTX has 384... and neur0mancer, you're right, they wouldn't release a card that's cheaper and ran better.

You guys are getting all worked up over this card; wait for actual people to get it and get benchies out, because I can GUARANTEE you that the 8800GT is NOT going to be better than the "new" 8800GTS with 112 SPs or even the "old" 8800GTS with 96 SPs. I mean, it's supposedly going to be priced LESS than current stuff... if it performed better than a ridiculously clocked 8800GTS and C2D, wouldn't you think it would be priced MORE than the 8800GTS?

I mean, hell, with my C2D @ 3.4GHz and my 8800GTS 640 @ 630/2000 with a 320-bit memory bus I can only pull like 11k... how is a card that's clocked at 600/1800 with LESS memory and LESS memory bandwidth going to beat something with better specs?

The ONLY way I could even remotely see the 8800GT beating an 8800GTS would be some MASSIVE renovations to the core of the thing... and I think all they really did was a die shrink.
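For what it's worth, the back-of-envelope bandwidth numbers line up with that. Quick sketch below; the GT's 1800MHz figure is the rumored stock clock from earlier in the thread, and the GTS/GTX clocks are the usual stock ones, so treat the exact figures as assumptions until reviews land.

# Memory bandwidth = (bus width in bytes) x (effective memory clock).
def bandwidth_gbs(bus_bits, mem_mhz_effective):
    return bus_bits / 8 * mem_mhz_effective / 1000.0   # GB/s

print("8800GT  256-bit @ 1800:", bandwidth_gbs(256, 1800))  # 57.6 GB/s
print("8800GTS 320-bit @ 1600:", bandwidth_gbs(320, 1600))  # 64.0 GB/s (stock)
print("8800GTS 320-bit @ 2000:", bandwidth_gbs(320, 2000))  # 80.0 GB/s (the OC'd GTS above)
print("8800GTX 384-bit @ 1800:", bandwidth_gbs(384, 1800))  # 86.4 GB/s

So on paper the GT really would have the least bandwidth of the three; if it still keeps up, it's doing it on the core and shaders, not the bus.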
 