
(G92) 8800GTS 512mb thread

To claim a "stable" clock, how long do you guys run either a loop of the first 7 tests in 3DMark06 or the ATI Tool artifact-testing utility?
I do realize that it's a bit of personal taste, but what should be, say, the forum standard for this card?
 
Yes, but you may have to make a slight modification for it to recognize that 8800.

And what would that modification consist of? Any info?


I just installed my 8800 GTS 512. Anyone overclock using RivaTuner?


Where can I download the latest version, anyway?
 
To claim a "stable" clock, how long do you guys run either a loop of the first 7 tests in 3DMark06 or the ATI Tool artifact-testing utility?

I like to go about 15-25 mins of ATI Tool w/ no artifacts. You can tell if you had an artifact while out of the room b/c the two timers won't match.

The GT and GTS seem to have a memory "avalanche" problem that VJ has discovered the weak spot for, so a little additional testing is necessary for total stability. For the GT, run 8+ loops of 'Battle of Proxycon' from 3DMark03; if you get through clean you should be pretty good. (For me, F.E.A.R. also brought out this issue very quickly on the GT.) For the GTS, run 8, maybe 10, loops of 'Canyon Flight' from 3DMark05. Again, get through clean and you should be rock-solid.
 
Can someone help me understand what I'm giving up if I decide to replace my 512MB 8800GT SLI setup with a single 512MB 8800GTS?

I see plenty of people getting around 16k with a single 512MB 8800GTS, and that's what I got with my GTs in SLI. They bench the same, but is the real-world performance going to be the same?

Does it only depend on what resolution I want to run? Anything else?
 
I've got an XFX 8800GTS 512 Alpha Dog and the card is incredible. I love the thing so much. It also overclocks like a beast. Stock, the card is 650/1625/972 (core/shader/memory) and now it's running at 800/1900/1075. My 3DMark score is held back by my CPU, but I still posted 11,447 3DMarks. These cards are amazing.

But 3.2 GHz shouldn't be a real bottleneck, right?
 
Can someone help me understand what I'm giving up if I decide to replace my 512MB 8800GT SLI setup with a single 512MB 8800GTS?

There is absolutely no reason to do that. It would be a step downhill.

Because the CPU score is included in the 3DM06 overall score, you can't go by what anybody else's system does in it. CPU/system speed has a huge impact on the 3DM06 overall score. Just having an Intel C2Q is worth 1200-1400 more overall points over a same-clocked C2D, purely from the better CPU score the quad gives.

You can only go by relative scores on the same (your) machine.

Viper
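To put numbers on Viper's point: 3DMark06's overall score is a weighted harmonic-style combination of the graphics and CPU subscores. A quick sketch using the weighting from Futuremark's 3DMark06 whitepaper, which exactly reproduces the overall scores posted with subscores later in this thread:

```python
def mark06_overall(sm2, sm3, cpu):
    """3DMark06 overall score from the SM2.0, HDR/SM3.0, and CPU subscores.

    GS (the graphics score) is the mean of the two graphics subscores;
    overall = 2.5 / ((1.7/GS + 0.3/CPU) / 2).
    """
    gs = (sm2 + sm3) / 2
    return 2.5 / ((1.7 / gs + 0.3 / cpu) / 2)

# Subscores from the stock-clocked GTS run posted later in this thread:
print(round(mark06_overall(5631, 5682, 2794)))  # -> 12258 (matches the posted overall)

# Same graphics subscores, but a quad-level CPU subscore of ~4500
# (the Q6600 @ 3.2GHz figure quoted later in the thread):
print(round(mark06_overall(5631, 5682, 4500)))  # -> 13616
```

That jump of roughly 1350 points comes entirely from the CPU term, with the graphics subscores unchanged, which is right in the 1200-1400 range quoted above.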
 
Can someone help me understand what I'm giving up if I decide to replace my 512MB 8800GT SLI setup with a single 512MB 8800GTS?

Umm, looking at your sig, your system should be getting much higher scores than 16k. I get 15.5k with a single GT! It looks like one of your cards isn't doing anything!
 
Can someone help me understand what I'm giving up if I decide to replace my 512MB 8800GT SLI setup with a single 512MB 8800GTS?

Just stay with what you have, and if you're looking for better 3DM06 scores, then OC your CPU & GPUs more. As has been stated, increasing your CPU OC won't have any real-world effect in games, but it will boost your score if you're looking for bragging rights.

I got over 16k w/ 1 GTS, but I also had my Quad at 3.8GHz when I got that score. I was getting 14.8k w/ 1 GT in my rig at 3.6GHz CPU.
 
Thanks guys. The thing is, I have to RMA my XFX 680i mobo, but ZipZoomfly no longer sells this mobo. They said my RMA could be for a refund only.

I also have to RMA one of my GT cards with Dell. They actually said I could exchange or return one or both GT cards for no fees.

So this situation has me considering entirely different mobo/VGA builds, e.g. P35/X38 with a single card (GTS 512 or Ultra).

The kicker is I have to decide today or tomorrow as my RMA window is running out.

Anyway, I've been thinking about getting the abit IP35 Pro, Maximus Formula, or X38-DQ6 at this point. I guess I should take one more look at the eVGA 780i...

thanks again gentlemen :beer:
 
In that case I would send back the 680i, send back the GT's, and pocket the extra loot! I'm not a big SLI fan right now, though.

Get yourself a good GTS (eVGA), and you might want to look into the Foxconn MARS P35 MoBo. It looks like a contender, but I haven't looked into it too much.
 
8800 GT

Picked up a BFG 8800 GT OC. Was quite impressed with the performance. Even running on Vista, scores are pretty good in 3DMark. I play at 1600x1200 in COD4 most of the time, and even with AA at 4x it runs very smoothly, which is more than I can say for my Radeon HD 3870s in CrossFire. AA kills their performance, and without it the game looks like crap.
 

Attachments

  • 05 with 8800 gt.jpg (167.1 KB)
In case anyone's interested, on my XFX:
756/1890/1061 @ 60% fan speed gives me 54C idle and 72C load.

With an HR-03 GT installed, a Silverstone 92mm fan on low (not all the way down, but I can't hear it on low), and ICD7 Diamond paste, my temps went to 31C idle and 41C load.

Same load, same ambient temp.
 
Here are my results from stock speeds and stock cooler vs. OC and HR-03 GT.

Stock (650c / 1625s / 972m): Idle 47, Load 69
3DMark06: 12258
SM2.0 = 5631
SM3.0 = 5682
CPU = 2794

OC (760c / 1900s / 1062m): Idle 40, Load 59
3DMark06: 12815
SM2.0 = 5797
SM3.0 = 6229
CPU = 2792

Does this look right? Or is my suspicion that my CPU is holding it back correct?

Also, the temps look a little high...
 
Does this look right? Or is my suspicion that my CPU is holding it back correct?
Yeah, I'd wager the CPU is definitely the limiting factor. My CPU score is around 4500 with a Q6600 @ 3.2GHz.
My scores are on page 2.
 
Does this look right? Or is my suspicion that my CPU is holding it back correct?


It's the CPU. Your scores are right on par with a GT with those clocks.
 
I figured as much.

A little update, though: I flipped the heatsink around and found out I didn't use enough paste last time, but now idle temps are 34 degrees and load is 42! Also have it at 795 core / 1987 shader / 1062 memory without artifacts!

:D:D:D
 