
eVGA GeForce 6800 GT System Performance


DarkMatter13

Member
Joined
Jul 4, 2004
Location
New York, United States
Below are the overall performance numbers produced by the machine in my sig. I've included CPU and memory scores as a frame of reference for the performance scalability of the GF68 series of video cards.

All testing was done on Nvidia WHQL driver series 61.36, with the following settings:
AntiAliasing: Off
Anisotropic Filtering: Off
Vertical Sync: Off
Trilinear Optimizations: Off
Image Quality: High Quality

ALL SCORES ARE THE WEIGHTED AVERAGE OF THREE INDEPENDENT RUNS OF ALL TESTS
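(For anyone wanting to reproduce the averaging: below is a minimal Python sketch of how such an average can be computed. It is purely illustrative, not the script used here; equal weights for the three runs are an assumption, since the actual weighting was not posted, and the run values are hypothetical placeholders.)

```python
# Illustrative sketch of averaging repeated benchmark runs.
# ASSUMPTION: equal weights per run; the actual weighting used in this report was not posted.
def average_runs(scores, weights=None):
    """Return the (weighted) average of a list of benchmark scores."""
    if weights is None:
        weights = [1.0] * len(scores)  # default to equal weights
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# Hypothetical example: three 3DMark03 runs
runs = [10960, 10975, 10990]
print(f"Average of three runs: {average_runs(runs):.0f} 3DMarks")
```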

Intel Pentium IV at 3.0 GHz w/ 800 MHz FSB | GeForce 6800 GT at 350 MHz/500 MHz DDR

Sandra Scores:
1) Arithmetic: Dhrystone: 9063 MIPS
Whetstone: FPU/iSSE2: 3794/6606 MFLOPS
2) Multimedia: Integer iSSE2: 23,066 IT/s
Floating Point iSSE2: 32,982 IT/s
3) Memory Bandwidth: Integer Buffered: 4848 MB/s
Floating Point Buffered: 4860 MB/s

3DMark2003 Build 340
1) 3DMarks: 10,975
2) CPU Marks: 806

Intel Pentium IV at 3.0 GHz w/ 800 MHz FSB | GeForce 6800 GT at 420 MHz/575 MHz DDR

Sandra: All previous scores apply

3DMark2003 Build 340:
1) 3DMarks: 12,459
2) CPU Marks: 810

Intel Pentium IV at 3.75 GHz w/ 1000 MHz FSB | GeForce 6800 GT at 350 MHz/500 MHz DDR

Sandra:
1) Arithmetic: Dhrystone: 11,310 MIPS
Whetstone: FPU/iSSE2: 4705/8351 MFLOPS
2) Multimedia: Integer iSSE2: 28,874 IT/s
Floating Point: 40,045 IT/s
3) Memory Bandwidth: Integer Buffered: 5,433 MB/s
Floating Point Buffered: 5,353 MB/s


3DMark2003 Build 340:
1) 3DMarks: 11,214
2) CPU Marks: 939

Intel Pentium IV at 3.75 GHz w/ 1000 MHz FSB | GeForce 6800 GT at 420 MHz/575 MHz DDR

Sandra: All previous scores apply

3DMark2003 Build 340:
1) 3DMarks: 12,863
2) CPU Marks: 941
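
(To make the scaling across the four configurations above easier to compare at a glance, here is a short Python sketch, added for illustration only, that computes the percentage gains from the 3DMark03 totals posted above.)

```python
# Percentage gains computed from the 3DMark03 totals posted above.
stock_cpu_stock_gpu = 10975  # 3.0 GHz CPU,  350/500 GPU
stock_cpu_oc_gpu    = 12459  # 3.0 GHz CPU,  420/575 GPU
oc_cpu_stock_gpu    = 11214  # 3.75 GHz CPU, 350/500 GPU
oc_cpu_oc_gpu       = 12863  # 3.75 GHz CPU, 420/575 GPU

def gain(new: int, old: int) -> float:
    """Percentage change from old to new."""
    return (new - old) / old * 100

print(f"GPU OC alone: +{gain(stock_cpu_oc_gpu, stock_cpu_stock_gpu):.1f}%")  # ~13.5%
print(f"CPU OC alone: +{gain(oc_cpu_stock_gpu, stock_cpu_stock_gpu):.1f}%")  # ~2.2%
print(f"Both OCs:     +{gain(oc_cpu_oc_gpu, stock_cpu_stock_gpu):.1f}%")     # ~17.2%
```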

The 61.36 WHQL drivers seem to level off the performance of the 6800 GT to a sort of flat mid-range plateau. They result in a roughly 250-point 3DMark drop from my high of 13,104 using drivers 61.71, which can be viewed HERE.

The eVGA GeForce 6800 GT, overall, is an excellent video card. Its performance scales directly with the specs of the system it is dropped into. If anyone was looking to buy this card but was unsure about compatibility or performance scalability, I hope this has helped you come to a decision, in one direction or another.

HAPPY GAMING!!!

DarkMatter13
 
Wow. I have to stop going to these forums. Every time I look at them I want the 6800 GT more and more. I don't have any more time for games!
 
I just ordered a BFG 6800 GT OC about an hour ago. I keep thinking I should cancel my order due to lack of funds, but if I keep seeing numbers like this I don't think I will. That's an awesome overclock, btw.
 
Have you had any problems with these drivers running Far Cry or any other games? The 61.36s kept locking up on me in Far Cry, although my 3DMark03 score of 10,969 was fairly close to your results.

PS. Nice report. :)
 
No, I haven't had any lockup issues as of yet. I had a few random reboots originally, which required some voltage tuning; the 6800 and the overclocked P4 draw A LOT of juice. I did have a serious problem after uninstalling the 61.71s and trying to install the 61.36s, and I actually had to do a fresh install to solve it. I'm not sure whether it was corruption from leftover 61.71 driver files that never got removed, though. I haven't reinstalled Far Cry yet, as I just got back up and running and the first order of business was putting together the review/benchmark. I have reinstalled Max Payne 2, though, and haven't seen anything, not even so much as a hiccup. Best of luck sorting out your issue, and I'm sorry I couldn't be more help.
 
DarkMatter13,

what bothers me is that the more I see people with the 6800gt/u cards...
the more I realise that a faster system might not help much.

you seemed to gain, ONLY, 400 marks from your CPU OC...
that's far different from what I'm seeing from the ATI cards.

is your real world gaming getting any better with the OC?
(in other words, do you feel that your card has a lot of performance left in it?)

mica

(this is not aimed at getting a bunch of flames, this is for information only.)
 
well, I could give you my experiences with the 6800gt. I don't know if you were aware of my problems when I got the card, but while trying to figure out the problem I dropped my OC to default to see if my CPU OC was the cause, and it was. I ran 3dmark03 with a stock 2800+ (that's 166 FSB @ 2.09ghz) and got ~10,900. Then I set it back to the OC in my sig, adding .025V to the core, and got ~11,200. This was with 3dmark03 build 420; the scores I posted in my thread were build 320.

micamica1217 said:
DarkMatter13,

what bothers me is that the more I see people with the 6800gt/u cards...
the more I realise that a faster system might not help much.

you seemed to gain, ONLY, 400 marks from your CPU OC...
that's far different from what I'm seeing from the ATI cards.

is your real world gaming getting any better with the OC?
(in other words, do you feel that your card has a lot of performance left in it?)

mica

(this is not aimed at getting a bunch of flames, this is for information only.)
 
bobmanfoo said:
well, I could give you my experiences with the 6800gt. I don't know if you were aware of my problems when I got the card, but while trying to figure out the problem I dropped my OC to default to see if my CPU OC was the cause, and it was. I ran 3dmark03 with a stock 2800+ (that's 166 FSB @ 2.09ghz) and got ~10,900. Then I set it back to the OC in my sig, adding .025V to the core, and got ~11,200. This was with 3dmark03 build 420; the scores I posted in my thread were build 320.

thanks, it seems that you too got only about 300 more marks.

the reason I'm asking is that I only tested my rig at 3.67ghz... this is my everyday OC on my CPU.
when going back to stock, I get 500 marks less.

I'm starting to think two things:

1) the 6800 cards do a lot better in 3dmark03 than the x800 cards...
no matter what the real world gaming is like.

2)the x800 cards might be held back by slow CPU/system speed.
(when going from a 2.4b OCed @ 2.9ghz, I got 1000 fewer marks than @ 3.4ghz... I never realised the true performance of my non-OCed 9700pro till then.)

I'm wondering if a faster system will wake up the new cards or not. :confused:

mica
 
Well, my personal take on it is this. The sheer on-die capabilities of the 6800 series make the GPU core itself the major workhorse of the entire AGP subsystem. However, it appears that as relative bus speed climbs, the 6800 series performance curve climbs as well, albeit on a mostly flat curve. I attribute this, in all likelihood, and as far as I can see in any case, to the increased bandwidth of the AGP data path from CPU, to RAM, to AGP.

In the case of the GPU core itself, increased clocks result in a definitive performance enhancement, as witnessed in the comparison of stock clockspeeds to the ideal overclock for both CPU and GPU. The GPU core overclock to 420 MHz is roughly a 20% clock frequency increase, and it yields what amounts to a 15% increase in GPU rendering performance. In other words, performance scales nearly in step with core frequency: about 0.75% of rendering performance for every 1% of core clock, or roughly 4.7 MHz of core overclock for each 1% gained over stock speeds. As I mentioned in another thread, the "Detect Optimal Frequencies" option in CoolBits sets the clocks at 429/1200 respectively. However, at that clock rate, performance actually suffers. I attribute this to the intentional underpowering of the 6800 GT chipset. I think that if one could provide the 6800 GT with more voltage, she would continue on that neat performance curve for another 25 to 30 MHz.

So, in answer to your question: do I think the chip itself has more room to stretch its legs buried inside? Yes, with no question. Do I think I will be able to coax it out of the chip without physical modification of the card? No, not at all. Am I willing to alter the voltage regulation architecture of the card, thus utterly and completely rendering null and void any and all warranty on a $400 item? I think that answer is perfectly clear: it's a resounding no.
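
(As a sanity check on the figures in the previous paragraph, here is a small Python sketch, added for illustration only, that reproduces them from the stock and overclocked core clocks and the roughly 15% rendering gain reported above.)

```python
# Reproduces the clock-vs-performance estimate from the paragraph above.
stock_core_mhz = 350
oc_core_mhz    = 420
perf_gain_pct  = 15.0  # approximate 3DMark03 gain from the GPU overclock, as reported above

clock_gain_pct  = (oc_core_mhz - stock_core_mhz) / stock_core_mhz * 100  # 20.0%
mhz_per_percent = (oc_core_mhz - stock_core_mhz) / perf_gain_pct         # ~4.7 MHz per 1% gained
scaling_ratio   = perf_gain_pct / clock_gain_pct                         # ~0.75% perf per 1% clock

print(f"Core overclock: +{clock_gain_pct:.0f}% ({oc_core_mhz - stock_core_mhz} MHz)")
print(f"About {mhz_per_percent:.1f} MHz of core overclock per 1% of performance gained")
print(f"Scaling: {scaling_ratio:.2f}% performance per 1% of core clock")
```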

As for real world gaming performance, as I said this was after a fresh install from a nasty bout I had with the remnants of the 61.71 driver package, and with the exception of Max Payne 2, I haven't really had a chance to reinstall any games. I will however, as soon as I have the chance, do some gaming, and will try and bring some results to the table for discussion.

Just to be clear, I am not a "fanboy" in any case, neither Nvidia nor ATI. In fact, the first high-end AGP card I ever purchased was a Powercolor ATI Radeon 9700 Pro. That card caused me severe compatibility issues with the then-current SiS-chipset Asus motherboard I was using, and since then I cannot bring myself to spend the money on an ATI graphics chip. The purpose of this post was not to proclaim the 6800 series the "Champion of All Gaming Realms and Master of All Rendered Polygons". It was intended simply to show the performance of this machine, and if it helps anyone come to a decision on which card to buy, so much the better. I fail to see any huge disparity between the two new series of AGP cards. Fanboys, everyday gamers, and collectors of hot hardware alike would be equally satisfied with either the X800 or the GF68 series.

DarkMatter13
 
DarkMatter13,

what I'm trying to say is....
do you think that with a faster CPU/system (say you got your chip to 4.2ghz), your card would give you a far better score in 3dmark03, or in real world gaming?

with a system as fast as yours, do you think this 6800gt still has life left in it, if you got a much faster rig?

mica
 
micamica1217 said:
DarkMatter13,

what I'm trying to say is....
do you think that with a faster CPU/system (say you got your chip to 4.2ghz), your card would give you a far better score in 3dmark03, or in real world gaming?

with a system as fast as yours, do you think this 6800gt still has life left in it, if you got a much faster rig?

mica

Mica,
Yes, I actually do. As I noted in the opening paragraph of my initial response: "...as relative bus speed climbs, the 6800 series performance curve climbs as well, albeit on a mostly flat curve. I attribute this, in all likelihood, and as far as I can see in any case, to the increased bandwidth of the AGP data path from CPU, to RAM, to AGP."

Mind you, it would not be as substantial as direct GPU/GDDR overclocking, which I also noted, but it does positively affect performance. As for whether I would achieve a "far better score" simply through an FSB overclock, I don't believe so, no. The gain lies in coupling the graphics chip overclock with the FSB overclock and the resulting increase in data throughput.

So, in summation, I would say that barring a miraculous release of a 4 GHz+ CPU tomorrow, the performance curve of the GF68 series is limited only by how far you are willing to go to wring every last nugget of performance from every gate and molecule of silicon in the chipset. As for me, I have hit the extent of what I am willing to risk, and am 100% satisfied.

I hope this has answered your question mica. :beer:
 
DarkMatter13 said:
I hope this has answered your question mica. :beer:

I think it has. thanks.
I'm thinking that we are not even seeing the true speed of the x800pro at my OCed speeds yet...
only time will tell. (and a faster system.)

mica
 
micamica1217 said:
thanks, it seems that you too got only about 300 more marks.

the reason I'm asking is that I only tested my rig at 3.67ghz... this is my everyday OC on my CPU.
when going back to stock, I get 500 marks less.

I'm starting to think two things:

1) the 6800 cards do a lot better in 3dmark03 than the x800 cards...
no matter what the real world gaming is like.

2)the x800 cards might be held back by slow CPU/system speed.
(when going from a 2.4b OCed @ 2.9ghz, I got 1000 fewer marks than @ 3.4ghz... I never realised the true performance of my non-OCed 9700pro till then.)

I'm wondering if a faster system will wake up the new cards or not. :confused:

mica

Keep in mind that the x800 series is simply an "upgrade" of the R360/R350 cores that appeared in the 9800 Pro, whereas the 6800 series is based on an architecture completely different from that of the NV35 series (GeForce FX 5700/5900/5950). That is what most likely accounts for the large difference in 3DMark 2003: the GF 6 was built for the sole purpose of handling nothing but the newest 3D apps and models. However, I am in no way trying to :mad: flame :mad: the x800 series, as it has its own merits. I'm just saying that this difference is the result of two very different visions at ATi and Nvidia.
 
deception`` said:
Keep in mind that the x800 series is simply an "upgrade" of the R360/R350 cores that appeared in the 9800 Pro, whereas the 6800 series is based on an architecture completely different from that of the NV35 series (GeForce FX 5700/5900/5950). That is what most likely accounts for the large difference in 3DMark 2003: the GF 6 was built for the sole purpose of handling nothing but the newest 3D apps and models. However, I am in no way trying to :mad: flame :mad: the x800 series, as it has its own merits. I'm just saying that this difference is the result of two very different visions at ATi and Nvidia.

while I agree that the 6800s are based on a new core...
to say that the x800 is based on, or is just an upgrade to, the 9800pro is far from the truth.

it's closer to an upgrade of the 9600xt core than of the 9800s...
yet with far more vertex shaders and pipes, it's not even in the same ballpark as the 9600s.

now as to the 3dmark03 scores:
please note that while the 6800gt/u cards are doing better than the x800pro/xt cards in 3dmark03...
they seem not to be able to handle the real games as well as an x800pro.

I would say that the 3dmark03 scores (on the 6800gt/u) have more to do with drivers than with the chip itself... IMHO.

mica
 
I think that it is almost impossible to declare an inherent "winner" amongst this round of video cards, which is essentially a good thing. Most earlier reviews would have you think that it's the x800 series hands down, but over time the benchmarks and thoughts have shown that the two cards are in fact very much equal in many ways. Also, my own experiences have taught me that many benchmarks should be taken with a grain of salt. Hell, two sites could have the exact same setups and somehow I think they'd never come up with the same scores (especially if one was Tom's Hardware :D). Anyway, I think that personal preference will determine what the user buys - whether that be brand loyalty, differences in features (such as SM 3.0), or system requirements (such as the sketchy reports of GF 6xxx power consumption).
 
deception`` said:
I think that it is almost impossible to declare an inherent "winner" amongst this round of video cards, which is essentially a good thing....

Oh, I not only agree, but I'm in no way declaring a winner at this time.

my thoughts were just based on my last experience with the 9700pro (at stock)...
that seemed to wake up, and give me 1000 more marks in 3dmark, when I went from 2.9ghz to 3.4ghz, with a 2.4b CPU.

I'm just wondering, that's all... no more, no less.

mica
 
UPDATE:...

It appears my power supply is either unwilling or unable to keep happily providing both the Northwood and the 6800 GT with the massive amounts of power each of them so wants and loves. As a result, I have been forced to return the CPU to stock clock speeds until I can either

a) find a happy medium CPU/RAM clock at 1:1
b) find an equally happy clock where the 6800 GT and the Northie decide to play nice
c) scrounge up my pennies and purchase a new PSU

(note to self: never buy another Aspire Turbo PSU, they only last about a month. :bang head )
 
http://www.guru3d.com/article/Videocards/135/3/

Sounds odd that this would happen, seeing as Guru3D was able to run it with a Prescott off a 380W PSU with no problems. However, it could very well mean your PSU is s***. I'd suggest a Fortron 530W PSU, which is only $75 at Newegg and has adjustable pots.

On a side note, I think it's a shame that people spend so much on the CPU/mobo/video card/RAM but tend to overlook the importance of a good PSU. A few extra dollars here and there in the beginning can often save much more money/time/headaches in the long run.

Check out the Fortron here: http://www.newegg.com/app/viewProductDesc.asp?description=17-104-968&depa=0
 