
My 5900XT shows little improvement


kihjin

New Member
Joined: Jul 6, 2004
I just upgraded my system a few days ago. Here's what I used to have:

PNY Geforce FX 5200 Ultra
Corsair Value Select 256 MB DDR pc2100
Crucial 256 MB DDR pc2100

Now I have:

Gainward Geforce FX 5900 XT (Ultra/1100 "Golden Sample")
Corsair XMS pc3200 512 MB DDR

What I still have is:

AMD Athlon XP 2100+ (running at 1.74 GHz)
Epox 8RDA+


Okay. I had to increase the memory voltage to 2.63 V, otherwise I would experience freezes. memtest86 has completed 3 passes (of all 12 tests) without any errors. I've got the memory running at DDR400 (200 MHz frequency) with 6-3-3-2 timings.

I have 8x AGP enabled in the BIOS, and the AGP frequency locked at 66 MHz. My AGP voltage used to be at 1.5 V, but during gameplay I was getting system crashes, so I'm thinking it needs to be higher. With it all the way up, I experienced no crashes during gameplay. I was unable to find the standard setting on Google, Newegg, or anywhere else.
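If it matters, the AGP rate that actually got negotiated can also be checked from Linux without rebooting into the BIOS. A rough sketch of what I mean (01:00.0 is an assumed address for the video card; find the real one with plain lspci, and run it as root so the capability block gets printed):

Code:
# Sketch only: check what AGP rate was actually negotiated.
# 01:00.0 is assumed to be the video card; find yours with plain `lspci`.
lspci | grep -i "agp\|vga"
lspci -vv -s 01:00.0 | grep -i "agp\|rate"
# The "Command:" line's Rate= value (with AGP+) is what is in use;
# the "Status:" line lists what the card supports.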

So, on to the focus of my post: I'm seeing little improvement over my original setup. Please review the screenshots I've taken below.

http://kyle.frozenonline.com/QPGLS859SJ/

I've got the directory set up so Apache indexes it, so just go there. Even at 800x600, my FPS does not increase all that much. Also realize that the screenshots only show moments when my FPS was low; in other situations (small rooms or tunnels, for instance) it may go up to 100, 125, etc.

Drivers? Yes, I am using the latest. The 'nvidia.png' screenshot shows the details.
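(The driver version can also be read straight from the loaded kernel module instead of a screenshot; a quick check, assuming the nvidia module is loaded:)

Code:
# The nvidia kernel module reports its version under /proc once loaded.
cat /proc/driver/nvidia/version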

So I'm wondering if there's some sort of limiting factor causing the problem. I was under the impression that going from a 5200 Ultra to a 5900XT would produce a noticeable improvement. Right now I'm not seeing much at all.

Any ideas? Help? Thanks.
 
This always happens. I have a problem, so I wait a while before doing anything about it. Then I set out to solve it and can't, so I post to a forum hoping someone else can help.

Before anyone is able to help, though, I end up solving it myself. Which is what I did. What was wrong, you ask?

Well, the "nvidia.png" image sums it up fairly well. If you notice, it says the BUS is "PCI". That doesn't sound right.

I ended up removing two lines from my XF86Config file: Option "NVAGP" and Option "NODCC", and voila. Awesome FPS now. Woo.
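In case it helps someone searching later, here's roughly where those lines live. This is only a sketch of a Device section; the Identifier and the option values are illustrative, not a copy of my file (if I remember the Nvidia README correctly, NvAGP "0" disables AGP outright, which would line up with the bus showing as PCI):

Code:
Section "Device"
    Identifier  "NVIDIA Card"      # identifier made up for this example
    Driver      "nvidia"
    # The two Option lines I deleted (values here are illustrative):
    # Option    "NvAGP"  "0"
    # Option    "NODCC"
EndSection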

But the AGP voltage I should use still eludes me. I have the following to choose from: 1.5, 1.6, 1.7, 1.8, and 1.9 V.

Thanks.
 
Something is definitely set up wrong. Admittedly I ran at around 3.7 GHz with my 5900XT, but it would just break 20k in 3DMark2001 and over 6k in 3DMark03. Xbit Labs (I believe) vmodded one and got it to the top of the class in the ORB at 27k in 3DMark2001. A very good card for the money, especially when it came out. I have since seen the price on at least some of them go up.
 
Well, yes. As I said in my previous post, I was able to discover the problem and fix the issue.

The Nvidia chipset drivers (which include the AGPGART module) are included in the newer kernels. About six months ago, prior to the release of kernel 2.6, I was on the 2.4.* kernel series, whose Nvidia support was critically lacking, and the workaround was to use the NvAGP option in the XF86Config file.

Now I use kernel 2.6.7 (well, I've been using it since it was released a few weeks ago) and already had the Nvidia chipset drivers enabled and compiled in, except I forgot to remove the NVAGP (and NODCC) option lines from the XF86Config file. Basically, my X server was not using the AGP hardware to communicate with the card, so my bandwidth was significantly reduced even though I had AGP capability.
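A quick way to confirm the driver is actually using AGP after a change like this, sketched from memory (file locations may differ between driver and XFree86 versions):

Code:
# With the nvidia module loaded, its AGP status is exported under /proc:
cat /proc/driver/nvidia/agp/status
# agpgart listed here means the kernel's AGPGART is doing the work:
lsmod | grep -i agp
# The X log also records which AGP path the driver picked:
grep -i agp /var/log/XFree86.0.log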

Look here at the images prefixed with new_ (compared to the non-prefixed ones):
http://kyle.frozenonline.com/QPGLS859SJ/

Granted, as I asked in my previous post, I still do not know what AGP voltage I should have the card set to. I have the following to choose from in the BIOS: 1.5, 1.6, 1.7, 1.8, 1.9 volts.

Also, my card is rated to run at a 450 MHz core and 780 MHz memory. However, that does not seem to be the case:

Code:
[frozen@home ~/srcs/nvclock/src]$ ./nvclock -i
NVClock v0.7

-- General info --
Card:           nVidia GeforceFX 5900XT
PCI id:         0x332
GPU speed:      300.857 MHz
Bustype:        AGP

-- Memory info --
Amount:         128 MB
Type:           256 bit DDR
Speed:          702.000 MHz

-- AGP info --
Status:         Enabled
Rate:           8X
AGP rates:      4X 8X 
Fast Writes:    Disabled
SBA:            Enabled

The GPU speed does go up to 390 MHz when I'm running a 3D game; 300 MHz seems to be the 2D operating mode. However, the memory speed does not increase at all, even though it should run at 780 MHz (according to the details listed at Newegg). The GPU should actually run at 450 MHz, but it does not.
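If the clocks really are stuck below the rated 450/780, nvclock can set them as well as report them. A rough sketch, assuming the -n/-m options behave on this card the way the nvclock documentation describes (and that the memory value uses the same doubled DDR figure nvclock prints):

Code:
# Sketch only: push core/memory toward the rated speeds, then re-check.
# Step up gradually and watch temperatures rather than jumping straight there.
./nvclock -n 450 -m 780
./nvclock -i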
 
On the AGP I normally run 1.6 V unless I am trying to tweak out the last bit on a 3D run. 1.6 should suffice.
 