Good mid-level card – Brian
SUMMARY: A fairly good mid-level DX9 graphics card.
This article will cover the packaging, included accessories, technical specs, and some performance numbers. I’m not going too deep into the last one, as
I’ve got an in-depth article planned, following this one and the nVidia card review (the other card Newegg sent), to cover performance in detail.
I spent several days benchmarking this card, a FX5700 Ultra, and my old GF4 Ti4200, on two systems (AMD and Intel), using eight different benchmarks ranging from older DX7
(Microsoft DirectX 7) to DX9 level programs, as well as a pair of overall system benchmarks, to see how much those numbers are affected by the graphics card used. The same hard drive
(Western Digital WD400BB) and memory (1GB (2 x 512MB) Corsair XMS PC3200C2) were used in all the runs, and the drive was formatted and a fresh install of Windows XP Pro was done between each
card. I’ll hit a couple of highlights from all of that here, but get in depth in the upcoming three-way comparo.
Let’s start off by taking a look at what you’ll get if you buy one of these cards.
Sharp eyes will notice this is the 128MB version of the card. Sharper eyes will notice the bundled software includes the latest “TOMB RAIDER” game. While I must give credit where it is due,
and say it’s good to see them actually include a game, rather than a “coupon” for one that’s supposed to come out (eventually…), they might have chosen a better one. Although this installment of the
TR series contains the best graphics to date, awkward controls and gameplay hurt its overall enjoyment. Frankly, the highlight of it is Ms. Croft’s overstuffed shirt bounding across your monitor
screen, but I digress…that’s another issue, for another day.
Sapphire was proud enough of their heatpipe cooling setup to make a window in the rear of the box so it can be seen. I’m not too keen on this, for a pair of reasons. First off, I’d prefer a picture
on the box myself, and have the card packed in an anti-static bag. Packing it this way also leaves only the shrinkwrap between that heatpipe assembly and the outside of the box. The Sapphire emblem
on the heatpipe is right against it, in fact. It seems a bit risky (my opinion) to leave it that exposed, but then I’m not a shipping engineer, so if they feel it’s safe like this…
Looking at the list of “features” on the right side of the window above, make note of the fifth one down: “Gaming Ram Sinks for great overclockability”. As we look into the box, and closer at the card
on the next page, I’d like to ask where exactly those are. (If you see them, please send me an e-mail; I couldn’t find them…)
Let’s look at what’s included in the box next.
A quick inventory of the included items:
- The card itself (not pictured here)
- DVI to VGA adapter
- S-video to TV cables
- Software bundle, including (from top): “TOMB RAIDER – Angel of Darkness”, Power DVD, overclocking utility disk, driver disk
The technical specifications, from the product listing:
- Graphics Controller: RADEON ™ 9600 PRO
- Memory Configuration: 128MB DDR
- Connectors: VGA out(15 Pin D-Sub), TV-Out(S-Video), DVI Connector
- Bus Interface: AGP 1X/2X/4X/8X
- Maximum Resolution: 2048 x 1536 @ 85Hz
- Video-Out: Supported (TV Tuner and Video-in N/A)
- Clock Speed: 400MHz
- Memory speed: 600MHz
- Operating Systems Support: Windows ® XP/2000/NT/ME/98
- 3D Acceleration Features Supported: DirectX ® 9.0, the latest OpenGL ® functionality, SMARTSHADER ™ 2.1, SMOOTHVISION ™ 2.1, VIDEOSHADER ™, and ATI’s new FULLSTREAM ™ technology
A closer look at the card and that heatpipe assembly shows how it’s mounted, and how it wraps around to the back side of the card:
At this point I’d like to reiterate the question I asked on the first page: “Gaming ram sinks?? Where?” The shot above clearly shows bare BGA memory chips. This facet must
fall under the “Specs subject to change” clause. For those interested, the ram chips are labeled: “Samsung 340 / K4D263238E-GC2A / WV1034PYA Korea”.
Let’s mount it up and take it for a spin next.
While I tested this in two systems, I’ve included the picture above for one important reason. The NF7-S motherboard has a clearance issue with physically large graphics cards.
If the card is much longer than the AGP slot itself, it runs directly under the ends of the memory slots. If it does, changing memory sticks for any reason also requires
removing the AGP card. This card is short enough that this isn’t an issue.
However, note the backside part of the heatpipe assembly in relationship to the northbridge
chip (under the smaller square waterblock). If you have a large custom cooling arrangement set up here, it might not clear the heatpipe. Something else to consider… the
waterblock here clears fine, but a larger aftermarket or custom actively cooled heatsink may or may not.
I had no clearance issues in my Intel (Abit IC7 based) system at all, and that one does sport a very oversized northbridge heatsink/fan arrangement. If that northbridge
cooler was installed on the NF7-S above, it wouldn’t even be close to clearing it.
In installing Windows and drivers, and running the system afterward (both systems, actually…AMD and Intel), I had one recurring problem with this card. Often when booting
the system, and every time I had to reboot/restart (for example, after installing drivers), I would lose the video signal. When booting, the initial POST screens would display,
and when the Windows XP splash screen came up, I’d get a blank screen. When rebooting, the POST screens wouldn’t even display. I’d have to shut the power off completely, let
the machine sit for a minute or two, and then try again.
Ordinarily, I’d lean towards this being a heat-related problem, but it would happen occasionally on stone-cold starts as well.
I did some searching of the OC Forums for posts from anyone else having issues like these with 9600 Pros, and I found only one person with the same problem.
So it doesn’t seem to be a “common” problem, but an isolated issue. What was causing it, I can’t speculate on, but it made setting this card up an absolute bear, since the majority of
driver installs, and downloading DX9.0b, XP SP1, etc., required a reboot after installation.
I’d like to think the way it’s packaged in the box isn’t related to this.
But once it got into Windows, the card did run quite well.
In the process of benchmarking this card (and the GF FX5700 Ultra, as well), I was somewhat surprised at the results. For some reason, frankly, I expected more than I got.
Comparatively, in DX7 and DX8 testing, both of them scored lower than I thought they might, and in some cases, even lower than my 4x AGP/DX8 only GF4 Ti4200.
I spent a couple of days trying to figure out why this might be, checking and rechecking every BIOS and Windows setting, and even trying different driver versions. Nothing helped. Enabling and/or
disabling various settings (sideband addressing, for example) didn’t help (or hinder) the scores I was getting. When I shifted to newer DX9-specific benchmarks, both cards did perform very well,
and easily outpaced the Ti4200.
Again, I’m not going to delve too deeply into this here, as I’ve got a substantial amount of data, and I plan on putting it into a separate article comparing the three cards I tested.
The two systems I tested this card in are as follows:

AMD system:
- Abit NF7-S v2.0 motherboard
- AGP aperture in BIOS set to 256MB, 8X mode
- AMD XP2100+ (Thoroughbred “B”), running at 2200MHz (200 x 11)
- 2 x 512MB (1GB total) Corsair XMS PC3200C2 DDR, running in Dual Channel Mode, 1:1 ratio (200 FSB)
- Western Digital WD400BB hard drive (ATA100, 7200 RPM, 40GB)
- 1024 x 768 screen resolution, 75Hz refresh rate throughout every test.
- Windows XP Pro, SP1
- ATi Catalyst v3.10 graphics driver
Intel system:
- Abit IC7 v1.0 motherboard
- AGP aperture in BIOS set to 256MB, 8X mode
- Intel P4 1.6A (Northwood), running at 2400MHz (150 x 16)
- Same DDR as above, running in Dual Channel, 4:5 ratio (187 FSB)
- Same hard drive as above
- Same screen resolution and refresh rate as above
- Windows XP Pro, SP1
- ATi Catalyst v3.10 graphics driver
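For anyone double-checking the memory clocks implied by those dividers, here’s a quick sketch (the FSB values and CPU:RAM ratios are taken from the two setups above; the function name is just for illustration):

```python
# Effective memory clock from the CPU front-side bus and the CPU:RAM divider.
# Values below come from the two test systems described above.
def mem_clock(fsb_mhz, cpu_ratio, ram_ratio):
    """Memory clock in MHz for a cpu_ratio:ram_ratio divider."""
    return fsb_mhz * ram_ratio / cpu_ratio

amd = mem_clock(200, 1, 1)     # 1:1 at 200 FSB
intel = mem_clock(150, 4, 5)   # 4:5 at 150 FSB

print(amd)    # 200.0 -> the XMS PC3200's rated DDR400 speed
print(intel)  # 187.5 -> just shy of spec, hence the "187 FSB" note above
```

This is why the review notes wanting a CPU that would let the RAM run at its rated 200 FSB on the Intel rig.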
All benchmarking was done at 1024 x 768, 75 Hz. The default settings were used with each individual program. For example, 3D Mark 2000 runs at 16 bit color depth, with triple buffering,
16 bit Z-buffering, and texture formatting. 3D Mark 2001SE uses 32 bit color depth and 24 bit Z-buffering for its default settings. Whatever came up when the programs were first run is how they were run.
I ran each of them a minimum of three times through, and kept the best (highest) score. The ShaderMark v2.0 benchmark doesn’t give a “total” score like the rest I used, but rather a
complete breakdown of each individual test within the benchmark itself. A screenshot of this in Notepad from each system is included below.
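The best-of-three scoring described above amounts to something like this (a sketch only; the function names and scores are hypothetical stand-ins for launching the real benchmark programs):

```python
# Run each benchmark at least three times and keep the highest result,
# as described above. run_benchmark is a hypothetical stand-in for
# launching the actual program and reading back its score.
def best_score(run_benchmark, runs=3):
    return max(run_benchmark() for _ in range(runs))

# Example with canned scores standing in for three real runs:
scores = iter([6100, 6250, 6180])
print(best_score(lambda: next(scores)))  # keeps 6250, the highest of the three
```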
Benchmark result charts (images): 3D Mark 2000, 3D Mark 2001SE, 3D Mark 2003, Unreal Tournament 2003 Flyby, Unreal Tournament 2003 Botmatch, PC Mark 2002 CPU, PC Mark 2002 Memory, PC Mark 2002 HDD, PC Mark 2004
A number of things are evident from the numbers above. Despite the higher total CPU speed (2400MHz vs 2200MHz), the scores from the Intel system were slightly lower than
the AMD machine’s, with one glaring exception: the Memory score from PC Mark 2002. In this benchmark, the Intel system walloped the AMD rig handily. I’d be really curious to see
what that score might become with a CPU that will let the ram I have run at spec (200 FSB) or higher.
I found it curious that the 9600 Pro ran as it did in DX7 and DX8 benchmarks and applications. I was getting similar framerates from this card and my Ti4200 when playing UT2003 (which is DX8 based).
With the Ti4200, I’m able to max out the settings fully at 1024 x 768, 32 bit, and only rarely do I come across times when the framerates drop below 35 – 40 FPS. In the DX7 benchmark
used (3D Mark 2000 v1.1), the 9600 Pro actually scored lower than the Ti4200.
It cleaned the GF4’s clock in the DX9 benchies though (as did the FX5700).
As far as really comparing this card to the 5700 Ultra or Ti4200, I’m going to cover that area comprehensively in another article very soon.
As mentioned, the video cutting out on booting definitely took a lot of the fun out of testing this card. In researching it, however, it seems to be an isolated problem.
With the results I got above, unless you’re upgrading from a GF2/3 level card or older ATi model, I’d find it hard to move towards one of these. I tried everything I could think of, and then
did some reading online for more ideas, but nothing I did seemed to bring the DX7 and 8 benchmark scores higher. The same held true with the 5700 Ultra.
If there’s a DX9 game out there you really want, this card will run it well. This is where they both shined.
All in all, this is a pretty good mid-level card. In a Pixels for Dollars world, how high you want to go with each is up to you. There’s better, but it’ll cost you. But this model definitely
holds its own.
I’d like to thank Newegg for letting us test this out.